Mar 13 13:56:03 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 13 13:56:03 crc restorecon[4696]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 13 13:56:03 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 13 13:56:04 crc restorecon[4696]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc 
restorecon[4696]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 13:56:04 crc 
restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 
13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to
system_u:object_r:container_file_t:s0:c14,c22 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 
13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 13:56:04 crc 
restorecon[4696]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc 
restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 13 13:56:04 crc restorecon[4696]:
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 13 13:56:04 crc restorecon[4696]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 
crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc 
restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc 
restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc 
restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc 
restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 13:56:04 crc 
restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 13:56:04 crc restorecon[4696]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 13:56:04 crc restorecon[4696]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 13:56:04 crc restorecon[4696]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 13 13:56:05 crc kubenswrapper[4898]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 13 13:56:05 crc kubenswrapper[4898]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 13 13:56:05 crc kubenswrapper[4898]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 13 13:56:05 crc kubenswrapper[4898]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 13 13:56:05 crc kubenswrapper[4898]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 13 13:56:05 crc kubenswrapper[4898]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.481424 4898 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.492803 4898 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.492845 4898 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.492856 4898 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.492865 4898 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.492874 4898 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.492883 4898 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.492892 4898 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.492929 4898 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.492938 4898 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.492947 
4898 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.492955 4898 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.492964 4898 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.492972 4898 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.492982 4898 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.492990 4898 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.492999 4898 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493007 4898 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493015 4898 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493026 4898 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493039 4898 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493050 4898 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493060 4898 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493069 4898 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493078 4898 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493087 4898 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493095 4898 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493104 4898 feature_gate.go:330] unrecognized feature gate: Example Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493113 4898 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493121 4898 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493129 4898 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493138 4898 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493148 4898 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493158 4898 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493166 4898 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 13 13:56:05 crc 
kubenswrapper[4898]: W0313 13:56:05.493175 4898 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493183 4898 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493191 4898 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493200 4898 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493209 4898 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493218 4898 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493227 4898 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493236 4898 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493245 4898 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493254 4898 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493262 4898 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493270 4898 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493281 4898 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493289 4898 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493298 4898 feature_gate.go:330] unrecognized feature gate: SignatureStores 
Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493306 4898 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493314 4898 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493323 4898 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493331 4898 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493339 4898 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493348 4898 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493359 4898 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493369 4898 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493378 4898 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493387 4898 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493396 4898 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493405 4898 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493413 4898 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493424 4898 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493433 4898 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493441 4898 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493449 4898 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493457 4898 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493466 4898 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493474 4898 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493486 4898 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.493497 4898 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.493654 4898 flags.go:64] FLAG: --address="0.0.0.0" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.493673 4898 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.493688 4898 flags.go:64] FLAG: --anonymous-auth="true" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.493701 4898 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.493713 4898 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.493723 4898 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.493736 4898 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.493747 4898 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.493758 4898 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.493768 4898 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.493778 4898 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.493788 4898 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.493798 4898 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.493809 4898 flags.go:64] FLAG: --cgroup-root="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.493818 4898 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 13 
13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.493828 4898 flags.go:64] FLAG: --client-ca-file="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.493838 4898 flags.go:64] FLAG: --cloud-config="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.493847 4898 flags.go:64] FLAG: --cloud-provider="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.493857 4898 flags.go:64] FLAG: --cluster-dns="[]" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.493869 4898 flags.go:64] FLAG: --cluster-domain="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.493885 4898 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.493929 4898 flags.go:64] FLAG: --config-dir="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.493943 4898 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.493956 4898 flags.go:64] FLAG: --container-log-max-files="5" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.493973 4898 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.493984 4898 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.493998 4898 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494011 4898 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494023 4898 flags.go:64] FLAG: --contention-profiling="false" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494035 4898 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494045 4898 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494057 4898 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 13 
13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494068 4898 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494080 4898 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494090 4898 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494100 4898 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494109 4898 flags.go:64] FLAG: --enable-load-reader="false" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494119 4898 flags.go:64] FLAG: --enable-server="true" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494129 4898 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494141 4898 flags.go:64] FLAG: --event-burst="100" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494151 4898 flags.go:64] FLAG: --event-qps="50" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494161 4898 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494171 4898 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494181 4898 flags.go:64] FLAG: --eviction-hard="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494193 4898 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494204 4898 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494214 4898 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494224 4898 flags.go:64] FLAG: --eviction-soft="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494234 4898 flags.go:64] FLAG: 
--eviction-soft-grace-period="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494243 4898 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494253 4898 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494263 4898 flags.go:64] FLAG: --experimental-mounter-path="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494273 4898 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494285 4898 flags.go:64] FLAG: --fail-swap-on="true" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494297 4898 flags.go:64] FLAG: --feature-gates="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494312 4898 flags.go:64] FLAG: --file-check-frequency="20s" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494324 4898 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494337 4898 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494350 4898 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494363 4898 flags.go:64] FLAG: --healthz-port="10248" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494377 4898 flags.go:64] FLAG: --help="false" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494389 4898 flags.go:64] FLAG: --hostname-override="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494400 4898 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494415 4898 flags.go:64] FLAG: --http-check-frequency="20s" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494428 4898 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494441 4898 flags.go:64] FLAG: 
--image-credential-provider-config="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494453 4898 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494465 4898 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494479 4898 flags.go:64] FLAG: --image-service-endpoint="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494491 4898 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494503 4898 flags.go:64] FLAG: --kube-api-burst="100" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494515 4898 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494529 4898 flags.go:64] FLAG: --kube-api-qps="50" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494541 4898 flags.go:64] FLAG: --kube-reserved="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494555 4898 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494567 4898 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494580 4898 flags.go:64] FLAG: --kubelet-cgroups="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494592 4898 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494606 4898 flags.go:64] FLAG: --lock-file="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494618 4898 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494630 4898 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494643 4898 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494662 4898 flags.go:64] 
FLAG: --log-json-split-stream="false" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494675 4898 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494687 4898 flags.go:64] FLAG: --log-text-split-stream="false" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494700 4898 flags.go:64] FLAG: --logging-format="text" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494713 4898 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494726 4898 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494739 4898 flags.go:64] FLAG: --manifest-url="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494750 4898 flags.go:64] FLAG: --manifest-url-header="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494767 4898 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494780 4898 flags.go:64] FLAG: --max-open-files="1000000" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494796 4898 flags.go:64] FLAG: --max-pods="110" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494810 4898 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494822 4898 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494836 4898 flags.go:64] FLAG: --memory-manager-policy="None" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494848 4898 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494862 4898 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494874 4898 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 
13:56:05.494887 4898 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494953 4898 flags.go:64] FLAG: --node-status-max-images="50" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494966 4898 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494979 4898 flags.go:64] FLAG: --oom-score-adj="-999" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.494992 4898 flags.go:64] FLAG: --pod-cidr="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495007 4898 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495027 4898 flags.go:64] FLAG: --pod-manifest-path="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495040 4898 flags.go:64] FLAG: --pod-max-pids="-1" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495054 4898 flags.go:64] FLAG: --pods-per-core="0" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495066 4898 flags.go:64] FLAG: --port="10250" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495078 4898 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495091 4898 flags.go:64] FLAG: --provider-id="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495103 4898 flags.go:64] FLAG: --qos-reserved="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495115 4898 flags.go:64] FLAG: --read-only-port="10255" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495127 4898 flags.go:64] FLAG: --register-node="true" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495139 4898 flags.go:64] FLAG: --register-schedulable="true" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 
13:56:05.495151 4898 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495171 4898 flags.go:64] FLAG: --registry-burst="10" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495184 4898 flags.go:64] FLAG: --registry-qps="5" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495196 4898 flags.go:64] FLAG: --reserved-cpus="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495208 4898 flags.go:64] FLAG: --reserved-memory="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495223 4898 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495236 4898 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495249 4898 flags.go:64] FLAG: --rotate-certificates="false" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495261 4898 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495274 4898 flags.go:64] FLAG: --runonce="false" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495286 4898 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495298 4898 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495310 4898 flags.go:64] FLAG: --seccomp-default="false" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495323 4898 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495335 4898 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495347 4898 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495359 4898 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 13 13:56:05 crc 
kubenswrapper[4898]: I0313 13:56:05.495371 4898 flags.go:64] FLAG: --storage-driver-password="root" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495382 4898 flags.go:64] FLAG: --storage-driver-secure="false" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495394 4898 flags.go:64] FLAG: --storage-driver-table="stats" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495406 4898 flags.go:64] FLAG: --storage-driver-user="root" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495418 4898 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495439 4898 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495452 4898 flags.go:64] FLAG: --system-cgroups="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495464 4898 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495488 4898 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495501 4898 flags.go:64] FLAG: --tls-cert-file="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495529 4898 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495546 4898 flags.go:64] FLAG: --tls-min-version="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495558 4898 flags.go:64] FLAG: --tls-private-key-file="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495570 4898 flags.go:64] FLAG: --topology-manager-policy="none" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495582 4898 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495594 4898 flags.go:64] FLAG: --topology-manager-scope="container" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495607 4898 flags.go:64] FLAG: --v="2" Mar 13 13:56:05 crc 
kubenswrapper[4898]: I0313 13:56:05.495622 4898 flags.go:64] FLAG: --version="false" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495637 4898 flags.go:64] FLAG: --vmodule="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495651 4898 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.495664 4898 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496018 4898 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496034 4898 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496048 4898 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496059 4898 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496071 4898 feature_gate.go:330] unrecognized feature gate: Example Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496083 4898 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496096 4898 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496108 4898 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496121 4898 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496133 4898 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496144 4898 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496156 4898 
feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496167 4898 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496178 4898 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496189 4898 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496200 4898 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496214 4898 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496225 4898 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496237 4898 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496247 4898 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496258 4898 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496269 4898 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496280 4898 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496293 4898 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496304 4898 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496315 4898 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 13 13:56:05 crc 
kubenswrapper[4898]: W0313 13:56:05.496326 4898 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496337 4898 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496347 4898 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496358 4898 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496369 4898 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496380 4898 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496391 4898 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496402 4898 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496412 4898 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496423 4898 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496433 4898 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496446 4898 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496456 4898 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496468 4898 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496479 4898 feature_gate.go:330] unrecognized 
feature gate: UpgradeStatus Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496490 4898 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496526 4898 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496538 4898 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496548 4898 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496563 4898 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496578 4898 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496593 4898 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496608 4898 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496620 4898 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496631 4898 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496642 4898 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496653 4898 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496664 4898 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 
13:56:05.496675 4898 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496689 4898 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496703 4898 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496715 4898 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496728 4898 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496741 4898 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496752 4898 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496763 4898 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496773 4898 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496785 4898 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496796 4898 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496806 4898 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496820 4898 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496833 4898 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496844 4898 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496855 4898 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.496866 4898 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.497805 4898 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.510419 4898 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.510476 4898 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.510596 4898 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.510620 4898 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.510637 4898 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.510646 4898 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.510656 4898 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.510665 4898 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.510673 4898 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.510681 4898 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.510689 4898 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.510700 4898 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.510709 4898 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.510718 4898 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.510727 4898 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.510735 4898 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.510744 4898 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.510752 4898 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.510759 4898 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.510767 4898 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.510774 4898 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.510782 4898 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.510790 4898 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.510798 4898 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.510806 4898 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.510814 4898 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.510821 4898 feature_gate.go:330] unrecognized 
feature gate: PersistentIPsForVirtualization Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.510829 4898 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.510836 4898 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.510844 4898 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.510851 4898 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.510859 4898 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.510868 4898 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.510878 4898 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.510888 4898 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.510928 4898 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.510939 4898 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.510948 4898 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.510956 4898 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.510964 4898 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.510971 4898 feature_gate.go:330] unrecognized feature gate: Example Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.510979 4898 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.510987 4898 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.510994 4898 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511002 4898 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511009 4898 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511017 4898 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511025 4898 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511034 4898 
feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511041 4898 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511049 4898 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511056 4898 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511064 4898 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511072 4898 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511079 4898 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511086 4898 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511094 4898 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511102 4898 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511110 4898 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511118 4898 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511127 4898 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511134 4898 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511142 4898 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 13 13:56:05 
crc kubenswrapper[4898]: W0313 13:56:05.511150 4898 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511157 4898 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511165 4898 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511172 4898 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511180 4898 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511191 4898 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511200 4898 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511208 4898 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511216 4898 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511226 4898 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.511240 4898 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511456 4898 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511468 4898 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511477 4898 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511485 4898 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511493 4898 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511501 4898 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511509 4898 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511517 4898 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511524 4898 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511532 4898 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511542 4898 
feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511553 4898 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511561 4898 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511569 4898 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511578 4898 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511588 4898 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511597 4898 feature_gate.go:330] unrecognized feature gate: Example Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511605 4898 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511614 4898 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511621 4898 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511630 4898 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511637 4898 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511647 4898 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511655 4898 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511663 4898 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511671 4898 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511678 4898 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511686 4898 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511693 4898 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511701 4898 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511709 4898 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511717 4898 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511725 4898 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511733 4898 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511740 4898 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511747 4898 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511755 4898 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 13 13:56:05 crc 
kubenswrapper[4898]: W0313 13:56:05.511765 4898 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511774 4898 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511782 4898 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511789 4898 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511797 4898 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511805 4898 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511813 4898 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511821 4898 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511829 4898 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511837 4898 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511875 4898 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511886 4898 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511895 4898 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511928 4898 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511938 
4898 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511947 4898 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511955 4898 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511965 4898 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511973 4898 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511981 4898 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511989 4898 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.511996 4898 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.512004 4898 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.512012 4898 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.512019 4898 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.512027 4898 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.512035 4898 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.512043 4898 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.512053 4898 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. 
It will be removed in a future release. Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.512063 4898 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.512073 4898 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.512082 4898 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.512090 4898 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.512099 4898 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.512110 4898 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.513167 4898 server.go:940] "Client rotation is on, will bootstrap in background" Mar 13 13:56:05 crc kubenswrapper[4898]: E0313 13:56:05.522110 4898 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.526043 4898 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.526214 4898 certificate_store.go:130] Loading cert/key pair from 
"/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.527983 4898 server.go:997] "Starting client certificate rotation" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.528025 4898 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.528171 4898 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.558562 4898 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 13 13:56:05 crc kubenswrapper[4898]: E0313 13:56:05.559856 4898 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.565852 4898 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.581517 4898 log.go:25] "Validated CRI v1 runtime API" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.623056 4898 log.go:25] "Validated CRI v1 image API" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.625644 4898 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.634162 4898 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-13-13-51-18-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 13 13:56:05 crc 
kubenswrapper[4898]: I0313 13:56:05.634229 4898 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:41 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}] Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.648715 4898 manager.go:217] Machine: {Timestamp:2026-03-13 13:56:05.647118369 +0000 UTC m=+0.648706628 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:5908a3c1-cceb-4f4a-af76-6b5ef150f486 BootID:6587b7f7-4682-47cc-be02-888912bc905d Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:41 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 
Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:c9:0c:d3 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:c9:0c:d3 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:c0:4e:8d Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:94:39:b6 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:f4:16:5c Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:53:3c:60 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:82:29:29:fb:94:28 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:76:96:2c:12:42:65 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 
Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.648988 4898 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.649122 4898 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.650994 4898 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.651358 4898 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.651466 4898 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.652952 4898 topology_manager.go:138] "Creating topology manager with none policy"
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.652982 4898 container_manager_linux.go:303] "Creating device plugin manager"
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.653692 4898 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.653742 4898 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.654027 4898 state_mem.go:36] "Initialized new in-memory state store"
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.654214 4898 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.658235 4898 kubelet.go:418] "Attempting to sync node with API server"
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.658269 4898 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.658297 4898 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.658318 4898 kubelet.go:324] "Adding apiserver pod source"
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.658337 4898 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.664974 4898 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.666185 4898 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.666456 4898 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused
Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.666461 4898 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused
Mar 13 13:56:05 crc kubenswrapper[4898]: E0313 13:56:05.666560 4898 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError"
Mar 13 13:56:05 crc kubenswrapper[4898]: E0313 13:56:05.666572 4898 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError"
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.668384 4898 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.670275 4898 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.670318 4898 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.670335 4898 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.670350 4898 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.670374 4898 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.670390 4898 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.670404 4898 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.670426 4898 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.670442 4898 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.670455 4898 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.670474 4898 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.670487 4898 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.670533 4898 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.671206 4898 server.go:1280] "Started kubelet"
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.671255 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.671773 4898 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.672143 4898 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.673432 4898 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 13 13:56:05 crc systemd[1]: Started Kubernetes Kubelet.
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.673469 4898 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.674141 4898 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.674535 4898 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.674559 4898 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 13 13:56:05 crc kubenswrapper[4898]: E0313 13:56:05.674624 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.675002 4898 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.675676 4898 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused
Mar 13 13:56:05 crc kubenswrapper[4898]: E0313 13:56:05.675734 4898 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError"
Mar 13 13:56:05 crc kubenswrapper[4898]: E0313 13:56:05.676465 4898 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="200ms"
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.676762 4898 server.go:460] "Adding debug handlers to kubelet server"
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.681826 4898 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.681862 4898 factory.go:55] Registering systemd factory
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.681873 4898 factory.go:221] Registration of the systemd container factory successfully
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.682579 4898 factory.go:153] Registering CRI-O factory
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.682604 4898 factory.go:221] Registration of the crio container factory successfully
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.682689 4898 factory.go:103] Registering Raw factory
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.682718 4898 manager.go:1196] Started watching for new ooms in manager
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.684043 4898 manager.go:319] Starting recovery of all containers
Mar 13 13:56:05 crc kubenswrapper[4898]: E0313 13:56:05.683234 4898 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.201:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189c6b2655e59fd5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:05.671174101 +0000 UTC m=+0.672762380,LastTimestamp:2026-03-13 13:56:05.671174101 +0000 UTC m=+0.672762380,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.697892 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.699072 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.699109 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.699124 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.699138 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.699153 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.699166 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.699181 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.699196 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.699211 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.699223 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.699237 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.699252 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.699269 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.699292 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.699305 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.699317 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.699328 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.699339 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.699351 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.699361 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.699374 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.699384 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.699397 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.699409 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.699420 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.699435 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.699451 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.699465 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.699480 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.699490 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.699503 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.699513 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.699524 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.699535 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.699548 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.700206 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.700294 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.700366 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.700460 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.700544 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.700631 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.700695 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.700769 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.700835 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.700911 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.700973 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.701045 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.701134 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.701250 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.701339 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.701616 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.701720 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.701818 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.701943 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.702036 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.702123 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.702211 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.702298 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.702556 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.702652 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.702745 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.702831 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.702961 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.703111 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.703215 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.703302 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.703387 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.703474 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.703557 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.703650 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.703755 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.703844 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.703950 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.704038 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.704114 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.704263 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.704332 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.704397 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.704484 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.704570 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.704692 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.704795 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.704912 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.705008 4898 reconstruct.go:130] "Volume is marked
as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.705094 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.705197 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.705290 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.705377 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.705463 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.705549 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.705641 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.705752 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.705838 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.705945 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.706063 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.706155 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.706289 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.706379 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.706475 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.706565 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.706649 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.706738 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" 
seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.706828 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.707037 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.707127 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.707214 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.707294 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.707380 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: 
I0313 13:56:05.707460 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.707552 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.707641 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.707727 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.707811 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.707882 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.707993 4898 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.708071 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.708148 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.708243 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.708328 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.708427 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.708506 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.708588 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.708771 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.708858 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.708961 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.709050 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.709144 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.711496 4898 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.711994 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.712073 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.712157 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.712249 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.712317 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.712405 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.712480 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.712539 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.712596 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.712712 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.712781 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.712839 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.712913 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.712975 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.713048 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.713120 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.713182 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.713263 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.713349 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.713464 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.713547 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.713613 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.713674 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" 
volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.713730 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.713789 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.713845 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.713923 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.713999 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.714069 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.714153 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.714224 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.714304 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.714382 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.714473 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.714557 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.714614 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.714670 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.714731 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.714786 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.714841 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.714893 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.714981 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.715048 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.715105 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.715160 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.715217 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.715274 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.715340 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.715422 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.715486 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.715555 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.715615 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.715672 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" 
volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.715734 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.715791 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.715848 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.715948 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.716009 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.716075 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.716150 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.716205 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.716263 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.716320 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.716400 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.716478 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" 
seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.716536 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.716593 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.716668 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.716735 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.716812 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.716877 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.717913 4898 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.717988 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.718049 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.718114 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.718188 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.718247 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.718305 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.718383 4898 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.718465 4898 reconstruct.go:97] "Volume reconstruction finished" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.718523 4898 reconciler.go:26] "Reconciler: start to sync state" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.718472 4898 manager.go:324] Recovery completed Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.729715 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.731654 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.731806 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.731870 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.733111 4898 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.733971 4898 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.733999 4898 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.734053 4898 state_mem.go:36] "Initialized new in-memory state store" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.736031 4898 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.738164 4898 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.738245 4898 kubelet.go:2335] "Starting kubelet main sync loop" Mar 13 13:56:05 crc kubenswrapper[4898]: E0313 13:56:05.738363 4898 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 13 13:56:05 crc kubenswrapper[4898]: W0313 13:56:05.739453 4898 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Mar 13 13:56:05 crc kubenswrapper[4898]: E0313 13:56:05.739557 4898 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.764312 4898 policy_none.go:49] "None policy: Start" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.765223 4898 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 13 
13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.765290 4898 state_mem.go:35] "Initializing new in-memory state store" Mar 13 13:56:05 crc kubenswrapper[4898]: E0313 13:56:05.775070 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.830984 4898 manager.go:334] "Starting Device Plugin manager" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.831060 4898 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.831082 4898 server.go:79] "Starting device plugin registration server" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.832367 4898 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.832400 4898 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.832987 4898 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.833108 4898 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.833123 4898 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.838536 4898 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.838625 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 13 13:56:05 crc kubenswrapper[4898]: E0313 13:56:05.840079 4898 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.841095 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.841124 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.841135 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.841291 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.841630 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.841694 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.842261 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.842285 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.842295 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.842378 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.842945 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.843165 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.843254 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.843310 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.843328 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.846206 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.846263 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.846310 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.846341 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.846415 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.846457 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.847746 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.847830 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.847866 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.849497 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.849543 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.849566 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.849747 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.850625 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.850689 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.851526 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.851567 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.851590 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.853478 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.853580 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.853601 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.854185 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.854341 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.855755 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.855778 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.855787 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.856202 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.856251 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.856274 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:05 crc kubenswrapper[4898]: E0313 13:56:05.877589 4898 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="400ms" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.920564 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.920617 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.920643 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.920668 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.920696 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.920803 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.920846 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.920883 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.920935 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.920968 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.921003 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 
13:56:05.921019 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.921039 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.921059 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.921077 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.933758 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.935092 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.935144 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 
13:56:05.935155 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:05 crc kubenswrapper[4898]: I0313 13:56:05.935192 4898 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 13:56:05 crc kubenswrapper[4898]: E0313 13:56:05.935852 4898 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.201:6443: connect: connection refused" node="crc" Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.022410 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.022487 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.022606 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.022667 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.022689 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.022704 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.022723 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.022742 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.022763 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.022772 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.022780 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.022827 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.022839 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.022855 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.022923 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" 
Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.022888 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.022955 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.022989 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.022958 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.022999 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.022915 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" 
(UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.023069 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.023015 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.023104 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.022972 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.023105 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 
13:56:06.023165 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.023146 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.023228 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.023141 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.136979 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.139593 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.139663 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:06 crc 
kubenswrapper[4898]: I0313 13:56:06.139693 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.139729 4898 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 13:56:06 crc kubenswrapper[4898]: E0313 13:56:06.140340 4898 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.201:6443: connect: connection refused" node="crc" Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.189851 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.200861 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.225839 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.234242 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 13:56:06 crc kubenswrapper[4898]: W0313 13:56:06.246522 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-8540fea4076f92399ba924538c9270241f4161b3b3cb6d389160698c8278be5f WatchSource:0}: Error finding container 8540fea4076f92399ba924538c9270241f4161b3b3cb6d389160698c8278be5f: Status 404 returned error can't find the container with id 8540fea4076f92399ba924538c9270241f4161b3b3cb6d389160698c8278be5f Mar 13 13:56:06 crc kubenswrapper[4898]: W0313 13:56:06.249314 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-124a43814cfcdb0b59eca537fa4355680fd75524e0993ec6a5960951f88bb6c5 WatchSource:0}: Error finding container 124a43814cfcdb0b59eca537fa4355680fd75524e0993ec6a5960951f88bb6c5: Status 404 returned error can't find the container with id 124a43814cfcdb0b59eca537fa4355680fd75524e0993ec6a5960951f88bb6c5 Mar 13 13:56:06 crc kubenswrapper[4898]: W0313 13:56:06.257661 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-0d7844a6b7a3615c2123cc4df905b6f8196a185d3bdab39a6eb6450b36e509fd WatchSource:0}: Error finding container 0d7844a6b7a3615c2123cc4df905b6f8196a185d3bdab39a6eb6450b36e509fd: Status 404 returned error can't find the container with id 0d7844a6b7a3615c2123cc4df905b6f8196a185d3bdab39a6eb6450b36e509fd Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.258672 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 13:56:06 crc kubenswrapper[4898]: W0313 13:56:06.260115 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-5a0e2a692f04e11cd1a5843d9be847f0ef5762b8c6e567fd647ff72fd6ce8319 WatchSource:0}: Error finding container 5a0e2a692f04e11cd1a5843d9be847f0ef5762b8c6e567fd647ff72fd6ce8319: Status 404 returned error can't find the container with id 5a0e2a692f04e11cd1a5843d9be847f0ef5762b8c6e567fd647ff72fd6ce8319 Mar 13 13:56:06 crc kubenswrapper[4898]: E0313 13:56:06.278918 4898 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="800ms" Mar 13 13:56:06 crc kubenswrapper[4898]: W0313 13:56:06.283147 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-754ddec83a61e61535006d4bd4ff96252801ea162d495281089dc31c33cdd1c7 WatchSource:0}: Error finding container 754ddec83a61e61535006d4bd4ff96252801ea162d495281089dc31c33cdd1c7: Status 404 returned error can't find the container with id 754ddec83a61e61535006d4bd4ff96252801ea162d495281089dc31c33cdd1c7 Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.540786 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.542442 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.542479 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.542489 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.542515 4898 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 13:56:06 crc kubenswrapper[4898]: E0313 13:56:06.542885 4898 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.201:6443: connect: connection refused" node="crc" Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.672584 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.745427 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"754ddec83a61e61535006d4bd4ff96252801ea162d495281089dc31c33cdd1c7"} Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.746173 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5a0e2a692f04e11cd1a5843d9be847f0ef5762b8c6e567fd647ff72fd6ce8319"} Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.747919 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0d7844a6b7a3615c2123cc4df905b6f8196a185d3bdab39a6eb6450b36e509fd"} Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.749292 4898 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"124a43814cfcdb0b59eca537fa4355680fd75524e0993ec6a5960951f88bb6c5"} Mar 13 13:56:06 crc kubenswrapper[4898]: I0313 13:56:06.750378 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8540fea4076f92399ba924538c9270241f4161b3b3cb6d389160698c8278be5f"} Mar 13 13:56:07 crc kubenswrapper[4898]: E0313 13:56:07.080517 4898 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="1.6s" Mar 13 13:56:07 crc kubenswrapper[4898]: W0313 13:56:07.134570 4898 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Mar 13 13:56:07 crc kubenswrapper[4898]: E0313 13:56:07.134662 4898 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError" Mar 13 13:56:07 crc kubenswrapper[4898]: W0313 13:56:07.209941 4898 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Mar 13 13:56:07 crc kubenswrapper[4898]: E0313 13:56:07.210093 4898 reflector.go:158] 
"Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError" Mar 13 13:56:07 crc kubenswrapper[4898]: W0313 13:56:07.275300 4898 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Mar 13 13:56:07 crc kubenswrapper[4898]: E0313 13:56:07.275425 4898 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError" Mar 13 13:56:07 crc kubenswrapper[4898]: W0313 13:56:07.304275 4898 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Mar 13 13:56:07 crc kubenswrapper[4898]: E0313 13:56:07.304386 4898 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError" Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 13:56:07.343598 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 
13:56:07.345287 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 13:56:07.345330 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 13:56:07.345341 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 13:56:07.345369 4898 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 13:56:07 crc kubenswrapper[4898]: E0313 13:56:07.345806 4898 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.201:6443: connect: connection refused" node="crc" Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 13:56:07.662037 4898 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 13 13:56:07 crc kubenswrapper[4898]: E0313 13:56:07.663263 4898 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.201:6443: connect: connection refused" logger="UnhandledError" Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 13:56:07.672270 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.201:6443: connect: connection refused Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 13:56:07.755389 4898 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" 
containerID="c4971276739c6cece5f7d6ad57da8c9e6d67d9ffe7c85e22d1d78344045c08d1" exitCode=0 Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 13:56:07.755489 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"c4971276739c6cece5f7d6ad57da8c9e6d67d9ffe7c85e22d1d78344045c08d1"} Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 13:56:07.755545 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 13:56:07.757063 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 13:56:07.757119 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 13:56:07.757149 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 13:56:07.757150 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433"} Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 13:56:07.757118 4898 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433" exitCode=0 Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 13:56:07.757528 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 13:56:07.758808 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 13:56:07.758862 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 13:56:07.758881 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 13:56:07.761438 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"511cb3e4e8a16bcdfdf5840917a0a9a5a2e2a81419278449f613c9a41d164c1b"} Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 13:56:07.761492 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2524f0ae02e7d3c2569744846e2d99e8c808c9e14afeab6415910ee731794609"} Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 13:56:07.761509 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"11d7141d32caadeaeb1d436b8ea9c8ad8a7002577c20befa44b7c64ad366bee3"} Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 13:56:07.761526 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"172c20ea4299e1b913b2675ac14f7ec3a4c80a0f6f2428af0d030aff64eb89ec"} Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 13:56:07.761527 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 13:56:07.762982 4898 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 13:56:07.763014 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 13:56:07.763024 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 13:56:07.764219 4898 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3" exitCode=0 Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 13:56:07.764268 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3"} Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 13:56:07.764376 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 13:56:07.765458 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 13:56:07.765517 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 13:56:07.765543 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 13:56:07.767531 4898 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8" exitCode=0 Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 13:56:07.767620 4898 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8"} Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 13:56:07.767757 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 13:56:07.768314 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 13:56:07.768845 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 13:56:07.768951 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 13:56:07.768982 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 13:56:07.771821 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 13:56:07.771882 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:07 crc kubenswrapper[4898]: I0313 13:56:07.771949 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:08 crc kubenswrapper[4898]: I0313 13:56:08.347292 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 13:56:08 crc kubenswrapper[4898]: I0313 13:56:08.673261 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial 
tcp 38.102.83.201:6443: connect: connection refused Mar 13 13:56:08 crc kubenswrapper[4898]: E0313 13:56:08.682472 4898 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="3.2s" Mar 13 13:56:08 crc kubenswrapper[4898]: I0313 13:56:08.773851 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7f7a5f6c4c1f3ccd5b10a77ca72055844d63966f39e14fbba46d8b6074f4d7da"} Mar 13 13:56:08 crc kubenswrapper[4898]: I0313 13:56:08.773909 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"612d879ff268a2130e16ebb42a2e402a0e2d7ef04248322761b873bc6fe026c4"} Mar 13 13:56:08 crc kubenswrapper[4898]: I0313 13:56:08.773921 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0be461eabdbb6c2414f8a9805c9537cea8595a509f241a72015802159baaa9b3"} Mar 13 13:56:08 crc kubenswrapper[4898]: I0313 13:56:08.773923 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:08 crc kubenswrapper[4898]: I0313 13:56:08.775420 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:08 crc kubenswrapper[4898]: I0313 13:56:08.775468 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:08 crc kubenswrapper[4898]: I0313 13:56:08.775486 4898 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:08 crc kubenswrapper[4898]: I0313 13:56:08.785195 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026"} Mar 13 13:56:08 crc kubenswrapper[4898]: I0313 13:56:08.785225 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364"} Mar 13 13:56:08 crc kubenswrapper[4898]: I0313 13:56:08.785235 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3"} Mar 13 13:56:08 crc kubenswrapper[4898]: I0313 13:56:08.785245 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140"} Mar 13 13:56:08 crc kubenswrapper[4898]: I0313 13:56:08.788125 4898 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f" exitCode=0 Mar 13 13:56:08 crc kubenswrapper[4898]: I0313 13:56:08.788180 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f"} Mar 13 13:56:08 crc kubenswrapper[4898]: I0313 13:56:08.788261 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Mar 13 13:56:08 crc kubenswrapper[4898]: I0313 13:56:08.789183 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:08 crc kubenswrapper[4898]: I0313 13:56:08.789213 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:08 crc kubenswrapper[4898]: I0313 13:56:08.789224 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:08 crc kubenswrapper[4898]: I0313 13:56:08.791495 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:08 crc kubenswrapper[4898]: I0313 13:56:08.791633 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:08 crc kubenswrapper[4898]: I0313 13:56:08.791974 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"3cc01a31ae51aefc4cc7ea6a77eea7d0885ebf0f9fdcbdfbeebfcc7e69a33755"} Mar 13 13:56:08 crc kubenswrapper[4898]: I0313 13:56:08.792978 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:08 crc kubenswrapper[4898]: I0313 13:56:08.792992 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:08 crc kubenswrapper[4898]: I0313 13:56:08.793013 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:08 crc kubenswrapper[4898]: I0313 13:56:08.793023 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:08 crc kubenswrapper[4898]: I0313 13:56:08.792998 4898 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:08 crc kubenswrapper[4898]: I0313 13:56:08.793087 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:08 crc kubenswrapper[4898]: I0313 13:56:08.946193 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:08 crc kubenswrapper[4898]: I0313 13:56:08.947414 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:08 crc kubenswrapper[4898]: I0313 13:56:08.947447 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:08 crc kubenswrapper[4898]: I0313 13:56:08.947457 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:08 crc kubenswrapper[4898]: I0313 13:56:08.947484 4898 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 13:56:08 crc kubenswrapper[4898]: E0313 13:56:08.947836 4898 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.201:6443: connect: connection refused" node="crc" Mar 13 13:56:09 crc kubenswrapper[4898]: I0313 13:56:09.231324 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 13:56:09 crc kubenswrapper[4898]: I0313 13:56:09.242504 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 13:56:09 crc kubenswrapper[4898]: I0313 13:56:09.798624 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a280781bd4308bb8d58704c762393be7b1043a48c9bcf32e578e39be0a70478d"} Mar 13 13:56:09 crc kubenswrapper[4898]: I0313 13:56:09.798717 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:09 crc kubenswrapper[4898]: I0313 13:56:09.799792 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:09 crc kubenswrapper[4898]: I0313 13:56:09.799832 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:09 crc kubenswrapper[4898]: I0313 13:56:09.799845 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:09 crc kubenswrapper[4898]: I0313 13:56:09.802711 4898 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416" exitCode=0 Mar 13 13:56:09 crc kubenswrapper[4898]: I0313 13:56:09.802820 4898 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 13:56:09 crc kubenswrapper[4898]: I0313 13:56:09.802873 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:09 crc kubenswrapper[4898]: I0313 13:56:09.802890 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:09 crc kubenswrapper[4898]: I0313 13:56:09.802875 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:09 crc kubenswrapper[4898]: I0313 13:56:09.803075 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:09 crc kubenswrapper[4898]: I0313 13:56:09.802890 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416"} Mar 13 13:56:09 crc kubenswrapper[4898]: I0313 13:56:09.805514 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:09 crc kubenswrapper[4898]: I0313 13:56:09.805564 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:09 crc kubenswrapper[4898]: I0313 13:56:09.805584 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:09 crc kubenswrapper[4898]: I0313 13:56:09.805824 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:09 crc kubenswrapper[4898]: I0313 13:56:09.805864 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:09 crc kubenswrapper[4898]: I0313 13:56:09.805884 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:09 crc kubenswrapper[4898]: I0313 13:56:09.808720 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:09 crc kubenswrapper[4898]: I0313 13:56:09.808772 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:09 crc kubenswrapper[4898]: I0313 13:56:09.808799 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:09 crc kubenswrapper[4898]: I0313 13:56:09.809270 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:09 crc kubenswrapper[4898]: I0313 13:56:09.809313 4898 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:09 crc kubenswrapper[4898]: I0313 13:56:09.809336 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:10 crc kubenswrapper[4898]: I0313 13:56:10.814150 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0"} Mar 13 13:56:10 crc kubenswrapper[4898]: I0313 13:56:10.814216 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883"} Mar 13 13:56:10 crc kubenswrapper[4898]: I0313 13:56:10.814235 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84"} Mar 13 13:56:10 crc kubenswrapper[4898]: I0313 13:56:10.814246 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb"} Mar 13 13:56:10 crc kubenswrapper[4898]: I0313 13:56:10.814279 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:10 crc kubenswrapper[4898]: I0313 13:56:10.814376 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:10 crc kubenswrapper[4898]: I0313 13:56:10.815213 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 13:56:10 crc kubenswrapper[4898]: I0313 
13:56:10.815885 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:10 crc kubenswrapper[4898]: I0313 13:56:10.815932 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:10 crc kubenswrapper[4898]: I0313 13:56:10.815935 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:10 crc kubenswrapper[4898]: I0313 13:56:10.815952 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:10 crc kubenswrapper[4898]: I0313 13:56:10.815957 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:10 crc kubenswrapper[4898]: I0313 13:56:10.815964 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:11 crc kubenswrapper[4898]: I0313 13:56:11.823167 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:11 crc kubenswrapper[4898]: I0313 13:56:11.823189 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32"} Mar 13 13:56:11 crc kubenswrapper[4898]: I0313 13:56:11.823168 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:11 crc kubenswrapper[4898]: I0313 13:56:11.824189 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:11 crc kubenswrapper[4898]: I0313 13:56:11.824234 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:11 crc kubenswrapper[4898]: 
I0313 13:56:11.824249 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:11 crc kubenswrapper[4898]: I0313 13:56:11.824872 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:11 crc kubenswrapper[4898]: I0313 13:56:11.824978 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:11 crc kubenswrapper[4898]: I0313 13:56:11.825042 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:11 crc kubenswrapper[4898]: I0313 13:56:11.924669 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 13:56:11 crc kubenswrapper[4898]: I0313 13:56:11.924956 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:11 crc kubenswrapper[4898]: I0313 13:56:11.926778 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:11 crc kubenswrapper[4898]: I0313 13:56:11.926856 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:11 crc kubenswrapper[4898]: I0313 13:56:11.926882 4898 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 13 13:56:11 crc kubenswrapper[4898]: I0313 13:56:11.926894 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:12 crc kubenswrapper[4898]: I0313 13:56:12.148006 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:12 crc kubenswrapper[4898]: I0313 13:56:12.150099 4898 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:12 crc kubenswrapper[4898]: I0313 13:56:12.150165 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:12 crc kubenswrapper[4898]: I0313 13:56:12.150192 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:12 crc kubenswrapper[4898]: I0313 13:56:12.150236 4898 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 13:56:12 crc kubenswrapper[4898]: I0313 13:56:12.826245 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:12 crc kubenswrapper[4898]: I0313 13:56:12.828411 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:12 crc kubenswrapper[4898]: I0313 13:56:12.828490 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:12 crc kubenswrapper[4898]: I0313 13:56:12.828518 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:13 crc kubenswrapper[4898]: I0313 13:56:13.527276 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 13:56:13 crc kubenswrapper[4898]: I0313 13:56:13.527572 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:13 crc kubenswrapper[4898]: I0313 13:56:13.529412 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:13 crc kubenswrapper[4898]: I0313 13:56:13.529471 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:13 crc kubenswrapper[4898]: I0313 13:56:13.529489 4898 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:14 crc kubenswrapper[4898]: I0313 13:56:14.208750 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 13:56:14 crc kubenswrapper[4898]: I0313 13:56:14.209090 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:14 crc kubenswrapper[4898]: I0313 13:56:14.211101 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:14 crc kubenswrapper[4898]: I0313 13:56:14.211177 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:14 crc kubenswrapper[4898]: I0313 13:56:14.211195 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:14 crc kubenswrapper[4898]: I0313 13:56:14.486341 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 13:56:14 crc kubenswrapper[4898]: I0313 13:56:14.832655 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:14 crc kubenswrapper[4898]: I0313 13:56:14.834327 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:14 crc kubenswrapper[4898]: I0313 13:56:14.834386 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:14 crc kubenswrapper[4898]: I0313 13:56:14.834407 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:14 crc kubenswrapper[4898]: I0313 13:56:14.924851 4898 patch_prober.go:28] interesting pod/kube-controller-manager-crc 
container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 13:56:14 crc kubenswrapper[4898]: I0313 13:56:14.925027 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 13:56:15 crc kubenswrapper[4898]: I0313 13:56:15.511480 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 13 13:56:15 crc kubenswrapper[4898]: I0313 13:56:15.511722 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:15 crc kubenswrapper[4898]: I0313 13:56:15.513371 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:15 crc kubenswrapper[4898]: I0313 13:56:15.513458 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:15 crc kubenswrapper[4898]: I0313 13:56:15.513481 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:15 crc kubenswrapper[4898]: E0313 13:56:15.840452 4898 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 13:56:15 crc kubenswrapper[4898]: I0313 13:56:15.849048 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 13:56:15 crc 
kubenswrapper[4898]: I0313 13:56:15.849409 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:15 crc kubenswrapper[4898]: I0313 13:56:15.851227 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:15 crc kubenswrapper[4898]: I0313 13:56:15.851674 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:15 crc kubenswrapper[4898]: I0313 13:56:15.851830 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:18 crc kubenswrapper[4898]: I0313 13:56:18.361282 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 13:56:18 crc kubenswrapper[4898]: I0313 13:56:18.361468 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:18 crc kubenswrapper[4898]: I0313 13:56:18.363653 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:18 crc kubenswrapper[4898]: I0313 13:56:18.363724 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:18 crc kubenswrapper[4898]: I0313 13:56:18.363746 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:19 crc kubenswrapper[4898]: I0313 13:56:19.170132 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 13 13:56:19 crc kubenswrapper[4898]: I0313 13:56:19.170438 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:19 crc kubenswrapper[4898]: I0313 13:56:19.172669 4898 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:19 crc kubenswrapper[4898]: I0313 13:56:19.172720 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:19 crc kubenswrapper[4898]: I0313 13:56:19.172742 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:19 crc kubenswrapper[4898]: W0313 13:56:19.401617 4898 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 13 13:56:19 crc kubenswrapper[4898]: I0313 13:56:19.401728 4898 trace.go:236] Trace[935307713]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (13-Mar-2026 13:56:09.400) (total time: 10001ms): Mar 13 13:56:19 crc kubenswrapper[4898]: Trace[935307713]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (13:56:19.401) Mar 13 13:56:19 crc kubenswrapper[4898]: Trace[935307713]: [10.001599085s] [10.001599085s] END Mar 13 13:56:19 crc kubenswrapper[4898]: E0313 13:56:19.401759 4898 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 13 13:56:19 crc kubenswrapper[4898]: I0313 13:56:19.673767 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 13 13:56:19 crc kubenswrapper[4898]: W0313 13:56:19.775451 
4898 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 13 13:56:19 crc kubenswrapper[4898]: I0313 13:56:19.775636 4898 trace.go:236] Trace[1488825826]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (13-Mar-2026 13:56:09.774) (total time: 10001ms): Mar 13 13:56:19 crc kubenswrapper[4898]: Trace[1488825826]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (13:56:19.775) Mar 13 13:56:19 crc kubenswrapper[4898]: Trace[1488825826]: [10.001535793s] [10.001535793s] END Mar 13 13:56:19 crc kubenswrapper[4898]: E0313 13:56:19.775682 4898 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 13 13:56:19 crc kubenswrapper[4898]: W0313 13:56:19.941383 4898 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 13 13:56:19 crc kubenswrapper[4898]: I0313 13:56:19.941513 4898 trace.go:236] Trace[1619469183]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (13-Mar-2026 13:56:09.939) (total time: 10001ms): Mar 13 13:56:19 crc kubenswrapper[4898]: Trace[1619469183]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (13:56:19.941) Mar 13 
13:56:19 crc kubenswrapper[4898]: Trace[1619469183]: [10.001902123s] [10.001902123s] END Mar 13 13:56:19 crc kubenswrapper[4898]: E0313 13:56:19.941550 4898 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 13 13:56:20 crc kubenswrapper[4898]: W0313 13:56:20.463817 4898 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 13 13:56:20 crc kubenswrapper[4898]: I0313 13:56:20.463959 4898 trace.go:236] Trace[370143304]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (13-Mar-2026 13:56:10.461) (total time: 10002ms): Mar 13 13:56:20 crc kubenswrapper[4898]: Trace[370143304]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (13:56:20.463) Mar 13 13:56:20 crc kubenswrapper[4898]: Trace[370143304]: [10.002052266s] [10.002052266s] END Mar 13 13:56:20 crc kubenswrapper[4898]: E0313 13:56:20.463984 4898 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 13 13:56:20 crc kubenswrapper[4898]: E0313 13:56:20.939631 4898 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post 
\"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:20Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 13:56:20 crc kubenswrapper[4898]: E0313 13:56:20.941044 4898 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:20Z is after 2026-02-23T05:33:13Z" node="crc" Mar 13 13:56:20 crc kubenswrapper[4898]: E0313 13:56:20.941561 4898 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:20Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 13 13:56:20 crc kubenswrapper[4898]: E0313 13:56:20.951774 4898 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:20Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189c6b2655e59fd5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:05.671174101 +0000 UTC m=+0.672762380,LastTimestamp:2026-03-13 13:56:05.671174101 +0000 UTC m=+0.672762380,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:56:20 crc kubenswrapper[4898]: I0313 13:56:20.955210 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:20Z is after 2026-02-23T05:33:13Z Mar 13 13:56:20 crc kubenswrapper[4898]: I0313 13:56:20.964699 4898 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 13 13:56:20 crc kubenswrapper[4898]: I0313 13:56:20.964801 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 13 13:56:20 crc kubenswrapper[4898]: I0313 13:56:20.976725 4898 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 13 13:56:20 crc kubenswrapper[4898]: I0313 13:56:20.976818 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" 
output="HTTP probe failed with statuscode: 403" Mar 13 13:56:21 crc kubenswrapper[4898]: I0313 13:56:21.675807 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:21Z is after 2026-02-23T05:33:13Z Mar 13 13:56:21 crc kubenswrapper[4898]: I0313 13:56:21.870592 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 13 13:56:21 crc kubenswrapper[4898]: I0313 13:56:21.873099 4898 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a280781bd4308bb8d58704c762393be7b1043a48c9bcf32e578e39be0a70478d" exitCode=255 Mar 13 13:56:21 crc kubenswrapper[4898]: I0313 13:56:21.873160 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a280781bd4308bb8d58704c762393be7b1043a48c9bcf32e578e39be0a70478d"} Mar 13 13:56:21 crc kubenswrapper[4898]: I0313 13:56:21.873390 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:21 crc kubenswrapper[4898]: I0313 13:56:21.874329 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:21 crc kubenswrapper[4898]: I0313 13:56:21.874366 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:21 crc kubenswrapper[4898]: I0313 13:56:21.874379 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:21 crc kubenswrapper[4898]: I0313 13:56:21.875077 
4898 scope.go:117] "RemoveContainer" containerID="a280781bd4308bb8d58704c762393be7b1043a48c9bcf32e578e39be0a70478d" Mar 13 13:56:22 crc kubenswrapper[4898]: I0313 13:56:22.679272 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:22Z is after 2026-02-23T05:33:13Z Mar 13 13:56:22 crc kubenswrapper[4898]: I0313 13:56:22.878335 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 13 13:56:22 crc kubenswrapper[4898]: I0313 13:56:22.880484 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5bf772817348d67609841cf984af05178c10e8f4fba932917ec33aa63fd1028e"} Mar 13 13:56:22 crc kubenswrapper[4898]: I0313 13:56:22.880668 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:22 crc kubenswrapper[4898]: I0313 13:56:22.882011 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:22 crc kubenswrapper[4898]: I0313 13:56:22.882162 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:22 crc kubenswrapper[4898]: I0313 13:56:22.882252 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:23 crc kubenswrapper[4898]: I0313 13:56:23.677427 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:23Z is after 2026-02-23T05:33:13Z Mar 13 13:56:23 crc kubenswrapper[4898]: W0313 13:56:23.792348 4898 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:23Z is after 2026-02-23T05:33:13Z Mar 13 13:56:23 crc kubenswrapper[4898]: E0313 13:56:23.792481 4898 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:23Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 13:56:23 crc kubenswrapper[4898]: I0313 13:56:23.885556 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 13 13:56:23 crc kubenswrapper[4898]: I0313 13:56:23.886597 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 13 13:56:23 crc kubenswrapper[4898]: I0313 13:56:23.889233 4898 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5bf772817348d67609841cf984af05178c10e8f4fba932917ec33aa63fd1028e" exitCode=255 Mar 13 13:56:23 crc kubenswrapper[4898]: I0313 13:56:23.889320 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"5bf772817348d67609841cf984af05178c10e8f4fba932917ec33aa63fd1028e"} Mar 13 13:56:23 crc kubenswrapper[4898]: I0313 13:56:23.889429 4898 scope.go:117] "RemoveContainer" containerID="a280781bd4308bb8d58704c762393be7b1043a48c9bcf32e578e39be0a70478d" Mar 13 13:56:23 crc kubenswrapper[4898]: I0313 13:56:23.889974 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:23 crc kubenswrapper[4898]: I0313 13:56:23.892842 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:23 crc kubenswrapper[4898]: I0313 13:56:23.892955 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:23 crc kubenswrapper[4898]: I0313 13:56:23.892987 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:23 crc kubenswrapper[4898]: I0313 13:56:23.894756 4898 scope.go:117] "RemoveContainer" containerID="5bf772817348d67609841cf984af05178c10e8f4fba932917ec33aa63fd1028e" Mar 13 13:56:23 crc kubenswrapper[4898]: E0313 13:56:23.895431 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 13:56:24 crc kubenswrapper[4898]: W0313 13:56:24.500019 4898 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-13T13:56:24Z is after 2026-02-23T05:33:13Z Mar 13 13:56:24 crc kubenswrapper[4898]: E0313 13:56:24.500124 4898 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:24Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 13:56:24 crc kubenswrapper[4898]: I0313 13:56:24.501045 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 13:56:24 crc kubenswrapper[4898]: W0313 13:56:24.599817 4898 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:24Z is after 2026-02-23T05:33:13Z Mar 13 13:56:24 crc kubenswrapper[4898]: E0313 13:56:24.599963 4898 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:24Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 13:56:24 crc kubenswrapper[4898]: I0313 13:56:24.674867 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-13T13:56:24Z is after 2026-02-23T05:33:13Z Mar 13 13:56:24 crc kubenswrapper[4898]: I0313 13:56:24.893839 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 13 13:56:24 crc kubenswrapper[4898]: I0313 13:56:24.896198 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:24 crc kubenswrapper[4898]: I0313 13:56:24.897271 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:24 crc kubenswrapper[4898]: I0313 13:56:24.897355 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:24 crc kubenswrapper[4898]: I0313 13:56:24.897413 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:24 crc kubenswrapper[4898]: I0313 13:56:24.898613 4898 scope.go:117] "RemoveContainer" containerID="5bf772817348d67609841cf984af05178c10e8f4fba932917ec33aa63fd1028e" Mar 13 13:56:24 crc kubenswrapper[4898]: E0313 13:56:24.899082 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 13:56:24 crc kubenswrapper[4898]: I0313 13:56:24.902731 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 13:56:24 crc kubenswrapper[4898]: I0313 13:56:24.926793 4898 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller 
namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 13:56:24 crc kubenswrapper[4898]: I0313 13:56:24.926995 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 13:56:24 crc kubenswrapper[4898]: I0313 13:56:24.972024 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 13:56:25 crc kubenswrapper[4898]: I0313 13:56:25.674981 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:25Z is after 2026-02-23T05:33:13Z Mar 13 13:56:25 crc kubenswrapper[4898]: E0313 13:56:25.840651 4898 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 13:56:25 crc kubenswrapper[4898]: I0313 13:56:25.898852 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:25 crc kubenswrapper[4898]: I0313 13:56:25.899930 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:25 crc kubenswrapper[4898]: I0313 13:56:25.900002 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 
13:56:25 crc kubenswrapper[4898]: I0313 13:56:25.900013 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:25 crc kubenswrapper[4898]: I0313 13:56:25.900517 4898 scope.go:117] "RemoveContainer" containerID="5bf772817348d67609841cf984af05178c10e8f4fba932917ec33aa63fd1028e" Mar 13 13:56:25 crc kubenswrapper[4898]: E0313 13:56:25.900683 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 13:56:26 crc kubenswrapper[4898]: I0313 13:56:26.399208 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 13:56:26 crc kubenswrapper[4898]: W0313 13:56:26.471773 4898 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:26Z is after 2026-02-23T05:33:13Z Mar 13 13:56:26 crc kubenswrapper[4898]: E0313 13:56:26.471934 4898 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:26Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 13:56:26 crc kubenswrapper[4898]: I0313 13:56:26.674653 4898 csi_plugin.go:884] Failed to contact 
API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:26Z is after 2026-02-23T05:33:13Z Mar 13 13:56:26 crc kubenswrapper[4898]: I0313 13:56:26.901939 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:26 crc kubenswrapper[4898]: I0313 13:56:26.903275 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:26 crc kubenswrapper[4898]: I0313 13:56:26.903325 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:26 crc kubenswrapper[4898]: I0313 13:56:26.903344 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:26 crc kubenswrapper[4898]: I0313 13:56:26.904195 4898 scope.go:117] "RemoveContainer" containerID="5bf772817348d67609841cf984af05178c10e8f4fba932917ec33aa63fd1028e" Mar 13 13:56:26 crc kubenswrapper[4898]: E0313 13:56:26.904479 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 13:56:27 crc kubenswrapper[4898]: I0313 13:56:27.342185 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:27 crc kubenswrapper[4898]: I0313 13:56:27.343863 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:27 crc kubenswrapper[4898]: 
I0313 13:56:27.343988 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:27 crc kubenswrapper[4898]: I0313 13:56:27.344015 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:27 crc kubenswrapper[4898]: I0313 13:56:27.344068 4898 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 13:56:27 crc kubenswrapper[4898]: E0313 13:56:27.348247 4898 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:27Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 13 13:56:27 crc kubenswrapper[4898]: E0313 13:56:27.351728 4898 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:27Z is after 2026-02-23T05:33:13Z" node="crc" Mar 13 13:56:27 crc kubenswrapper[4898]: I0313 13:56:27.686090 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:27Z is after 2026-02-23T05:33:13Z Mar 13 13:56:27 crc kubenswrapper[4898]: I0313 13:56:27.907641 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:27 crc kubenswrapper[4898]: I0313 13:56:27.910158 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:27 crc kubenswrapper[4898]: 
I0313 13:56:27.910234 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:27 crc kubenswrapper[4898]: I0313 13:56:27.910254 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:27 crc kubenswrapper[4898]: I0313 13:56:27.911346 4898 scope.go:117] "RemoveContainer" containerID="5bf772817348d67609841cf984af05178c10e8f4fba932917ec33aa63fd1028e" Mar 13 13:56:27 crc kubenswrapper[4898]: E0313 13:56:27.911674 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 13:56:28 crc kubenswrapper[4898]: I0313 13:56:28.677601 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:28Z is after 2026-02-23T05:33:13Z Mar 13 13:56:29 crc kubenswrapper[4898]: I0313 13:56:29.206465 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 13 13:56:29 crc kubenswrapper[4898]: I0313 13:56:29.206792 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:29 crc kubenswrapper[4898]: I0313 13:56:29.208534 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:29 crc kubenswrapper[4898]: I0313 13:56:29.208600 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 13:56:29 crc kubenswrapper[4898]: I0313 13:56:29.208624 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:29 crc kubenswrapper[4898]: I0313 13:56:29.225167 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 13 13:56:29 crc kubenswrapper[4898]: I0313 13:56:29.658967 4898 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 13 13:56:29 crc kubenswrapper[4898]: E0313 13:56:29.665518 4898 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:29Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 13:56:29 crc kubenswrapper[4898]: I0313 13:56:29.676789 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:29Z is after 2026-02-23T05:33:13Z Mar 13 13:56:29 crc kubenswrapper[4898]: I0313 13:56:29.914220 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:29 crc kubenswrapper[4898]: I0313 13:56:29.915447 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:29 crc kubenswrapper[4898]: I0313 13:56:29.915653 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 
13:56:29 crc kubenswrapper[4898]: I0313 13:56:29.915808 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:30 crc kubenswrapper[4898]: I0313 13:56:30.675313 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:30Z is after 2026-02-23T05:33:13Z Mar 13 13:56:30 crc kubenswrapper[4898]: E0313 13:56:30.957854 4898 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:30Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189c6b2655e59fd5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:05.671174101 +0000 UTC m=+0.672762380,LastTimestamp:2026-03-13 13:56:05.671174101 +0000 UTC m=+0.672762380,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:56:31 crc kubenswrapper[4898]: I0313 13:56:31.676968 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:31Z is after 2026-02-23T05:33:13Z Mar 13 13:56:31 crc kubenswrapper[4898]: W0313 13:56:31.902271 4898 
reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:31Z is after 2026-02-23T05:33:13Z Mar 13 13:56:31 crc kubenswrapper[4898]: E0313 13:56:31.902443 4898 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:31Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 13:56:32 crc kubenswrapper[4898]: I0313 13:56:32.677348 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:32Z is after 2026-02-23T05:33:13Z Mar 13 13:56:33 crc kubenswrapper[4898]: I0313 13:56:33.678353 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:33Z is after 2026-02-23T05:33:13Z Mar 13 13:56:34 crc kubenswrapper[4898]: I0313 13:56:34.352890 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:34 crc kubenswrapper[4898]: E0313 13:56:34.354696 4898 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:34Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 13 13:56:34 crc kubenswrapper[4898]: I0313 13:56:34.355485 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:34 crc kubenswrapper[4898]: I0313 13:56:34.355528 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:34 crc kubenswrapper[4898]: I0313 13:56:34.355544 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:34 crc kubenswrapper[4898]: I0313 13:56:34.355580 4898 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 13:56:34 crc kubenswrapper[4898]: E0313 13:56:34.359405 4898 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:34Z is after 2026-02-23T05:33:13Z" node="crc" Mar 13 13:56:34 crc kubenswrapper[4898]: W0313 13:56:34.571216 4898 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:34Z is after 2026-02-23T05:33:13Z Mar 13 13:56:34 crc kubenswrapper[4898]: E0313 13:56:34.571362 4898 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:34Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 13:56:34 crc kubenswrapper[4898]: I0313 13:56:34.676464 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:34Z is after 2026-02-23T05:33:13Z Mar 13 13:56:34 crc kubenswrapper[4898]: I0313 13:56:34.925531 4898 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 13:56:34 crc kubenswrapper[4898]: I0313 13:56:34.925619 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 13:56:34 crc kubenswrapper[4898]: I0313 13:56:34.925692 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 13:56:34 crc kubenswrapper[4898]: I0313 13:56:34.925944 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:34 crc kubenswrapper[4898]: I0313 13:56:34.927483 4898 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:34 crc kubenswrapper[4898]: I0313 13:56:34.927536 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:34 crc kubenswrapper[4898]: I0313 13:56:34.927547 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:34 crc kubenswrapper[4898]: I0313 13:56:34.928242 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"11d7141d32caadeaeb1d436b8ea9c8ad8a7002577c20befa44b7c64ad366bee3"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 13 13:56:34 crc kubenswrapper[4898]: I0313 13:56:34.928466 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://11d7141d32caadeaeb1d436b8ea9c8ad8a7002577c20befa44b7c64ad366bee3" gracePeriod=30 Mar 13 13:56:35 crc kubenswrapper[4898]: W0313 13:56:35.207346 4898 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:35Z is after 2026-02-23T05:33:13Z Mar 13 13:56:35 crc kubenswrapper[4898]: E0313 13:56:35.207469 4898 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get 
\"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:35Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 13:56:35 crc kubenswrapper[4898]: I0313 13:56:35.677186 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:35Z is after 2026-02-23T05:33:13Z Mar 13 13:56:35 crc kubenswrapper[4898]: E0313 13:56:35.841486 4898 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 13:56:35 crc kubenswrapper[4898]: I0313 13:56:35.945148 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 13 13:56:35 crc kubenswrapper[4898]: I0313 13:56:35.946431 4898 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="11d7141d32caadeaeb1d436b8ea9c8ad8a7002577c20befa44b7c64ad366bee3" exitCode=255 Mar 13 13:56:35 crc kubenswrapper[4898]: I0313 13:56:35.946495 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"11d7141d32caadeaeb1d436b8ea9c8ad8a7002577c20befa44b7c64ad366bee3"} Mar 13 13:56:35 crc kubenswrapper[4898]: I0313 13:56:35.946540 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ac7ea5df99bb3004d7582910739297c1ee1913de5399a5be9391bbf4c96be32f"} Mar 13 13:56:36 crc kubenswrapper[4898]: I0313 13:56:36.678979 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:36Z is after 2026-02-23T05:33:13Z Mar 13 13:56:36 crc kubenswrapper[4898]: I0313 13:56:36.950306 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:36 crc kubenswrapper[4898]: I0313 13:56:36.951690 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:36 crc kubenswrapper[4898]: I0313 13:56:36.951764 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:36 crc kubenswrapper[4898]: I0313 13:56:36.951793 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:37 crc kubenswrapper[4898]: W0313 13:56:37.354081 4898 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:37Z is after 2026-02-23T05:33:13Z Mar 13 13:56:37 crc kubenswrapper[4898]: E0313 13:56:37.354262 4898 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-13T13:56:37Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 13:56:37 crc kubenswrapper[4898]: I0313 13:56:37.678352 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:37Z is after 2026-02-23T05:33:13Z Mar 13 13:56:38 crc kubenswrapper[4898]: I0313 13:56:38.678103 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:38Z is after 2026-02-23T05:33:13Z Mar 13 13:56:39 crc kubenswrapper[4898]: I0313 13:56:39.677330 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:39Z is after 2026-02-23T05:33:13Z Mar 13 13:56:40 crc kubenswrapper[4898]: I0313 13:56:40.677262 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:40Z is after 2026-02-23T05:33:13Z Mar 13 13:56:40 crc kubenswrapper[4898]: I0313 13:56:40.739568 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:40 crc kubenswrapper[4898]: I0313 13:56:40.741204 4898 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:40 crc kubenswrapper[4898]: I0313 13:56:40.741388 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:40 crc kubenswrapper[4898]: I0313 13:56:40.741501 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:40 crc kubenswrapper[4898]: I0313 13:56:40.742365 4898 scope.go:117] "RemoveContainer" containerID="5bf772817348d67609841cf984af05178c10e8f4fba932917ec33aa63fd1028e" Mar 13 13:56:40 crc kubenswrapper[4898]: E0313 13:56:40.962742 4898 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:40Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189c6b2655e59fd5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:05.671174101 +0000 UTC m=+0.672762380,LastTimestamp:2026-03-13 13:56:05.671174101 +0000 UTC m=+0.672762380,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:56:40 crc kubenswrapper[4898]: I0313 13:56:40.964961 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 13 13:56:40 crc kubenswrapper[4898]: I0313 13:56:40.968153 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f9e87fbf50c4eff43d663444f50a95a72aca9900fc77a3d96f9c7cc7f66196c5"} Mar 13 13:56:40 crc kubenswrapper[4898]: I0313 13:56:40.968335 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:40 crc kubenswrapper[4898]: I0313 13:56:40.969623 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:40 crc kubenswrapper[4898]: I0313 13:56:40.969695 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:40 crc kubenswrapper[4898]: I0313 13:56:40.969722 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:41 crc kubenswrapper[4898]: E0313 13:56:41.357660 4898 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:41Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 13 13:56:41 crc kubenswrapper[4898]: I0313 13:56:41.359940 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:41 crc kubenswrapper[4898]: I0313 13:56:41.361499 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:41 crc kubenswrapper[4898]: I0313 13:56:41.361534 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:41 crc kubenswrapper[4898]: I0313 13:56:41.361543 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:41 crc kubenswrapper[4898]: I0313 13:56:41.361565 4898 
kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 13:56:41 crc kubenswrapper[4898]: E0313 13:56:41.364375 4898 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:41Z is after 2026-02-23T05:33:13Z" node="crc" Mar 13 13:56:41 crc kubenswrapper[4898]: I0313 13:56:41.676066 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:41Z is after 2026-02-23T05:33:13Z Mar 13 13:56:41 crc kubenswrapper[4898]: I0313 13:56:41.925491 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 13:56:41 crc kubenswrapper[4898]: I0313 13:56:41.925776 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:41 crc kubenswrapper[4898]: I0313 13:56:41.927779 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:41 crc kubenswrapper[4898]: I0313 13:56:41.927830 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:41 crc kubenswrapper[4898]: I0313 13:56:41.927841 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:41 crc kubenswrapper[4898]: I0313 13:56:41.974416 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 13 13:56:41 crc kubenswrapper[4898]: 
I0313 13:56:41.975326 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 13 13:56:41 crc kubenswrapper[4898]: I0313 13:56:41.978134 4898 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f9e87fbf50c4eff43d663444f50a95a72aca9900fc77a3d96f9c7cc7f66196c5" exitCode=255 Mar 13 13:56:41 crc kubenswrapper[4898]: I0313 13:56:41.978202 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f9e87fbf50c4eff43d663444f50a95a72aca9900fc77a3d96f9c7cc7f66196c5"} Mar 13 13:56:41 crc kubenswrapper[4898]: I0313 13:56:41.978270 4898 scope.go:117] "RemoveContainer" containerID="5bf772817348d67609841cf984af05178c10e8f4fba932917ec33aa63fd1028e" Mar 13 13:56:41 crc kubenswrapper[4898]: I0313 13:56:41.978518 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:41 crc kubenswrapper[4898]: I0313 13:56:41.979888 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:41 crc kubenswrapper[4898]: I0313 13:56:41.979965 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:41 crc kubenswrapper[4898]: I0313 13:56:41.979984 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:41 crc kubenswrapper[4898]: I0313 13:56:41.981121 4898 scope.go:117] "RemoveContainer" containerID="f9e87fbf50c4eff43d663444f50a95a72aca9900fc77a3d96f9c7cc7f66196c5" Mar 13 13:56:41 crc kubenswrapper[4898]: E0313 13:56:41.981456 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" 
with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 13:56:42 crc kubenswrapper[4898]: I0313 13:56:42.678234 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:42Z is after 2026-02-23T05:33:13Z Mar 13 13:56:42 crc kubenswrapper[4898]: I0313 13:56:42.983516 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 13 13:56:43 crc kubenswrapper[4898]: I0313 13:56:43.528334 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 13:56:43 crc kubenswrapper[4898]: I0313 13:56:43.528629 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:43 crc kubenswrapper[4898]: I0313 13:56:43.530506 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:43 crc kubenswrapper[4898]: I0313 13:56:43.530576 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:43 crc kubenswrapper[4898]: I0313 13:56:43.530588 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:43 crc kubenswrapper[4898]: I0313 13:56:43.677216 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:43Z is after 2026-02-23T05:33:13Z Mar 13 13:56:44 crc kubenswrapper[4898]: I0313 13:56:44.675720 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:44Z is after 2026-02-23T05:33:13Z Mar 13 13:56:44 crc kubenswrapper[4898]: I0313 13:56:44.925603 4898 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 13:56:44 crc kubenswrapper[4898]: I0313 13:56:44.925734 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 13:56:44 crc kubenswrapper[4898]: I0313 13:56:44.971245 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 13:56:44 crc kubenswrapper[4898]: I0313 13:56:44.971478 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:44 crc kubenswrapper[4898]: I0313 13:56:44.973084 4898 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:44 crc kubenswrapper[4898]: I0313 13:56:44.973154 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:44 crc kubenswrapper[4898]: I0313 13:56:44.973174 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:44 crc kubenswrapper[4898]: I0313 13:56:44.974162 4898 scope.go:117] "RemoveContainer" containerID="f9e87fbf50c4eff43d663444f50a95a72aca9900fc77a3d96f9c7cc7f66196c5" Mar 13 13:56:44 crc kubenswrapper[4898]: E0313 13:56:44.974562 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 13:56:45 crc kubenswrapper[4898]: I0313 13:56:45.676863 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:45Z is after 2026-02-23T05:33:13Z Mar 13 13:56:45 crc kubenswrapper[4898]: E0313 13:56:45.842305 4898 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 13:56:46 crc kubenswrapper[4898]: I0313 13:56:46.399170 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 13:56:46 crc kubenswrapper[4898]: I0313 13:56:46.399482 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:46 crc 
kubenswrapper[4898]: I0313 13:56:46.401228 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:46 crc kubenswrapper[4898]: I0313 13:56:46.401296 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:46 crc kubenswrapper[4898]: I0313 13:56:46.401310 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:46 crc kubenswrapper[4898]: I0313 13:56:46.402118 4898 scope.go:117] "RemoveContainer" containerID="f9e87fbf50c4eff43d663444f50a95a72aca9900fc77a3d96f9c7cc7f66196c5" Mar 13 13:56:46 crc kubenswrapper[4898]: E0313 13:56:46.402352 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 13:56:46 crc kubenswrapper[4898]: I0313 13:56:46.603123 4898 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 13 13:56:46 crc kubenswrapper[4898]: E0313 13:56:46.609446 4898 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:46Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 13:56:46 crc kubenswrapper[4898]: E0313 13:56:46.610774 4898 certificate_manager.go:440] "Unhandled Error" 
err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Mar 13 13:56:46 crc kubenswrapper[4898]: I0313 13:56:46.676280 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:46Z is after 2026-02-23T05:33:13Z Mar 13 13:56:47 crc kubenswrapper[4898]: I0313 13:56:47.678122 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:47Z is after 2026-02-23T05:33:13Z Mar 13 13:56:48 crc kubenswrapper[4898]: E0313 13:56:48.362353 4898 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:48Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 13 13:56:48 crc kubenswrapper[4898]: I0313 13:56:48.365619 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:48 crc kubenswrapper[4898]: I0313 13:56:48.367754 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:48 crc kubenswrapper[4898]: I0313 13:56:48.367828 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:48 crc kubenswrapper[4898]: I0313 13:56:48.367855 4898 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:48 crc kubenswrapper[4898]: I0313 13:56:48.367935 4898 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 13:56:48 crc kubenswrapper[4898]: E0313 13:56:48.373942 4898 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:48Z is after 2026-02-23T05:33:13Z" node="crc" Mar 13 13:56:48 crc kubenswrapper[4898]: I0313 13:56:48.677093 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:48Z is after 2026-02-23T05:33:13Z Mar 13 13:56:49 crc kubenswrapper[4898]: I0313 13:56:49.677937 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:49Z is after 2026-02-23T05:33:13Z Mar 13 13:56:50 crc kubenswrapper[4898]: I0313 13:56:50.675865 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:50Z is after 2026-02-23T05:33:13Z Mar 13 13:56:50 crc kubenswrapper[4898]: E0313 13:56:50.967635 4898 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:50Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189c6b2655e59fd5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:05.671174101 +0000 UTC m=+0.672762380,LastTimestamp:2026-03-13 13:56:05.671174101 +0000 UTC m=+0.672762380,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:56:51 crc kubenswrapper[4898]: I0313 13:56:51.677137 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:51Z is after 2026-02-23T05:33:13Z Mar 13 13:56:51 crc kubenswrapper[4898]: W0313 13:56:51.958385 4898 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:51Z is after 2026-02-23T05:33:13Z Mar 13 13:56:51 crc kubenswrapper[4898]: E0313 13:56:51.958502 4898 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:51Z is after 2026-02-23T05:33:13Z" 
logger="UnhandledError" Mar 13 13:56:52 crc kubenswrapper[4898]: I0313 13:56:52.675697 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:52Z is after 2026-02-23T05:33:13Z Mar 13 13:56:53 crc kubenswrapper[4898]: I0313 13:56:53.677706 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:53Z is after 2026-02-23T05:33:13Z Mar 13 13:56:53 crc kubenswrapper[4898]: W0313 13:56:53.800618 4898 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:53Z is after 2026-02-23T05:33:13Z Mar 13 13:56:53 crc kubenswrapper[4898]: E0313 13:56:53.800743 4898 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:53Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 13:56:54 crc kubenswrapper[4898]: I0313 13:56:54.678192 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:54Z is after 2026-02-23T05:33:13Z Mar 13 13:56:54 crc kubenswrapper[4898]: I0313 13:56:54.926283 4898 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 13:56:54 crc kubenswrapper[4898]: I0313 13:56:54.926421 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 13:56:55 crc kubenswrapper[4898]: E0313 13:56:55.366752 4898 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:55Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 13 13:56:55 crc kubenswrapper[4898]: I0313 13:56:55.375023 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:55 crc kubenswrapper[4898]: I0313 13:56:55.376292 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:55 crc kubenswrapper[4898]: I0313 13:56:55.376330 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:55 crc kubenswrapper[4898]: I0313 13:56:55.376340 4898 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Mar 13 13:56:55 crc kubenswrapper[4898]: I0313 13:56:55.376365 4898 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 13:56:55 crc kubenswrapper[4898]: E0313 13:56:55.381068 4898 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:55Z is after 2026-02-23T05:33:13Z" node="crc" Mar 13 13:56:55 crc kubenswrapper[4898]: I0313 13:56:55.677505 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:55Z is after 2026-02-23T05:33:13Z Mar 13 13:56:55 crc kubenswrapper[4898]: E0313 13:56:55.843051 4898 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 13:56:55 crc kubenswrapper[4898]: I0313 13:56:55.858124 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 13:56:55 crc kubenswrapper[4898]: I0313 13:56:55.858355 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:56:55 crc kubenswrapper[4898]: I0313 13:56:55.859814 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:56:55 crc kubenswrapper[4898]: I0313 13:56:55.859866 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:56:55 crc kubenswrapper[4898]: I0313 13:56:55.859882 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 13 13:56:55 crc kubenswrapper[4898]: W0313 13:56:55.903099 4898 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:55Z is after 2026-02-23T05:33:13Z Mar 13 13:56:55 crc kubenswrapper[4898]: E0313 13:56:55.903240 4898 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:55Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 13:56:56 crc kubenswrapper[4898]: I0313 13:56:56.675597 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:56Z is after 2026-02-23T05:33:13Z Mar 13 13:56:57 crc kubenswrapper[4898]: I0313 13:56:57.676684 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:57Z is after 2026-02-23T05:33:13Z Mar 13 13:56:58 crc kubenswrapper[4898]: I0313 13:56:58.676506 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:58Z is after 2026-02-23T05:33:13Z Mar 13 13:56:59 crc kubenswrapper[4898]: I0313 13:56:59.677889 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:56:59Z is after 2026-02-23T05:33:13Z Mar 13 13:57:00 crc kubenswrapper[4898]: W0313 13:57:00.047818 4898 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:00Z is after 2026-02-23T05:33:13Z Mar 13 13:57:00 crc kubenswrapper[4898]: E0313 13:57:00.047986 4898 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:00Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 13:57:00 crc kubenswrapper[4898]: I0313 13:57:00.681005 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 13:57:00 crc kubenswrapper[4898]: E0313 13:57:00.974450 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6b2655e59fd5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:05.671174101 +0000 UTC m=+0.672762380,LastTimestamp:2026-03-13 13:56:05.671174101 +0000 UTC m=+0.672762380,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:00 crc kubenswrapper[4898]: E0313 13:57:00.981411 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6b26598276b8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:05.731784376 +0000 UTC m=+0.733372615,LastTimestamp:2026-03-13 13:56:05.731784376 +0000 UTC m=+0.733372615,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:00 crc kubenswrapper[4898]: E0313 13:57:00.987772 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6b2659839ba8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:05.731859368 +0000 UTC m=+0.733447607,LastTimestamp:2026-03-13 13:56:05.731859368 +0000 UTC m=+0.733447607,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:00 crc kubenswrapper[4898]: E0313 13:57:00.994783 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6b265984ec76 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:05.73194559 +0000 UTC m=+0.733533829,LastTimestamp:2026-03-13 13:56:05.73194559 +0000 UTC m=+0.733533829,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.001047 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6b265fd0bfcc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across 
pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:05.837578188 +0000 UTC m=+0.839166467,LastTimestamp:2026-03-13 13:56:05.837578188 +0000 UTC m=+0.839166467,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.006787 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6b26598276b8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6b26598276b8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:05.731784376 +0000 UTC m=+0.733372615,LastTimestamp:2026-03-13 13:56:05.841117689 +0000 UTC m=+0.842705938,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.011846 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6b2659839ba8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6b2659839ba8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:05.731859368 +0000 UTC m=+0.733447607,LastTimestamp:2026-03-13 
13:56:05.841131879 +0000 UTC m=+0.842720128,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.018040 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6b265984ec76\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6b265984ec76 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:05.73194559 +0000 UTC m=+0.733533829,LastTimestamp:2026-03-13 13:56:05.84114297 +0000 UTC m=+0.842731219,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.024831 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6b26598276b8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6b26598276b8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:05.731784376 +0000 UTC m=+0.733372615,LastTimestamp:2026-03-13 13:56:05.842279859 +0000 UTC m=+0.843868108,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.030638 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6b2659839ba8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6b2659839ba8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:05.731859368 +0000 UTC m=+0.733447607,LastTimestamp:2026-03-13 13:56:05.842291419 +0000 UTC m=+0.843879668,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.035430 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6b265984ec76\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6b265984ec76 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:05.73194559 +0000 UTC m=+0.733533829,LastTimestamp:2026-03-13 13:56:05.84230087 +0000 UTC m=+0.843889119,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.040459 4898 event.go:359] 
"Server rejected event (will not retry!)" err="events \"crc.189c6b26598276b8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6b26598276b8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:05.731784376 +0000 UTC m=+0.733372615,LastTimestamp:2026-03-13 13:56:05.843297825 +0000 UTC m=+0.844886064,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.047257 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6b2659839ba8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6b2659839ba8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:05.731859368 +0000 UTC m=+0.733447607,LastTimestamp:2026-03-13 13:56:05.843321256 +0000 UTC m=+0.844909505,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.050368 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6b265984ec76\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API 
group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6b265984ec76 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:05.73194559 +0000 UTC m=+0.733533829,LastTimestamp:2026-03-13 13:56:05.843335006 +0000 UTC m=+0.844923255,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.052167 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6b26598276b8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6b26598276b8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:05.731784376 +0000 UTC m=+0.733372615,LastTimestamp:2026-03-13 13:56:05.846246212 +0000 UTC m=+0.847834541,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.057812 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6b2659839ba8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6b2659839ba8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:05.731859368 +0000 UTC m=+0.733447607,LastTimestamp:2026-03-13 13:56:05.846279852 +0000 UTC m=+0.847868131,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.063056 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6b265984ec76\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6b265984ec76 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:05.73194559 +0000 UTC m=+0.733533829,LastTimestamp:2026-03-13 13:56:05.846326464 +0000 UTC m=+0.847914753,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.067743 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6b26598276b8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6b26598276b8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:05.731784376 +0000 UTC m=+0.733372615,LastTimestamp:2026-03-13 13:56:05.846375975 +0000 UTC m=+0.847964284,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.074151 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6b2659839ba8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6b2659839ba8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:05.731859368 +0000 UTC m=+0.733447607,LastTimestamp:2026-03-13 13:56:05.846441437 +0000 UTC m=+0.848029706,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.080426 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6b265984ec76\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6b265984ec76 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:05.73194559 +0000 UTC 
m=+0.733533829,LastTimestamp:2026-03-13 13:56:05.846473377 +0000 UTC m=+0.848061656,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.086663 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6b26598276b8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6b26598276b8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:05.731784376 +0000 UTC m=+0.733372615,LastTimestamp:2026-03-13 13:56:05.849530876 +0000 UTC m=+0.851119165,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.093166 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6b2659839ba8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6b2659839ba8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:05.731859368 +0000 UTC m=+0.733447607,LastTimestamp:2026-03-13 13:56:05.849557337 +0000 UTC m=+0.851145626,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.099743 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6b265984ec76\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6b265984ec76 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:05.73194559 +0000 UTC m=+0.733533829,LastTimestamp:2026-03-13 13:56:05.849577918 +0000 UTC m=+0.851166197,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.105876 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6b26598276b8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6b26598276b8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:05.731784376 +0000 UTC m=+0.733372615,LastTimestamp:2026-03-13 13:56:05.851557679 +0000 UTC m=+0.853145968,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.109857 4898 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c6b2659839ba8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c6b2659839ba8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:05.731859368 +0000 UTC m=+0.733447607,LastTimestamp:2026-03-13 13:56:05.851581329 +0000 UTC m=+0.853169608,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.118177 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6b26789bb807 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:06.253533191 +0000 UTC m=+1.255121440,LastTimestamp:2026-03-13 13:56:06.253533191 +0000 UTC m=+1.255121440,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 
13:57:01.122494 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6b26789d5084 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:06.253637764 +0000 UTC m=+1.255226003,LastTimestamp:2026-03-13 13:56:06.253637764 +0000 UTC m=+1.255226003,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.127546 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6b2678f7e1c3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:06.259573187 +0000 UTC m=+1.261161426,LastTimestamp:2026-03-13 13:56:06.259573187 +0000 UTC m=+1.261161426,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.131643 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c6b26795f5ed5 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:06.266355413 +0000 UTC m=+1.267943652,LastTimestamp:2026-03-13 13:56:06.266355413 +0000 UTC m=+1.267943652,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.136532 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c6b267a9a36af openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:06.286988975 +0000 UTC m=+1.288577214,LastTimestamp:2026-03-13 13:56:06.286988975 +0000 UTC m=+1.288577214,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.140672 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6b269ce283a3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:06.862152611 +0000 UTC m=+1.863740890,LastTimestamp:2026-03-13 13:56:06.862152611 +0000 UTC m=+1.863740890,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.144574 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c6b269d151f8f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:06.865469327 +0000 UTC m=+1.867057576,LastTimestamp:2026-03-13 13:56:06.865469327 +0000 UTC m=+1.867057576,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.148406 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6b269d2401c6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:06.866444742 +0000 UTC m=+1.868032981,LastTimestamp:2026-03-13 13:56:06.866444742 +0000 UTC m=+1.868032981,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.150956 4898 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c6b269d63410d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:06.870589709 +0000 UTC m=+1.872177968,LastTimestamp:2026-03-13 13:56:06.870589709 +0000 UTC m=+1.872177968,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.154143 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6b269d7c6dac openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:06.872239532 +0000 UTC m=+1.873827781,LastTimestamp:2026-03-13 13:56:06.872239532 +0000 UTC m=+1.873827781,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.159194 4898 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6b269d7d4ad9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:06.872296153 +0000 UTC m=+1.873884432,LastTimestamp:2026-03-13 13:56:06.872296153 +0000 UTC m=+1.873884432,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.162684 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6b269da3aa8a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:06.874811018 +0000 UTC m=+1.876399287,LastTimestamp:2026-03-13 13:56:06.874811018 +0000 UTC 
m=+1.876399287,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.165914 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c6b269e34f892 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:06.884333714 +0000 UTC m=+1.885921993,LastTimestamp:2026-03-13 13:56:06.884333714 +0000 UTC m=+1.885921993,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.171408 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6b269e48ee78 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:06.885641848 +0000 UTC 
m=+1.887230097,LastTimestamp:2026-03-13 13:56:06.885641848 +0000 UTC m=+1.887230097,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.174987 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c6b269ebe40c3 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:06.893330627 +0000 UTC m=+1.894918876,LastTimestamp:2026-03-13 13:56:06.893330627 +0000 UTC m=+1.894918876,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.178891 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6b269ef83e89 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:06.897131145 +0000 UTC 
m=+1.898719394,LastTimestamp:2026-03-13 13:56:06.897131145 +0000 UTC m=+1.898719394,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.182082 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6b26b020814e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:07.18498235 +0000 UTC m=+2.186570589,LastTimestamp:2026-03-13 13:56:07.18498235 +0000 UTC m=+2.186570589,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.185598 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6b26b0f08c60 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:07.198616672 +0000 UTC m=+2.200204911,LastTimestamp:2026-03-13 13:56:07.198616672 +0000 UTC m=+2.200204911,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.189245 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6b26b10c9b0f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:07.200455439 +0000 UTC m=+2.202043678,LastTimestamp:2026-03-13 13:56:07.200455439 +0000 UTC m=+2.202043678,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.193112 4898 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6b26bc7950ad openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:07.392129197 +0000 UTC m=+2.393717436,LastTimestamp:2026-03-13 13:56:07.392129197 +0000 UTC m=+2.393717436,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.196338 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6b26bd37793e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:07.404591422 +0000 UTC m=+2.406179681,LastTimestamp:2026-03-13 13:56:07.404591422 +0000 UTC m=+2.406179681,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.199801 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6b26bd4fc2e5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:07.406183141 +0000 UTC m=+2.407771380,LastTimestamp:2026-03-13 13:56:07.406183141 +0000 UTC m=+2.407771380,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.203458 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6b26c758c092 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:07.57454453 +0000 UTC m=+2.576132809,LastTimestamp:2026-03-13 13:56:07.57454453 +0000 UTC m=+2.576132809,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.206415 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6b26c81e08ee openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:07.587473646 +0000 UTC m=+2.589061885,LastTimestamp:2026-03-13 13:56:07.587473646 +0000 UTC m=+2.589061885,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.209835 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c6b26d25ca1d4 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:07.75934818 +0000 UTC m=+2.760936449,LastTimestamp:2026-03-13 13:56:07.75934818 +0000 UTC m=+2.760936449,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.212955 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c6b26d26dfc2a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:07.760485418 +0000 UTC m=+2.762073657,LastTimestamp:2026-03-13 13:56:07.760485418 +0000 UTC 
m=+2.762073657,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.216319 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6b26d2e1251f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:07.768032543 +0000 UTC m=+2.769620822,LastTimestamp:2026-03-13 13:56:07.768032543 +0000 UTC m=+2.769620822,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.220161 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6b26d3727b44 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:07.777557316 +0000 UTC m=+2.779145595,LastTimestamp:2026-03-13 13:56:07.777557316 +0000 UTC m=+2.779145595,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.224622 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c6b26e0de7e48 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:08.002739784 +0000 UTC m=+3.004328023,LastTimestamp:2026-03-13 13:56:08.002739784 +0000 UTC m=+3.004328023,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.228603 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6b26e12a8344 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:08.007721796 +0000 UTC m=+3.009310035,LastTimestamp:2026-03-13 13:56:08.007721796 +0000 UTC m=+3.009310035,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.231723 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c6b26e133cfc1 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:08.008331201 +0000 UTC m=+3.009919430,LastTimestamp:2026-03-13 13:56:08.008331201 +0000 UTC m=+3.009919430,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.236144 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6b26e13b0031 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:08.008802353 +0000 UTC m=+3.010390592,LastTimestamp:2026-03-13 13:56:08.008802353 +0000 UTC m=+3.010390592,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.240028 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c6b26e153f1ef openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:08.010437103 +0000 UTC m=+3.012025342,LastTimestamp:2026-03-13 13:56:08.010437103 +0000 UTC m=+3.012025342,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.245542 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c6b26e16648f8 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:08.011639032 +0000 UTC m=+3.013227271,LastTimestamp:2026-03-13 13:56:08.011639032 +0000 UTC m=+3.013227271,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.249371 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c6b26e1fc2a62 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:08.021461602 +0000 UTC m=+3.023049841,LastTimestamp:2026-03-13 13:56:08.021461602 +0000 UTC m=+3.023049841,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc 
kubenswrapper[4898]: E0313 13:57:01.252938 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6b26e26971cb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:08.028623307 +0000 UTC m=+3.030211546,LastTimestamp:2026-03-13 13:56:08.028623307 +0000 UTC m=+3.030211546,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.257596 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6b26e27a1b4a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:08.029715274 +0000 UTC m=+3.031303513,LastTimestamp:2026-03-13 13:56:08.029715274 
+0000 UTC m=+3.031303513,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.262750 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6b26e3438c53 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:08.042916947 +0000 UTC m=+3.044505186,LastTimestamp:2026-03-13 13:56:08.042916947 +0000 UTC m=+3.044505186,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.268005 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c6b26eccc2288 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:08.20286324 +0000 UTC 
m=+3.204451479,LastTimestamp:2026-03-13 13:56:08.20286324 +0000 UTC m=+3.204451479,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.271741 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6b26eccf1744 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:08.203056964 +0000 UTC m=+3.204645203,LastTimestamp:2026-03-13 13:56:08.203056964 +0000 UTC m=+3.204645203,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.275958 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c6b26ee650b73 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container 
kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:08.229661555 +0000 UTC m=+3.231249794,LastTimestamp:2026-03-13 13:56:08.229661555 +0000 UTC m=+3.231249794,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.280377 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6b26ee668ae6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:08.229759718 +0000 UTC m=+3.231347977,LastTimestamp:2026-03-13 13:56:08.229759718 +0000 UTC m=+3.231347977,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.287410 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c6b26ee7ca2ef openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:08.231207663 +0000 UTC m=+3.232795912,LastTimestamp:2026-03-13 13:56:08.231207663 +0000 UTC m=+3.232795912,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.292733 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6b26ee867073 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:08.231850099 +0000 UTC m=+3.233438348,LastTimestamp:2026-03-13 13:56:08.231850099 +0000 UTC m=+3.233438348,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 
13:57:01.296990 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c6b26fb12bd82 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:08.442371458 +0000 UTC m=+3.443959697,LastTimestamp:2026-03-13 13:56:08.442371458 +0000 UTC m=+3.443959697,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.301584 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6b26fb2332a9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:08.443450025 +0000 UTC m=+3.445038264,LastTimestamp:2026-03-13 13:56:08.443450025 +0000 UTC m=+3.445038264,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.307363 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c6b26fbc3b29e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:08.453968542 +0000 UTC m=+3.455556791,LastTimestamp:2026-03-13 13:56:08.453968542 +0000 UTC m=+3.455556791,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.312202 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6b26fc06d892 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 
13:56:08.45836917 +0000 UTC m=+3.459957409,LastTimestamp:2026-03-13 13:56:08.45836917 +0000 UTC m=+3.459957409,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.316599 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6b26fc26c94b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:08.460462411 +0000 UTC m=+3.462050650,LastTimestamp:2026-03-13 13:56:08.460462411 +0000 UTC m=+3.462050650,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.321105 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6b27057d8a46 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:08.617142854 +0000 UTC m=+3.618731083,LastTimestamp:2026-03-13 13:56:08.617142854 +0000 UTC m=+3.618731083,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.324800 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6b2706188a4e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:08.627300942 +0000 UTC m=+3.628889181,LastTimestamp:2026-03-13 13:56:08.627300942 +0000 UTC m=+3.628889181,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.331650 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189c6b27062e52a7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:08.628728487 +0000 UTC m=+3.630316726,LastTimestamp:2026-03-13 13:56:08.628728487 +0000 UTC m=+3.630316726,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.336308 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6b270fd870b3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:08.790872243 +0000 UTC m=+3.792460482,LastTimestamp:2026-03-13 13:56:08.790872243 +0000 UTC m=+3.792460482,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.341751 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6b271398acd1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:08.853802193 +0000 UTC m=+3.855390432,LastTimestamp:2026-03-13 13:56:08.853802193 +0000 UTC m=+3.855390432,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.346754 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6b27145e2f76 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:08.86674623 +0000 UTC m=+3.868334469,LastTimestamp:2026-03-13 
13:56:08.86674623 +0000 UTC m=+3.868334469,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.352007 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6b271a6bceea openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:08.968302314 +0000 UTC m=+3.969890553,LastTimestamp:2026-03-13 13:56:08.968302314 +0000 UTC m=+3.969890553,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.357409 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6b271aff9aff openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:08.977988351 +0000 UTC m=+3.979576590,LastTimestamp:2026-03-13 13:56:08.977988351 +0000 UTC 
m=+3.979576590,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.364577 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6b274cb31f19 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:09.811836697 +0000 UTC m=+4.813424956,LastTimestamp:2026-03-13 13:56:09.811836697 +0000 UTC m=+4.813424956,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.369983 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6b2758f9e5cc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:10.017801676 +0000 UTC 
m=+5.019389915,LastTimestamp:2026-03-13 13:56:10.017801676 +0000 UTC m=+5.019389915,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.374716 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6b2759aa1106 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:10.029347078 +0000 UTC m=+5.030935317,LastTimestamp:2026-03-13 13:56:10.029347078 +0000 UTC m=+5.030935317,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.380009 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6b2759b96769 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:10.030352233 +0000 UTC m=+5.031940482,LastTimestamp:2026-03-13 13:56:10.030352233 +0000 UTC m=+5.031940482,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.384132 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6b276776274a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:10.26082593 +0000 UTC m=+5.262414169,LastTimestamp:2026-03-13 13:56:10.26082593 +0000 UTC m=+5.262414169,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.388711 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6b2768568eca openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:10.27553249 +0000 UTC 
m=+5.277120729,LastTimestamp:2026-03-13 13:56:10.27553249 +0000 UTC m=+5.277120729,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.392785 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6b2768712977 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:10.277276023 +0000 UTC m=+5.278864272,LastTimestamp:2026-03-13 13:56:10.277276023 +0000 UTC m=+5.278864272,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.396996 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6b2773f6aee1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container 
etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:10.470575841 +0000 UTC m=+5.472164090,LastTimestamp:2026-03-13 13:56:10.470575841 +0000 UTC m=+5.472164090,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.401372 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6b2774ccdca9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:10.484612265 +0000 UTC m=+5.486200544,LastTimestamp:2026-03-13 13:56:10.484612265 +0000 UTC m=+5.486200544,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.405168 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6b2774e4daa5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:10.486184613 +0000 UTC m=+5.487772862,LastTimestamp:2026-03-13 13:56:10.486184613 +0000 UTC m=+5.487772862,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.409121 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6b27851efca7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:10.758429863 +0000 UTC m=+5.760018102,LastTimestamp:2026-03-13 13:56:10.758429863 +0000 UTC m=+5.760018102,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.413879 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6b278625f39a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:10.775663514 +0000 UTC m=+5.777251773,LastTimestamp:2026-03-13 13:56:10.775663514 +0000 UTC m=+5.777251773,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.417536 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6b27863c6c02 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:10.77713613 +0000 UTC m=+5.778724369,LastTimestamp:2026-03-13 13:56:10.77713613 +0000 UTC m=+5.778724369,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.422075 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6b2792d58823 openshift-etcd 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:10.988496931 +0000 UTC m=+5.990085170,LastTimestamp:2026-03-13 13:56:10.988496931 +0000 UTC m=+5.990085170,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.425919 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c6b2793710342 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:10.99868653 +0000 UTC m=+6.000274769,LastTimestamp:2026-03-13 13:56:10.99868653 +0000 UTC m=+6.000274769,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.432099 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 13 13:57:01 crc kubenswrapper[4898]: &Event{ObjectMeta:{kube-controller-manager-crc.189c6b287d7785a9 openshift-kube-controller-manager 
0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 13 13:57:01 crc kubenswrapper[4898]: body: Mar 13 13:57:01 crc kubenswrapper[4898]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:14.924981673 +0000 UTC m=+9.926569952,LastTimestamp:2026-03-13 13:56:14.924981673 +0000 UTC m=+9.926569952,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 13:57:01 crc kubenswrapper[4898]: > Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.435929 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6b287d795e90 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:14.925102736 +0000 UTC m=+9.926691015,LastTimestamp:2026-03-13 13:56:14.925102736 
+0000 UTC m=+9.926691015,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.441169 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 13 13:57:01 crc kubenswrapper[4898]: &Event{ObjectMeta:{kube-apiserver-crc.189c6b29e5776e3c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 13 13:57:01 crc kubenswrapper[4898]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 13 13:57:01 crc kubenswrapper[4898]: Mar 13 13:57:01 crc kubenswrapper[4898]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:20.964773436 +0000 UTC m=+15.966361695,LastTimestamp:2026-03-13 13:56:20.964773436 +0000 UTC m=+15.966361695,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 13:57:01 crc kubenswrapper[4898]: > Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.445021 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6b29e5787021 openshift-kube-apiserver 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:20.964839457 +0000 UTC m=+15.966427716,LastTimestamp:2026-03-13 13:56:20.964839457 +0000 UTC m=+15.966427716,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.449254 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c6b29e5776e3c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 13 13:57:01 crc kubenswrapper[4898]: &Event{ObjectMeta:{kube-apiserver-crc.189c6b29e5776e3c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 13 13:57:01 crc kubenswrapper[4898]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 13 13:57:01 crc kubenswrapper[4898]: Mar 13 13:57:01 crc kubenswrapper[4898]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:20.964773436 +0000 UTC m=+15.966361695,LastTimestamp:2026-03-13 13:56:20.97679212 +0000 UTC 
m=+15.978380369,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 13:57:01 crc kubenswrapper[4898]: > Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.454467 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c6b29e5787021\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6b29e5787021 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:20.964839457 +0000 UTC m=+15.966427716,LastTimestamp:2026-03-13 13:56:20.976852041 +0000 UTC m=+15.978440300,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.456609 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c6b27062e52a7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6b27062e52a7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:08.628728487 +0000 UTC m=+3.630316726,LastTimestamp:2026-03-13 13:56:21.876616881 +0000 UTC m=+16.878205130,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.458491 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c6b271398acd1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6b271398acd1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:08.853802193 +0000 UTC m=+3.855390432,LastTimestamp:2026-03-13 13:56:22.083996564 +0000 UTC m=+17.085584803,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.461500 4898 event.go:359] "Server rejected event (will not retry!)" err="events 
\"kube-apiserver-crc.189c6b27145e2f76\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c6b27145e2f76 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:08.86674623 +0000 UTC m=+3.868334469,LastTimestamp:2026-03-13 13:56:22.09242852 +0000 UTC m=+17.094016759,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.466490 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c6b287d7785a9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 13 13:57:01 crc kubenswrapper[4898]: &Event{ObjectMeta:{kube-controller-manager-crc.189c6b287d7785a9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 13 13:57:01 crc kubenswrapper[4898]: body: Mar 13 13:57:01 crc 
kubenswrapper[4898]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:14.924981673 +0000 UTC m=+9.926569952,LastTimestamp:2026-03-13 13:56:24.926947356 +0000 UTC m=+19.928535635,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 13:57:01 crc kubenswrapper[4898]: > Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.470348 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c6b287d795e90\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6b287d795e90 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:14.925102736 +0000 UTC m=+9.926691015,LastTimestamp:2026-03-13 13:56:24.927043028 +0000 UTC m=+19.928631307,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.476120 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c6b287d7785a9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< 
Mar 13 13:57:01 crc kubenswrapper[4898]: &Event{ObjectMeta:{kube-controller-manager-crc.189c6b287d7785a9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 13 13:57:01 crc kubenswrapper[4898]: body: Mar 13 13:57:01 crc kubenswrapper[4898]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:14.924981673 +0000 UTC m=+9.926569952,LastTimestamp:2026-03-13 13:56:34.925598071 +0000 UTC m=+29.927186310,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 13:57:01 crc kubenswrapper[4898]: > Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.480835 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c6b287d795e90\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6b287d795e90 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded 
while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:14.925102736 +0000 UTC m=+9.926691015,LastTimestamp:2026-03-13 13:56:34.925655033 +0000 UTC m=+29.927243272,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.485379 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6b2d25c427ed openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:34.928445421 +0000 UTC m=+29.930033680,LastTimestamp:2026-03-13 13:56:34.928445421 +0000 UTC m=+29.930033680,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.489930 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c6b269da3aa8a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6b269da3aa8a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:06.874811018 +0000 UTC m=+1.876399287,LastTimestamp:2026-03-13 13:56:35.115195469 +0000 UTC m=+30.116783738,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.493782 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c6b26b020814e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6b26b020814e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:07.18498235 +0000 UTC m=+2.186570589,LastTimestamp:2026-03-13 13:56:35.736330494 +0000 UTC m=+30.737918773,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.498948 4898 event.go:359] "Server rejected event 
(will not retry!)" err="events \"kube-controller-manager-crc.189c6b26b0f08c60\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6b26b0f08c60 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:07.198616672 +0000 UTC m=+2.200204911,LastTimestamp:2026-03-13 13:56:35.981696086 +0000 UTC m=+30.983284365,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.506122 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c6b287d7785a9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 13 13:57:01 crc kubenswrapper[4898]: &Event{ObjectMeta:{kube-controller-manager-crc.189c6b287d7785a9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 
13 13:57:01 crc kubenswrapper[4898]: body: Mar 13 13:57:01 crc kubenswrapper[4898]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:14.924981673 +0000 UTC m=+9.926569952,LastTimestamp:2026-03-13 13:56:44.925695304 +0000 UTC m=+39.927283573,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 13:57:01 crc kubenswrapper[4898]: > Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.512157 4898 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c6b287d795e90\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c6b287d795e90 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:14.925102736 +0000 UTC m=+9.926691015,LastTimestamp:2026-03-13 13:56:44.925830897 +0000 UTC m=+39.927419176,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 13:57:01 crc kubenswrapper[4898]: E0313 13:57:01.517685 4898 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-controller-manager\"" event=< Mar 13 13:57:01 crc kubenswrapper[4898]: &Event{ObjectMeta:{kube-controller-manager-crc.189c6b31cdbc6f23 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 13 13:57:01 crc kubenswrapper[4898]: body: Mar 13 13:57:01 crc kubenswrapper[4898]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 13:56:54.926380835 +0000 UTC m=+49.927969104,LastTimestamp:2026-03-13 13:56:54.926380835 +0000 UTC m=+49.927969104,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 13:57:01 crc kubenswrapper[4898]: > Mar 13 13:57:01 crc kubenswrapper[4898]: I0313 13:57:01.678353 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 13:57:01 crc kubenswrapper[4898]: I0313 13:57:01.738690 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:57:01 crc kubenswrapper[4898]: I0313 13:57:01.740775 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:01 crc kubenswrapper[4898]: I0313 13:57:01.740846 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:01 crc kubenswrapper[4898]: I0313 13:57:01.740867 4898 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:01 crc kubenswrapper[4898]: I0313 13:57:01.741878 4898 scope.go:117] "RemoveContainer" containerID="f9e87fbf50c4eff43d663444f50a95a72aca9900fc77a3d96f9c7cc7f66196c5" Mar 13 13:57:02 crc kubenswrapper[4898]: I0313 13:57:02.043282 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 13 13:57:02 crc kubenswrapper[4898]: I0313 13:57:02.046561 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3"} Mar 13 13:57:02 crc kubenswrapper[4898]: I0313 13:57:02.046739 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:57:02 crc kubenswrapper[4898]: I0313 13:57:02.047740 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:02 crc kubenswrapper[4898]: I0313 13:57:02.047772 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:02 crc kubenswrapper[4898]: I0313 13:57:02.047781 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:02 crc kubenswrapper[4898]: E0313 13:57:02.374452 4898 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 13 13:57:02 crc kubenswrapper[4898]: I0313 13:57:02.381406 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Mar 13 13:57:02 crc kubenswrapper[4898]: I0313 13:57:02.383408 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:02 crc kubenswrapper[4898]: I0313 13:57:02.383549 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:02 crc kubenswrapper[4898]: I0313 13:57:02.383639 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:02 crc kubenswrapper[4898]: I0313 13:57:02.383820 4898 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 13:57:02 crc kubenswrapper[4898]: E0313 13:57:02.391950 4898 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 13 13:57:02 crc kubenswrapper[4898]: I0313 13:57:02.679758 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 13:57:03 crc kubenswrapper[4898]: I0313 13:57:03.051686 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 13 13:57:03 crc kubenswrapper[4898]: I0313 13:57:03.052465 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 13 13:57:03 crc kubenswrapper[4898]: I0313 13:57:03.055317 4898 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3" exitCode=255 Mar 13 
13:57:03 crc kubenswrapper[4898]: I0313 13:57:03.055357 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3"} Mar 13 13:57:03 crc kubenswrapper[4898]: I0313 13:57:03.055426 4898 scope.go:117] "RemoveContainer" containerID="f9e87fbf50c4eff43d663444f50a95a72aca9900fc77a3d96f9c7cc7f66196c5" Mar 13 13:57:03 crc kubenswrapper[4898]: I0313 13:57:03.055637 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:57:03 crc kubenswrapper[4898]: I0313 13:57:03.056957 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:03 crc kubenswrapper[4898]: I0313 13:57:03.057142 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:03 crc kubenswrapper[4898]: I0313 13:57:03.057270 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:03 crc kubenswrapper[4898]: I0313 13:57:03.060408 4898 scope.go:117] "RemoveContainer" containerID="a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3" Mar 13 13:57:03 crc kubenswrapper[4898]: E0313 13:57:03.060995 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 13:57:03 crc kubenswrapper[4898]: I0313 13:57:03.679006 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is 
forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 13:57:04 crc kubenswrapper[4898]: I0313 13:57:04.061626 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 13 13:57:04 crc kubenswrapper[4898]: I0313 13:57:04.677101 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 13:57:04 crc kubenswrapper[4898]: I0313 13:57:04.925797 4898 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 13:57:04 crc kubenswrapper[4898]: I0313 13:57:04.925941 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 13:57:04 crc kubenswrapper[4898]: I0313 13:57:04.926034 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 13:57:04 crc kubenswrapper[4898]: I0313 13:57:04.926283 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:57:04 crc kubenswrapper[4898]: I0313 13:57:04.927842 4898 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:04 crc kubenswrapper[4898]: I0313 13:57:04.927925 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:04 crc kubenswrapper[4898]: I0313 13:57:04.927941 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:04 crc kubenswrapper[4898]: I0313 13:57:04.928551 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"ac7ea5df99bb3004d7582910739297c1ee1913de5399a5be9391bbf4c96be32f"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 13 13:57:04 crc kubenswrapper[4898]: I0313 13:57:04.928670 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://ac7ea5df99bb3004d7582910739297c1ee1913de5399a5be9391bbf4c96be32f" gracePeriod=30 Mar 13 13:57:04 crc kubenswrapper[4898]: I0313 13:57:04.971774 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 13:57:04 crc kubenswrapper[4898]: I0313 13:57:04.972039 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:57:04 crc kubenswrapper[4898]: I0313 13:57:04.973541 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:04 crc kubenswrapper[4898]: I0313 13:57:04.973590 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:04 crc kubenswrapper[4898]: I0313 
13:57:04.973601 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:04 crc kubenswrapper[4898]: I0313 13:57:04.974242 4898 scope.go:117] "RemoveContainer" containerID="a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3" Mar 13 13:57:04 crc kubenswrapper[4898]: E0313 13:57:04.974472 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 13:57:05 crc kubenswrapper[4898]: I0313 13:57:05.072632 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 13 13:57:05 crc kubenswrapper[4898]: I0313 13:57:05.075491 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 13 13:57:05 crc kubenswrapper[4898]: I0313 13:57:05.076701 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"ac7ea5df99bb3004d7582910739297c1ee1913de5399a5be9391bbf4c96be32f"} Mar 13 13:57:05 crc kubenswrapper[4898]: I0313 13:57:05.076718 4898 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="ac7ea5df99bb3004d7582910739297c1ee1913de5399a5be9391bbf4c96be32f" exitCode=255 Mar 13 13:57:05 crc kubenswrapper[4898]: I0313 13:57:05.076840 4898 scope.go:117] "RemoveContainer" 
containerID="11d7141d32caadeaeb1d436b8ea9c8ad8a7002577c20befa44b7c64ad366bee3" Mar 13 13:57:05 crc kubenswrapper[4898]: I0313 13:57:05.677303 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 13:57:05 crc kubenswrapper[4898]: E0313 13:57:05.843844 4898 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 13:57:06 crc kubenswrapper[4898]: I0313 13:57:06.085697 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 13 13:57:06 crc kubenswrapper[4898]: I0313 13:57:06.087723 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"00526d41ebb304bac57e24a91007919427bb623e8cbce6cc25d7b1a5195871a9"} Mar 13 13:57:06 crc kubenswrapper[4898]: I0313 13:57:06.087871 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:57:06 crc kubenswrapper[4898]: I0313 13:57:06.089395 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:06 crc kubenswrapper[4898]: I0313 13:57:06.089454 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:06 crc kubenswrapper[4898]: I0313 13:57:06.089477 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:06 crc kubenswrapper[4898]: I0313 13:57:06.399134 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 13:57:06 crc kubenswrapper[4898]: I0313 13:57:06.399447 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:57:06 crc kubenswrapper[4898]: I0313 13:57:06.401478 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:06 crc kubenswrapper[4898]: I0313 13:57:06.401537 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:06 crc kubenswrapper[4898]: I0313 13:57:06.401553 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:06 crc kubenswrapper[4898]: I0313 13:57:06.402340 4898 scope.go:117] "RemoveContainer" containerID="a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3" Mar 13 13:57:06 crc kubenswrapper[4898]: E0313 13:57:06.402664 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 13:57:06 crc kubenswrapper[4898]: I0313 13:57:06.678531 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 13:57:07 crc kubenswrapper[4898]: I0313 13:57:07.090834 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:57:07 crc kubenswrapper[4898]: I0313 13:57:07.092225 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 13 13:57:07 crc kubenswrapper[4898]: I0313 13:57:07.092270 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:07 crc kubenswrapper[4898]: I0313 13:57:07.092285 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:07 crc kubenswrapper[4898]: I0313 13:57:07.678267 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 13:57:08 crc kubenswrapper[4898]: I0313 13:57:08.677213 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 13:57:09 crc kubenswrapper[4898]: E0313 13:57:09.382722 4898 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 13 13:57:09 crc kubenswrapper[4898]: I0313 13:57:09.392884 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:57:09 crc kubenswrapper[4898]: I0313 13:57:09.394632 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:09 crc kubenswrapper[4898]: I0313 13:57:09.394696 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:09 crc kubenswrapper[4898]: I0313 13:57:09.394715 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 
13:57:09 crc kubenswrapper[4898]: I0313 13:57:09.394762 4898 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 13:57:09 crc kubenswrapper[4898]: E0313 13:57:09.402990 4898 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 13 13:57:09 crc kubenswrapper[4898]: I0313 13:57:09.678230 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 13:57:10 crc kubenswrapper[4898]: I0313 13:57:10.676642 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 13:57:11 crc kubenswrapper[4898]: I0313 13:57:11.673002 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 13:57:11 crc kubenswrapper[4898]: I0313 13:57:11.924860 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 13:57:11 crc kubenswrapper[4898]: I0313 13:57:11.925067 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:57:11 crc kubenswrapper[4898]: I0313 13:57:11.926345 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:11 crc kubenswrapper[4898]: I0313 13:57:11.926400 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 13:57:11 crc kubenswrapper[4898]: I0313 13:57:11.926422 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:11 crc kubenswrapper[4898]: I0313 13:57:11.929378 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 13:57:12 crc kubenswrapper[4898]: I0313 13:57:12.104769 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:57:12 crc kubenswrapper[4898]: I0313 13:57:12.104875 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 13:57:12 crc kubenswrapper[4898]: I0313 13:57:12.105831 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:12 crc kubenswrapper[4898]: I0313 13:57:12.105891 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:12 crc kubenswrapper[4898]: I0313 13:57:12.105928 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:12 crc kubenswrapper[4898]: I0313 13:57:12.676566 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 13:57:13 crc kubenswrapper[4898]: I0313 13:57:13.107088 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:57:13 crc kubenswrapper[4898]: I0313 13:57:13.108098 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:13 crc kubenswrapper[4898]: I0313 13:57:13.108140 4898 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:13 crc kubenswrapper[4898]: I0313 13:57:13.108156 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:13 crc kubenswrapper[4898]: I0313 13:57:13.676582 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 13:57:14 crc kubenswrapper[4898]: I0313 13:57:14.676175 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 13:57:15 crc kubenswrapper[4898]: I0313 13:57:15.678535 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 13:57:15 crc kubenswrapper[4898]: E0313 13:57:15.845037 4898 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 13:57:16 crc kubenswrapper[4898]: E0313 13:57:16.388026 4898 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 13 13:57:16 crc kubenswrapper[4898]: I0313 13:57:16.403326 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:57:16 crc kubenswrapper[4898]: I0313 13:57:16.404871 4898 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:16 crc kubenswrapper[4898]: I0313 13:57:16.405036 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:16 crc kubenswrapper[4898]: I0313 13:57:16.405131 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:16 crc kubenswrapper[4898]: I0313 13:57:16.405262 4898 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 13:57:16 crc kubenswrapper[4898]: E0313 13:57:16.411805 4898 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 13 13:57:16 crc kubenswrapper[4898]: I0313 13:57:16.678097 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 13:57:17 crc kubenswrapper[4898]: I0313 13:57:17.679150 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 13:57:17 crc kubenswrapper[4898]: I0313 13:57:17.738646 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:57:17 crc kubenswrapper[4898]: I0313 13:57:17.740131 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:17 crc kubenswrapper[4898]: I0313 13:57:17.740519 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:17 crc kubenswrapper[4898]: I0313 13:57:17.740625 4898 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:17 crc kubenswrapper[4898]: I0313 13:57:17.741445 4898 scope.go:117] "RemoveContainer" containerID="a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3" Mar 13 13:57:17 crc kubenswrapper[4898]: E0313 13:57:17.741738 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 13:57:18 crc kubenswrapper[4898]: I0313 13:57:18.612618 4898 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 13 13:57:18 crc kubenswrapper[4898]: I0313 13:57:18.630449 4898 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 13 13:57:18 crc kubenswrapper[4898]: I0313 13:57:18.676855 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 13:57:19 crc kubenswrapper[4898]: I0313 13:57:19.681156 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 13:57:20 crc kubenswrapper[4898]: I0313 13:57:20.679368 4898 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the 
cluster scope Mar 13 13:57:20 crc kubenswrapper[4898]: I0313 13:57:20.739574 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:57:20 crc kubenswrapper[4898]: I0313 13:57:20.741362 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:20 crc kubenswrapper[4898]: I0313 13:57:20.741419 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:20 crc kubenswrapper[4898]: I0313 13:57:20.741483 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:21 crc kubenswrapper[4898]: I0313 13:57:21.225149 4898 csr.go:261] certificate signing request csr-lwx2n is approved, waiting to be issued Mar 13 13:57:21 crc kubenswrapper[4898]: I0313 13:57:21.234825 4898 csr.go:257] certificate signing request csr-lwx2n is issued Mar 13 13:57:21 crc kubenswrapper[4898]: I0313 13:57:21.302056 4898 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 13 13:57:21 crc kubenswrapper[4898]: I0313 13:57:21.528498 4898 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 13 13:57:22 crc kubenswrapper[4898]: I0313 13:57:22.237001 4898 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-05 00:25:13.610928412 +0000 UTC Mar 13 13:57:22 crc kubenswrapper[4898]: I0313 13:57:22.237082 4898 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7138h27m51.37385236s for next certificate rotation Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.250658 4898 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.412403 4898 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.415859 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.416002 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.416068 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.416495 4898 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.432682 4898 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.433155 4898 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 13 13:57:23 crc kubenswrapper[4898]: E0313 13:57:23.433178 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.439405 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.439442 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.439452 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.439471 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.439485 4898 setters.go:603] "Node became 
not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:23Z","lastTransitionTime":"2026-03-13T13:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:23 crc kubenswrapper[4898]: E0313 13:57:23.458003 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.463741 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.463782 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.463797 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.463818 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.463834 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:23Z","lastTransitionTime":"2026-03-13T13:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:23 crc kubenswrapper[4898]: E0313 13:57:23.480491 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.486247 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.486286 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.486297 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.486315 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.486326 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:23Z","lastTransitionTime":"2026-03-13T13:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:23 crc kubenswrapper[4898]: E0313 13:57:23.500539 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.506089 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.506134 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.506148 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.506198 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.506212 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:23Z","lastTransitionTime":"2026-03-13T13:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:23 crc kubenswrapper[4898]: E0313 13:57:23.520448 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 13:57:23 crc kubenswrapper[4898]: E0313 13:57:23.520683 4898 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 13:57:23 crc kubenswrapper[4898]: E0313 13:57:23.520738 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.534721 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.535022 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.536552 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.536609 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:23 crc kubenswrapper[4898]: I0313 13:57:23.536635 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:23 crc kubenswrapper[4898]: E0313 13:57:23.621287 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:23 crc kubenswrapper[4898]: E0313 13:57:23.722015 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:23 crc kubenswrapper[4898]: E0313 13:57:23.822402 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:23 crc 
kubenswrapper[4898]: E0313 13:57:23.923453 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:24 crc kubenswrapper[4898]: E0313 13:57:24.024088 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:24 crc kubenswrapper[4898]: E0313 13:57:24.125164 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:24 crc kubenswrapper[4898]: E0313 13:57:24.226152 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:24 crc kubenswrapper[4898]: E0313 13:57:24.326947 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:24 crc kubenswrapper[4898]: E0313 13:57:24.427816 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:24 crc kubenswrapper[4898]: E0313 13:57:24.528516 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:24 crc kubenswrapper[4898]: E0313 13:57:24.628817 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:24 crc kubenswrapper[4898]: E0313 13:57:24.729885 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:24 crc kubenswrapper[4898]: E0313 13:57:24.830810 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:24 crc kubenswrapper[4898]: E0313 13:57:24.932007 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:25 crc kubenswrapper[4898]: E0313 13:57:25.033144 4898 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 13 13:57:25 crc kubenswrapper[4898]: E0313 13:57:25.134019 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:25 crc kubenswrapper[4898]: E0313 13:57:25.234411 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:25 crc kubenswrapper[4898]: E0313 13:57:25.335419 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:25 crc kubenswrapper[4898]: E0313 13:57:25.435580 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:25 crc kubenswrapper[4898]: E0313 13:57:25.536985 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:25 crc kubenswrapper[4898]: E0313 13:57:25.637939 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:25 crc kubenswrapper[4898]: E0313 13:57:25.739059 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:25 crc kubenswrapper[4898]: E0313 13:57:25.839968 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:25 crc kubenswrapper[4898]: E0313 13:57:25.846136 4898 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 13:57:25 crc kubenswrapper[4898]: E0313 13:57:25.940994 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:26 crc kubenswrapper[4898]: E0313 13:57:26.041853 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:26 crc kubenswrapper[4898]: E0313 13:57:26.142215 4898 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:26 crc kubenswrapper[4898]: E0313 13:57:26.243277 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:26 crc kubenswrapper[4898]: E0313 13:57:26.344322 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:26 crc kubenswrapper[4898]: E0313 13:57:26.445396 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:26 crc kubenswrapper[4898]: E0313 13:57:26.546294 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:26 crc kubenswrapper[4898]: E0313 13:57:26.646861 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:26 crc kubenswrapper[4898]: E0313 13:57:26.748042 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:26 crc kubenswrapper[4898]: E0313 13:57:26.848528 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:26 crc kubenswrapper[4898]: E0313 13:57:26.949521 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:27 crc kubenswrapper[4898]: E0313 13:57:27.051474 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:27 crc kubenswrapper[4898]: E0313 13:57:27.152062 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:27 crc kubenswrapper[4898]: E0313 13:57:27.252852 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:27 crc 
kubenswrapper[4898]: E0313 13:57:27.353741 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:27 crc kubenswrapper[4898]: E0313 13:57:27.454056 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:27 crc kubenswrapper[4898]: E0313 13:57:27.554947 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:27 crc kubenswrapper[4898]: E0313 13:57:27.656045 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:27 crc kubenswrapper[4898]: E0313 13:57:27.756457 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:27 crc kubenswrapper[4898]: E0313 13:57:27.857590 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:27 crc kubenswrapper[4898]: E0313 13:57:27.957766 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:28 crc kubenswrapper[4898]: E0313 13:57:28.058817 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:28 crc kubenswrapper[4898]: E0313 13:57:28.159754 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:28 crc kubenswrapper[4898]: E0313 13:57:28.260090 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:28 crc kubenswrapper[4898]: E0313 13:57:28.360296 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:28 crc kubenswrapper[4898]: E0313 13:57:28.461158 4898 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 13 13:57:28 crc kubenswrapper[4898]: E0313 13:57:28.561422 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:28 crc kubenswrapper[4898]: E0313 13:57:28.662338 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:28 crc kubenswrapper[4898]: E0313 13:57:28.763561 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:28 crc kubenswrapper[4898]: E0313 13:57:28.864433 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:28 crc kubenswrapper[4898]: E0313 13:57:28.965528 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:29 crc kubenswrapper[4898]: E0313 13:57:29.066576 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:29 crc kubenswrapper[4898]: E0313 13:57:29.167474 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:29 crc kubenswrapper[4898]: E0313 13:57:29.267996 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:29 crc kubenswrapper[4898]: E0313 13:57:29.369005 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:29 crc kubenswrapper[4898]: E0313 13:57:29.470955 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:29 crc kubenswrapper[4898]: E0313 13:57:29.571481 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:29 crc kubenswrapper[4898]: E0313 13:57:29.672622 4898 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:29 crc kubenswrapper[4898]: I0313 13:57:29.739448 4898 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 13:57:29 crc kubenswrapper[4898]: I0313 13:57:29.741563 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:29 crc kubenswrapper[4898]: I0313 13:57:29.741816 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:29 crc kubenswrapper[4898]: I0313 13:57:29.742119 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:29 crc kubenswrapper[4898]: I0313 13:57:29.743357 4898 scope.go:117] "RemoveContainer" containerID="a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3" Mar 13 13:57:29 crc kubenswrapper[4898]: E0313 13:57:29.743999 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 13:57:29 crc kubenswrapper[4898]: E0313 13:57:29.773201 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:29 crc kubenswrapper[4898]: E0313 13:57:29.874588 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:29 crc kubenswrapper[4898]: E0313 13:57:29.975567 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:30 crc kubenswrapper[4898]: E0313 13:57:30.075713 4898 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:30 crc kubenswrapper[4898]: E0313 13:57:30.176513 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:30 crc kubenswrapper[4898]: E0313 13:57:30.278021 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:30 crc kubenswrapper[4898]: E0313 13:57:30.379434 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:30 crc kubenswrapper[4898]: E0313 13:57:30.480196 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:30 crc kubenswrapper[4898]: E0313 13:57:30.580695 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:30 crc kubenswrapper[4898]: E0313 13:57:30.680950 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:30 crc kubenswrapper[4898]: E0313 13:57:30.781738 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:30 crc kubenswrapper[4898]: E0313 13:57:30.882486 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:30 crc kubenswrapper[4898]: E0313 13:57:30.983460 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:31 crc kubenswrapper[4898]: E0313 13:57:31.083619 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:31 crc kubenswrapper[4898]: E0313 13:57:31.183778 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:31 crc 
kubenswrapper[4898]: E0313 13:57:31.284618 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:31 crc kubenswrapper[4898]: E0313 13:57:31.385539 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:31 crc kubenswrapper[4898]: E0313 13:57:31.485732 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:31 crc kubenswrapper[4898]: E0313 13:57:31.586776 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:31 crc kubenswrapper[4898]: E0313 13:57:31.687760 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:31 crc kubenswrapper[4898]: E0313 13:57:31.787991 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:31 crc kubenswrapper[4898]: E0313 13:57:31.888967 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:31 crc kubenswrapper[4898]: E0313 13:57:31.990208 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:32 crc kubenswrapper[4898]: E0313 13:57:32.091503 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:32 crc kubenswrapper[4898]: E0313 13:57:32.192699 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:32 crc kubenswrapper[4898]: E0313 13:57:32.292984 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:32 crc kubenswrapper[4898]: E0313 13:57:32.394146 4898 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 13 13:57:32 crc kubenswrapper[4898]: E0313 13:57:32.494960 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:32 crc kubenswrapper[4898]: E0313 13:57:32.595602 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:32 crc kubenswrapper[4898]: E0313 13:57:32.697066 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:32 crc kubenswrapper[4898]: E0313 13:57:32.798474 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:32 crc kubenswrapper[4898]: E0313 13:57:32.898614 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:32 crc kubenswrapper[4898]: E0313 13:57:32.999674 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:33 crc kubenswrapper[4898]: E0313 13:57:33.100664 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:33 crc kubenswrapper[4898]: E0313 13:57:33.201435 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:33 crc kubenswrapper[4898]: E0313 13:57:33.301678 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:33 crc kubenswrapper[4898]: E0313 13:57:33.402811 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:33 crc kubenswrapper[4898]: E0313 13:57:33.502932 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:33 crc kubenswrapper[4898]: E0313 13:57:33.578457 4898 kubelet_node_status.go:585] 
"Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 13 13:57:33 crc kubenswrapper[4898]: I0313 13:57:33.583032 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:33 crc kubenswrapper[4898]: I0313 13:57:33.583076 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:33 crc kubenswrapper[4898]: I0313 13:57:33.583087 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:33 crc kubenswrapper[4898]: I0313 13:57:33.583107 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:33 crc kubenswrapper[4898]: I0313 13:57:33.583119 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:33Z","lastTransitionTime":"2026-03-13T13:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:33 crc kubenswrapper[4898]: E0313 13:57:33.593942 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 13:57:33 crc kubenswrapper[4898]: I0313 13:57:33.597032 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:33 crc kubenswrapper[4898]: I0313 13:57:33.597071 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:33 crc kubenswrapper[4898]: I0313 13:57:33.597086 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:33 crc kubenswrapper[4898]: I0313 13:57:33.597105 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:33 crc kubenswrapper[4898]: I0313 13:57:33.597124 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:33Z","lastTransitionTime":"2026-03-13T13:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:33 crc kubenswrapper[4898]: E0313 13:57:33.607878 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 13:57:33 crc kubenswrapper[4898]: I0313 13:57:33.611952 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:33 crc kubenswrapper[4898]: I0313 13:57:33.612247 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:33 crc kubenswrapper[4898]: I0313 13:57:33.612465 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:33 crc kubenswrapper[4898]: I0313 13:57:33.612632 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:33 crc kubenswrapper[4898]: I0313 13:57:33.612792 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:33Z","lastTransitionTime":"2026-03-13T13:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:33 crc kubenswrapper[4898]: E0313 13:57:33.625274 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 13:57:33 crc kubenswrapper[4898]: I0313 13:57:33.630286 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:33 crc kubenswrapper[4898]: I0313 13:57:33.630328 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:33 crc kubenswrapper[4898]: I0313 13:57:33.630345 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:33 crc kubenswrapper[4898]: I0313 13:57:33.630367 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:33 crc kubenswrapper[4898]: I0313 13:57:33.630381 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:33Z","lastTransitionTime":"2026-03-13T13:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:33 crc kubenswrapper[4898]: E0313 13:57:33.641167 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 13:57:33 crc kubenswrapper[4898]: E0313 13:57:33.641287 4898 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 13:57:33 crc kubenswrapper[4898]: E0313 13:57:33.641319 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:33 crc kubenswrapper[4898]: E0313 13:57:33.742085 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:33 crc kubenswrapper[4898]: E0313 13:57:33.843141 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:33 crc kubenswrapper[4898]: E0313 13:57:33.943282 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:34 crc kubenswrapper[4898]: E0313 13:57:34.044091 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:34 crc kubenswrapper[4898]: E0313 13:57:34.145085 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:34 crc kubenswrapper[4898]: E0313 13:57:34.245829 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:34 crc kubenswrapper[4898]: E0313 13:57:34.347720 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:34 crc kubenswrapper[4898]: E0313 13:57:34.448444 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:34 crc kubenswrapper[4898]: E0313 13:57:34.549711 4898 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:34 crc kubenswrapper[4898]: E0313 13:57:34.650788 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:34 crc kubenswrapper[4898]: E0313 13:57:34.751652 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:34 crc kubenswrapper[4898]: E0313 13:57:34.853139 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:34 crc kubenswrapper[4898]: E0313 13:57:34.953992 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:35 crc kubenswrapper[4898]: E0313 13:57:35.054416 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:35 crc kubenswrapper[4898]: E0313 13:57:35.154981 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:35 crc kubenswrapper[4898]: E0313 13:57:35.255602 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:35 crc kubenswrapper[4898]: E0313 13:57:35.356542 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:35 crc kubenswrapper[4898]: E0313 13:57:35.457566 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:35 crc kubenswrapper[4898]: E0313 13:57:35.557708 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:35 crc kubenswrapper[4898]: E0313 13:57:35.658134 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:35 crc 
kubenswrapper[4898]: E0313 13:57:35.758406 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:35 crc kubenswrapper[4898]: E0313 13:57:35.847271 4898 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 13:57:35 crc kubenswrapper[4898]: E0313 13:57:35.858559 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:35 crc kubenswrapper[4898]: E0313 13:57:35.958720 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:36 crc kubenswrapper[4898]: E0313 13:57:36.058886 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:36 crc kubenswrapper[4898]: E0313 13:57:36.160127 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:36 crc kubenswrapper[4898]: E0313 13:57:36.261511 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:36 crc kubenswrapper[4898]: E0313 13:57:36.362685 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:36 crc kubenswrapper[4898]: E0313 13:57:36.463021 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:36 crc kubenswrapper[4898]: E0313 13:57:36.563645 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:36 crc kubenswrapper[4898]: E0313 13:57:36.664254 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:36 crc kubenswrapper[4898]: E0313 13:57:36.765402 4898 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 13 13:57:36 crc kubenswrapper[4898]: E0313 13:57:36.866010 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:36 crc kubenswrapper[4898]: E0313 13:57:36.967125 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:37 crc kubenswrapper[4898]: E0313 13:57:37.068192 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:37 crc kubenswrapper[4898]: E0313 13:57:37.168670 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:37 crc kubenswrapper[4898]: E0313 13:57:37.268958 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:37 crc kubenswrapper[4898]: E0313 13:57:37.370041 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:37 crc kubenswrapper[4898]: E0313 13:57:37.471028 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:37 crc kubenswrapper[4898]: E0313 13:57:37.572355 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:37 crc kubenswrapper[4898]: E0313 13:57:37.673575 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:37 crc kubenswrapper[4898]: E0313 13:57:37.774177 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:37 crc kubenswrapper[4898]: E0313 13:57:37.875332 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:37 crc kubenswrapper[4898]: E0313 13:57:37.975461 4898 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:38 crc kubenswrapper[4898]: E0313 13:57:38.076440 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:38 crc kubenswrapper[4898]: E0313 13:57:38.177430 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:38 crc kubenswrapper[4898]: E0313 13:57:38.277547 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:38 crc kubenswrapper[4898]: E0313 13:57:38.378304 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:38 crc kubenswrapper[4898]: E0313 13:57:38.479248 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:38 crc kubenswrapper[4898]: E0313 13:57:38.579599 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:38 crc kubenswrapper[4898]: E0313 13:57:38.680141 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:38 crc kubenswrapper[4898]: E0313 13:57:38.780528 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:38 crc kubenswrapper[4898]: E0313 13:57:38.881414 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:38 crc kubenswrapper[4898]: E0313 13:57:38.982292 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:39 crc kubenswrapper[4898]: E0313 13:57:39.083434 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:39 crc 
kubenswrapper[4898]: E0313 13:57:39.183927 4898 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.237448 4898 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.286716 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.286761 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.286774 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.286793 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.286807 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:39Z","lastTransitionTime":"2026-03-13T13:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.389378 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.389428 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.389441 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.389456 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.389466 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:39Z","lastTransitionTime":"2026-03-13T13:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.491790 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.491877 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.491890 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.491928 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.491940 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:39Z","lastTransitionTime":"2026-03-13T13:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.594625 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.594680 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.594701 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.594727 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.594747 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:39Z","lastTransitionTime":"2026-03-13T13:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.697979 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.698034 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.698053 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.698077 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.698096 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:39Z","lastTransitionTime":"2026-03-13T13:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.711239 4898 apiserver.go:52] "Watching apiserver" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.719064 4898 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.719509 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"] Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.720040 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.720066 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 13:57:39 crc kubenswrapper[4898]: E0313 13:57:39.720147 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.720107 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.720357 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 13:57:39 crc kubenswrapper[4898]: E0313 13:57:39.720404 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.720611 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.720985 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:39 crc kubenswrapper[4898]: E0313 13:57:39.721048 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.724225 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.724511 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.726060 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.726385 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.726529 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.727223 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.727388 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.729944 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.730070 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.758033 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.775860 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.776197 4898 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.791255 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.800696 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.800780 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.800811 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.800824 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.800841 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.800854 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:39Z","lastTransitionTime":"2026-03-13T13:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.812355 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.822865 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.837572 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.837619 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.837635 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.837659 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.837679 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.837712 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.837735 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.837754 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.837772 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.837792 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.837816 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.837834 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.837852 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.837871 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.837892 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.837929 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.837950 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.837975 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.837995 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838015 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838037 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838058 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838078 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838099 4898 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838119 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838140 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838158 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838178 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838197 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838221 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838244 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838265 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838315 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838338 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838359 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838378 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838396 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838416 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838443 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838461 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 13 13:57:39 crc 
kubenswrapper[4898]: I0313 13:57:39.838480 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838502 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838523 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838543 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838564 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838589 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: 
\"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838612 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838632 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838651 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838669 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838690 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 13:57:39 crc 
kubenswrapper[4898]: I0313 13:57:39.838679 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838710 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838796 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838823 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838860 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838806 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838881 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838948 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.838976 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839007 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839039 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: 
\"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839073 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839112 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839138 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839143 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839161 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839189 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839211 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839236 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839259 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839281 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839302 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839324 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839352 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839377 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839409 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 13 13:57:39 crc kubenswrapper[4898]: 
I0313 13:57:39.839431 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839452 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839482 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839505 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839526 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839548 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839579 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839600 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839620 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839642 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839802 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839943 4898 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839974 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839997 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.840019 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.840041 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.840100 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.840124 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.840150 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.840182 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.840216 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.840250 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 13 13:57:39 
crc kubenswrapper[4898]: I0313 13:57:39.841053 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.841091 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.841116 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.841140 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.841163 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.841189 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.841211 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.841232 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.841258 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.841281 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.841303 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 13:57:39 crc kubenswrapper[4898]: 
I0313 13:57:39.841333 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839155 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.841411 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839360 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839524 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839662 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839690 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.839735 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.840102 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.840160 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.840222 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.840283 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.840547 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.840583 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.840644 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.840998 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.841114 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.841170 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.841332 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.841365 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.841488 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.841834 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.842481 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.841369 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.842740 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.842784 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.842833 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.842890 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.842941 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.842948 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.842983 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.843039 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.843093 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.843103 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.843166 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.843551 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.843809 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.843823 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.843894 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.843992 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844049 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844095 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844111 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844167 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844221 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844274 4898 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844332 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844387 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844440 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844485 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844529 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod 
\"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844572 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844617 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844654 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844692 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844729 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 13 13:57:39 crc 
kubenswrapper[4898]: I0313 13:57:39.844767 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844805 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844847 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844882 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844951 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844992 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: 
\"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844115 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844365 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844334 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844419 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844482 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844541 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844734 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844762 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.844982 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: E0313 13:57:39.845044 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:57:40.345008753 +0000 UTC m=+95.346597202 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848077 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848118 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: 
\"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848148 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848172 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848191 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848208 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848228 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848247 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848270 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848295 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848319 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848342 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848368 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848391 4898 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848420 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848445 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848473 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848496 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848521 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848545 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848565 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848590 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848613 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848635 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848658 4898 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848687 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848708 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848730 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848753 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848774 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848805 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848829 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848853 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848877 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848916 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848943 4898 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848969 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848990 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849014 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849038 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849061 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod 
\"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849087 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849114 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849140 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849170 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849195 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849218 4898 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849244 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849271 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849297 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849324 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849350 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod 
\"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849380 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849404 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849429 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849457 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849479 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849503 4898 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849570 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849606 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849635 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849665 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " 
pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849692 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849721 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849752 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849780 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849808 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849833 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849866 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849910 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849942 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849968 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850047 4898 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850062 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850075 4898 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850089 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850104 4898 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850117 4898 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850129 4898 
reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850141 4898 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850156 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850168 4898 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850181 4898 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850194 4898 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850208 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850222 4898 reconciler_common.go:293] "Volume detached for volume 
\"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850234 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850253 4898 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850267 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850298 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850312 4898 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850324 4898 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850338 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850351 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850363 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850376 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850389 4898 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850402 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850414 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850425 4898 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850438 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850450 4898 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850463 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850475 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850487 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850500 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850516 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 
13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850528 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850539 4898 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850552 4898 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850565 4898 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850577 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850594 4898 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850606 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.852116 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848195 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848206 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.845092 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.845348 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.846084 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.846422 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.846420 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.854653 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.846567 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.846701 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.846846 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.846872 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.846402 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.846992 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.847126 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.847191 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.847242 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.847336 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.847390 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.847455 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848346 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848462 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.848861 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849087 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849355 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849348 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849691 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849724 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849750 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849918 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.849922 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850058 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850054 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850140 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850137 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850233 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850631 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850639 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850840 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.850875 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.851160 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.851372 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.851656 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.851678 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.851604 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.851711 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.851816 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.851877 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.851907 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.852101 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.852129 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.852142 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.852016 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.852157 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.852158 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.852225 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.852361 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.852502 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.852531 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.852647 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.853152 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.853175 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.853197 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.853408 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.853450 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.853471 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.853441 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.845052 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.853537 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.853745 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.853772 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.853774 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.854115 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.854159 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.854193 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.854571 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.855202 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.855332 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.855359 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.855598 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.855689 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.855724 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.855865 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.856100 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.856349 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.856557 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.856644 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.856873 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.857253 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.857352 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.857487 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.857620 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.857628 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.857640 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: E0313 13:57:39.857680 4898 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.858040 4898 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 13 13:57:39 crc kubenswrapper[4898]: E0313 13:57:39.858220 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 13:57:40.35811483 +0000 UTC m=+95.359703159 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 13:57:39 crc kubenswrapper[4898]: E0313 13:57:39.858353 4898 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 13:57:39 crc kubenswrapper[4898]: E0313 13:57:39.858456 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 13:57:40.358428777 +0000 UTC m=+95.360017236 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.858586 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.858044 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.858783 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.859282 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.859352 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.860315 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.860372 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.860682 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.860788 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.861005 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.861370 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.863626 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.864146 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.865316 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.866487 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.866589 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.866665 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.866746 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.866982 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.867171 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: E0313 13:57:39.872979 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 13:57:39 crc kubenswrapper[4898]: E0313 13:57:39.873028 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 13:57:39 crc kubenswrapper[4898]: E0313 13:57:39.873050 4898 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:39 crc kubenswrapper[4898]: E0313 13:57:39.873143 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 13:57:40.373113569 +0000 UTC m=+95.374702018 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.876948 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.879579 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 13:57:39 crc kubenswrapper[4898]: E0313 13:57:39.881096 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 13:57:39 crc kubenswrapper[4898]: E0313 13:57:39.881129 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 13:57:39 crc kubenswrapper[4898]: E0313 13:57:39.881143 4898 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:39 crc kubenswrapper[4898]: E0313 13:57:39.881211 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 13:57:40.381192202 +0000 UTC m=+95.382780441 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.883284 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.883535 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.883724 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.884999 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.886245 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.886943 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.888392 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.892255 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.892391 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.892847 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.893168 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.893472 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.894188 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.894475 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.896490 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.896911 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.898105 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.898146 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.898212 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.898493 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.898892 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.897439 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.899012 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.899040 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.899148 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.899176 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.899259 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.899335 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.899208 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.900434 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.900476 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.900661 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.900703 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.900745 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.902735 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.903093 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.903188 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.904399 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.904441 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.904456 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.904476 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.904488 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:39Z","lastTransitionTime":"2026-03-13T13:57:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.904818 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.921596 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.924009 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.930439 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.930540 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.951661 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.951761 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.951836 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.951866 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.951921 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.951940 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: 
\"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.951950 4898 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.951960 4898 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.951972 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.951982 4898 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.951991 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952001 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.951992 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952010 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952075 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952086 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952097 4898 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952106 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952116 4898 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952126 4898 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952135 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952146 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952155 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952165 4898 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952175 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952185 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952195 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node 
\"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952207 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952256 4898 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952271 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952284 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952296 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952308 4898 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952320 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 
13:57:39.952332 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952341 4898 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952352 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952363 4898 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952373 4898 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952384 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952394 4898 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952406 
4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952415 4898 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952425 4898 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952436 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952448 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952457 4898 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952466 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952475 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952485 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952495 4898 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952504 4898 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952513 4898 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952524 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952533 4898 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952542 4898 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 
crc kubenswrapper[4898]: I0313 13:57:39.952551 4898 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952562 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952571 4898 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952580 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952593 4898 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952601 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952611 4898 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952619 4898 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" 
(UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952629 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952638 4898 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952647 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952658 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952670 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952683 4898 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952694 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952706 4898 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952717 4898 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952728 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952739 4898 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952753 4898 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952765 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952779 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 
13:57:39.952790 4898 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952805 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952818 4898 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952829 4898 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952838 4898 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952848 4898 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952859 4898 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952879 4898 reconciler_common.go:293] 
"Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952891 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952923 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952935 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952949 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952964 4898 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952976 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.952995 4898 reconciler_common.go:293] "Volume detached for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953007 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953020 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953030 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953038 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953047 4898 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953056 4898 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953067 4898 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953076 4898 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953085 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953094 4898 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953103 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953111 4898 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953120 4898 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953128 4898 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" 
DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953137 4898 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953145 4898 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953154 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953162 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953171 4898 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953180 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953188 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953196 4898 reconciler_common.go:293] "Volume detached 
for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953207 4898 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953218 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953229 4898 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953241 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953252 4898 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953264 4898 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953288 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on 
node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953299 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953308 4898 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953318 4898 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953328 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953337 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953346 4898 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953357 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: 
\"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953366 4898 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953374 4898 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953383 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953391 4898 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953399 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953408 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953416 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc 
kubenswrapper[4898]: I0313 13:57:39.953424 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953433 4898 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953441 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953449 4898 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953457 4898 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953465 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953474 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953482 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953491 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953499 4898 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953507 4898 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953515 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953524 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953532 4898 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953542 4898 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 
13:57:39.953550 4898 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953558 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953566 4898 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953575 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:39 crc kubenswrapper[4898]: I0313 13:57:39.953584 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.006836 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.006918 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.006934 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.006953 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.006964 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:40Z","lastTransitionTime":"2026-03-13T13:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.041063 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.057052 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.067832 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 13:57:40 crc kubenswrapper[4898]: W0313 13:57:40.068264 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-bb4404e056ec8640f37be83e005144bfc074c9bdee979e67e580a60d209c935d WatchSource:0}: Error finding container bb4404e056ec8640f37be83e005144bfc074c9bdee979e67e580a60d209c935d: Status 404 returned error can't find the container with id bb4404e056ec8640f37be83e005144bfc074c9bdee979e67e580a60d209c935d Mar 13 13:57:40 crc kubenswrapper[4898]: W0313 13:57:40.088908 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-db6cec27d08f373602b55043f55721f443dc167006efddf537b37281b8349270 WatchSource:0}: Error finding container db6cec27d08f373602b55043f55721f443dc167006efddf537b37281b8349270: Status 404 returned error can't find the container with id db6cec27d08f373602b55043f55721f443dc167006efddf537b37281b8349270 Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.111060 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.111454 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.111472 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.111506 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.111519 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:40Z","lastTransitionTime":"2026-03-13T13:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.213839 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.213867 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.213875 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.213888 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.213911 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:40Z","lastTransitionTime":"2026-03-13T13:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.277945 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"db6cec27d08f373602b55043f55721f443dc167006efddf537b37281b8349270"} Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.279068 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"bb4404e056ec8640f37be83e005144bfc074c9bdee979e67e580a60d209c935d"} Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.280776 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed"} Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.280820 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d979c605efac024151111eac4d3ca28abe678c91e8fc936c9ce3917bfd6bfb74"} Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.316577 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.316625 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.316635 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.316651 4898 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.316661 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:40Z","lastTransitionTime":"2026-03-13T13:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.357026 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:57:40 crc kubenswrapper[4898]: E0313 13:57:40.357285 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:57:41.357248194 +0000 UTC m=+96.358836433 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.418587 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.418632 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.418641 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.418657 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.418666 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:40Z","lastTransitionTime":"2026-03-13T13:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.458083 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.458143 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.458168 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.458319 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:57:40 crc kubenswrapper[4898]: E0313 13:57:40.458349 4898 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 13:57:40 crc kubenswrapper[4898]: E0313 13:57:40.458474 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 13:57:41.458452756 +0000 UTC m=+96.460041005 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 13:57:40 crc kubenswrapper[4898]: E0313 13:57:40.458487 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 13:57:40 crc kubenswrapper[4898]: E0313 13:57:40.458372 4898 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 13:57:40 crc kubenswrapper[4898]: E0313 13:57:40.458531 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 13:57:40 crc kubenswrapper[4898]: E0313 13:57:40.458546 4898 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:40 crc kubenswrapper[4898]: E0313 13:57:40.458546 4898 
projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 13:57:40 crc kubenswrapper[4898]: E0313 13:57:40.458579 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 13:57:40 crc kubenswrapper[4898]: E0313 13:57:40.458591 4898 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:40 crc kubenswrapper[4898]: E0313 13:57:40.458595 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 13:57:41.458573368 +0000 UTC m=+96.460161647 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 13:57:40 crc kubenswrapper[4898]: E0313 13:57:40.458628 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 13:57:41.458613159 +0000 UTC m=+96.460201438 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:40 crc kubenswrapper[4898]: E0313 13:57:40.458653 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 13:57:41.45864155 +0000 UTC m=+96.460229829 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.521145 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.521204 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.521223 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.521248 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.521265 4898 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:40Z","lastTransitionTime":"2026-03-13T13:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.624213 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.624268 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.624286 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.624311 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.624331 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:40Z","lastTransitionTime":"2026-03-13T13:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.726587 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.726647 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.726662 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.726682 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.726697 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:40Z","lastTransitionTime":"2026-03-13T13:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.751859 4898 scope.go:117] "RemoveContainer" containerID="a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3" Mar 13 13:57:40 crc kubenswrapper[4898]: E0313 13:57:40.752085 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.752283 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.829256 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.829296 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.829304 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.829318 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.829328 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:40Z","lastTransitionTime":"2026-03-13T13:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.931883 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.931942 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.931951 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.931966 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:40 crc kubenswrapper[4898]: I0313 13:57:40.931978 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:40Z","lastTransitionTime":"2026-03-13T13:57:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.033731 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.033764 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.033773 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.033787 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.033797 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:41Z","lastTransitionTime":"2026-03-13T13:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.136411 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.136471 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.136489 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.136519 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.136538 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:41Z","lastTransitionTime":"2026-03-13T13:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.238949 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.239025 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.239047 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.239087 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.239110 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:41Z","lastTransitionTime":"2026-03-13T13:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.284806 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22"} Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.286965 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac"} Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.287371 4898 scope.go:117] "RemoveContainer" containerID="a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3" Mar 13 13:57:41 crc kubenswrapper[4898]: E0313 13:57:41.287488 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.306331 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:41Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.322858 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:41Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.335213 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:41Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.341100 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.341139 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.341150 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.341168 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.341180 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:41Z","lastTransitionTime":"2026-03-13T13:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.351322 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:41Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.365983 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:41Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.367177 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:57:41 crc kubenswrapper[4898]: E0313 13:57:41.367365 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:57:43.367339769 +0000 UTC m=+98.368928008 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.378701 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:41Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.394542 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:41Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.409836 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, 
/tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:41Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.423375 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:41Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.436283 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:41Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.443348 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.443387 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.443396 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.443413 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.443425 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:41Z","lastTransitionTime":"2026-03-13T13:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.450537 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:41Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.467219 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:41Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.467799 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" 
(UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.467871 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.467911 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.467959 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:57:41 crc kubenswrapper[4898]: E0313 13:57:41.467999 4898 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 13:57:41 crc kubenswrapper[4898]: E0313 13:57:41.468070 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 13:57:41 crc kubenswrapper[4898]: E0313 13:57:41.468092 4898 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 13:57:41 crc kubenswrapper[4898]: E0313 13:57:41.468103 4898 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:41 crc kubenswrapper[4898]: E0313 13:57:41.468080 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 13:57:43.46806235 +0000 UTC m=+98.469650589 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 13:57:41 crc kubenswrapper[4898]: E0313 13:57:41.468570 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 13:57:43.468556101 +0000 UTC m=+98.470144340 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:41 crc kubenswrapper[4898]: E0313 13:57:41.468601 4898 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 13:57:41 crc kubenswrapper[4898]: E0313 13:57:41.468635 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 13:57:43.468624293 +0000 UTC m=+98.470212532 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 13:57:41 crc kubenswrapper[4898]: E0313 13:57:41.468688 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 13:57:41 crc kubenswrapper[4898]: E0313 13:57:41.468700 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 13:57:41 crc kubenswrapper[4898]: E0313 13:57:41.468711 4898 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:41 crc kubenswrapper[4898]: E0313 13:57:41.468741 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 13:57:43.468734585 +0000 UTC m=+98.470322824 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.488719 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:41Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.499326 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:41Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.546959 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.547027 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.547040 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.547058 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.547085 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:41Z","lastTransitionTime":"2026-03-13T13:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.650023 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.650074 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.650092 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.650120 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.650142 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:41Z","lastTransitionTime":"2026-03-13T13:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.738823 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.739119 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:57:41 crc kubenswrapper[4898]: E0313 13:57:41.739264 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.739529 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:41 crc kubenswrapper[4898]: E0313 13:57:41.739608 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:57:41 crc kubenswrapper[4898]: E0313 13:57:41.739762 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.744428 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.745865 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.748747 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.750634 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.752395 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.752466 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.752485 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.752510 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.752529 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:41Z","lastTransitionTime":"2026-03-13T13:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.753824 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.754652 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.755531 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.757298 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.758536 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.759941 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.761978 4898 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.764189 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.765930 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.766859 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.768526 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.769309 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.770720 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.771428 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.772249 4898 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.773752 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.774536 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.775326 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.776545 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.777633 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.779385 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.780246 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.781646 4898 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.782425 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.783972 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.784668 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.785323 4898 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.785486 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.789170 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.790350 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.792035 
4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.794557 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.795514 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.796827 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.797852 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.799878 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.800582 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.802180 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.802916 
4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.803874 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.804343 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.805411 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.806095 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.807315 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.807802 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.808657 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.809126 
4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.809675 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.810708 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.811212 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.812002 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.854725 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.854779 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.854793 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.854814 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.854829 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:41Z","lastTransitionTime":"2026-03-13T13:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.957697 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.957772 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.957791 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.958135 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:41 crc kubenswrapper[4898]: I0313 13:57:41.958171 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:41Z","lastTransitionTime":"2026-03-13T13:57:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.060349 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.060394 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.060404 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.060429 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.060440 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:42Z","lastTransitionTime":"2026-03-13T13:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.163084 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.163123 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.163132 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.163149 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.163160 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:42Z","lastTransitionTime":"2026-03-13T13:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.265975 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.266018 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.266029 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.266045 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.266056 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:42Z","lastTransitionTime":"2026-03-13T13:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.368389 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.368460 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.368475 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.368497 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.368513 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:42Z","lastTransitionTime":"2026-03-13T13:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.470650 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.470705 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.470721 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.470754 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.470771 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:42Z","lastTransitionTime":"2026-03-13T13:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.574685 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.574773 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.574791 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.574818 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.574837 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:42Z","lastTransitionTime":"2026-03-13T13:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.678424 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.678480 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.678497 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.678521 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.678538 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:42Z","lastTransitionTime":"2026-03-13T13:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.782297 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.782440 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.782477 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.782511 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.782535 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:42Z","lastTransitionTime":"2026-03-13T13:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.885236 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.885315 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.885338 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.885367 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.885392 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:42Z","lastTransitionTime":"2026-03-13T13:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.988045 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.988096 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.988119 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.988142 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:42 crc kubenswrapper[4898]: I0313 13:57:42.988159 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:42Z","lastTransitionTime":"2026-03-13T13:57:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.090327 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.090434 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.090451 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.090477 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.090494 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:43Z","lastTransitionTime":"2026-03-13T13:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.193414 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.193468 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.193482 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.193499 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.193511 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:43Z","lastTransitionTime":"2026-03-13T13:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.294728 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5"} Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.295779 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.295834 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.295855 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.295877 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.295919 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:43Z","lastTransitionTime":"2026-03-13T13:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.315970 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:43Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.340060 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, 
/tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:43Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.359951 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:43Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.380636 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:43Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.384382 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:57:43 crc kubenswrapper[4898]: E0313 13:57:43.384750 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:57:47.384630343 +0000 UTC m=+102.386218622 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.399329 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.399416 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.399434 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.399492 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.399510 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:43Z","lastTransitionTime":"2026-03-13T13:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.411659 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:43Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.428043 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:43Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.446997 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:57:43Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.461617 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:43Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.485616 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.485665 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.485690 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.485712 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:57:43 crc kubenswrapper[4898]: E0313 13:57:43.485839 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 13:57:43 crc kubenswrapper[4898]: E0313 13:57:43.485859 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 13:57:43 crc kubenswrapper[4898]: E0313 13:57:43.485871 4898 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:43 crc kubenswrapper[4898]: E0313 13:57:43.485947 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 13:57:47.485930038 +0000 UTC m=+102.487518277 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:43 crc kubenswrapper[4898]: E0313 13:57:43.485984 4898 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 13:57:43 crc kubenswrapper[4898]: E0313 13:57:43.486103 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 13:57:47.486077561 +0000 UTC m=+102.487665840 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 13:57:43 crc kubenswrapper[4898]: E0313 13:57:43.486145 4898 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 13:57:43 crc kubenswrapper[4898]: E0313 13:57:43.486262 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 13:57:43 crc kubenswrapper[4898]: E0313 13:57:43.486287 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 13:57:43 crc kubenswrapper[4898]: E0313 13:57:43.486308 4898 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:43 crc kubenswrapper[4898]: E0313 13:57:43.486360 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 13:57:47.486331407 +0000 UTC m=+102.487919676 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 13:57:43 crc kubenswrapper[4898]: E0313 13:57:43.486445 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 13:57:47.486430829 +0000 UTC m=+102.488019108 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.502036 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.502085 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.502095 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.502108 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.502117 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:43Z","lastTransitionTime":"2026-03-13T13:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.585839 4898 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.605006 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.605053 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.605069 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.605095 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.605111 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:43Z","lastTransitionTime":"2026-03-13T13:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.713711 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.713778 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.713796 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.713822 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.713840 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:43Z","lastTransitionTime":"2026-03-13T13:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.739205 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.739270 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:57:43 crc kubenswrapper[4898]: E0313 13:57:43.739359 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.739371 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:43 crc kubenswrapper[4898]: E0313 13:57:43.739515 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:57:43 crc kubenswrapper[4898]: E0313 13:57:43.739628 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.816927 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.816978 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.816995 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.817017 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.817034 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:43Z","lastTransitionTime":"2026-03-13T13:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.919259 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.919356 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.919379 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.919405 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.919425 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:43Z","lastTransitionTime":"2026-03-13T13:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.977832 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.977942 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.977955 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.977974 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:43 crc kubenswrapper[4898]: I0313 13:57:43.978018 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:43Z","lastTransitionTime":"2026-03-13T13:57:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:43 crc kubenswrapper[4898]: E0313 13:57:43.996257 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:43Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.000532 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.000576 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.000590 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.000609 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.000628 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:44Z","lastTransitionTime":"2026-03-13T13:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:44 crc kubenswrapper[4898]: E0313 13:57:44.014997 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.018825 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.018869 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.018884 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.018922 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.018936 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:44Z","lastTransitionTime":"2026-03-13T13:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:44 crc kubenswrapper[4898]: E0313 13:57:44.038688 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.043973 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.044054 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.044081 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.044115 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.044141 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:44Z","lastTransitionTime":"2026-03-13T13:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:44 crc kubenswrapper[4898]: E0313 13:57:44.066126 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.070743 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.070839 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.070859 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.070884 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.070947 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:44Z","lastTransitionTime":"2026-03-13T13:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:44 crc kubenswrapper[4898]: E0313 13:57:44.088377 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:44 crc kubenswrapper[4898]: E0313 13:57:44.088628 4898 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.090963 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.091029 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.091069 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.091097 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.091112 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:44Z","lastTransitionTime":"2026-03-13T13:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.194352 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.194412 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.194430 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.194455 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.194477 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:44Z","lastTransitionTime":"2026-03-13T13:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.297253 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.297304 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.297316 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.297334 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.297346 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:44Z","lastTransitionTime":"2026-03-13T13:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.399946 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.400009 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.400025 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.400052 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.400071 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:44Z","lastTransitionTime":"2026-03-13T13:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.502618 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.502674 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.502690 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.502712 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.502730 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:44Z","lastTransitionTime":"2026-03-13T13:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.605361 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.605402 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.605413 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.605432 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.605443 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:44Z","lastTransitionTime":"2026-03-13T13:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.708642 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.708712 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.708734 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.708765 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.708787 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:44Z","lastTransitionTime":"2026-03-13T13:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.811285 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.811344 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.811353 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.811374 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.811386 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:44Z","lastTransitionTime":"2026-03-13T13:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.914392 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.914462 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.914486 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.914516 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:44 crc kubenswrapper[4898]: I0313 13:57:44.914536 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:44Z","lastTransitionTime":"2026-03-13T13:57:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.016978 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.017085 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.017111 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.017138 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.017156 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:45Z","lastTransitionTime":"2026-03-13T13:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.119777 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.119841 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.119863 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.119889 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.119933 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:45Z","lastTransitionTime":"2026-03-13T13:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.262872 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.262922 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.262931 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.262947 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.262960 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:45Z","lastTransitionTime":"2026-03-13T13:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.365638 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.365672 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.365699 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.365745 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.365757 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:45Z","lastTransitionTime":"2026-03-13T13:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.467753 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.467809 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.467825 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.467853 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.467870 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:45Z","lastTransitionTime":"2026-03-13T13:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.571167 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.571229 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.571245 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.571272 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.571292 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:45Z","lastTransitionTime":"2026-03-13T13:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.674975 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.675061 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.675088 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.675117 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.675139 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:45Z","lastTransitionTime":"2026-03-13T13:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.739097 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.739101 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.739327 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:45 crc kubenswrapper[4898]: E0313 13:57:45.739322 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:57:45 crc kubenswrapper[4898]: E0313 13:57:45.739431 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:57:45 crc kubenswrapper[4898]: E0313 13:57:45.739499 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.763462 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c885
6446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.778346 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.778423 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.778447 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.778477 4898 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.778500 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:45Z","lastTransitionTime":"2026-03-13T13:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.782442 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.796965 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.810749 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.831941 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.847350 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.868537 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:57:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.880426 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.880489 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.880504 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.880525 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.880536 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:45Z","lastTransitionTime":"2026-03-13T13:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.883414 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.983377 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.983423 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.983434 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.983452 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:45 crc kubenswrapper[4898]: I0313 13:57:45.983464 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:45Z","lastTransitionTime":"2026-03-13T13:57:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.086089 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.086187 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.086207 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.086677 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.086880 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:46Z","lastTransitionTime":"2026-03-13T13:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.190694 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.190797 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.190868 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.190940 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.190961 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:46Z","lastTransitionTime":"2026-03-13T13:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.294015 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.294084 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.294103 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.294133 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.294151 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:46Z","lastTransitionTime":"2026-03-13T13:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.397659 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.397742 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.397768 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.397797 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.397820 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:46Z","lastTransitionTime":"2026-03-13T13:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.500788 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.500844 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.500856 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.500877 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.500890 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:46Z","lastTransitionTime":"2026-03-13T13:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.604458 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.604503 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.604512 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.604529 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.604540 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:46Z","lastTransitionTime":"2026-03-13T13:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.707429 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.707484 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.707496 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.707512 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.707523 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:46Z","lastTransitionTime":"2026-03-13T13:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.811011 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.811086 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.811103 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.811128 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.811145 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:46Z","lastTransitionTime":"2026-03-13T13:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.914690 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.914753 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.914772 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.914799 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.914816 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:46Z","lastTransitionTime":"2026-03-13T13:57:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:46 crc kubenswrapper[4898]: I0313 13:57:46.917308 4898 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.017825 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.017887 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.017944 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.017971 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.017991 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:47Z","lastTransitionTime":"2026-03-13T13:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.120499 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.120557 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.120573 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.120597 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.120644 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:47Z","lastTransitionTime":"2026-03-13T13:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.222894 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.222975 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.222991 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.223014 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.223031 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:47Z","lastTransitionTime":"2026-03-13T13:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.325298 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.325357 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.325373 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.325395 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.325412 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:47Z","lastTransitionTime":"2026-03-13T13:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.422459 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:57:47 crc kubenswrapper[4898]: E0313 13:57:47.422748 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-13 13:57:55.422729623 +0000 UTC m=+110.424317862 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.427335 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.427456 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.427604 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.427697 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.427783 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:47Z","lastTransitionTime":"2026-03-13T13:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.524370 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.524465 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.524564 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.524621 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:47 crc kubenswrapper[4898]: E0313 13:57:47.524811 4898 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Mar 13 13:57:47 crc kubenswrapper[4898]: E0313 13:57:47.524964 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 13:57:55.524889777 +0000 UTC m=+110.526478056 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 13:57:47 crc kubenswrapper[4898]: E0313 13:57:47.525109 4898 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 13:57:47 crc kubenswrapper[4898]: E0313 13:57:47.525231 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 13:57:47 crc kubenswrapper[4898]: E0313 13:57:47.525336 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 13:57:47 crc kubenswrapper[4898]: E0313 13:57:47.525250 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 13:57:55.525228095 +0000 UTC m=+110.526816334 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 13:57:47 crc kubenswrapper[4898]: E0313 13:57:47.525374 4898 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:47 crc kubenswrapper[4898]: E0313 13:57:47.525491 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 13:57:55.52545706 +0000 UTC m=+110.527045339 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:47 crc kubenswrapper[4898]: E0313 13:57:47.525514 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 13:57:47 crc kubenswrapper[4898]: E0313 13:57:47.525579 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 13:57:47 crc kubenswrapper[4898]: E0313 13:57:47.525602 4898 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:47 crc kubenswrapper[4898]: E0313 13:57:47.525663 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 13:57:55.525645524 +0000 UTC m=+110.527233843 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.530080 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.530112 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.530125 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.530141 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.530154 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:47Z","lastTransitionTime":"2026-03-13T13:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.632921 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.633298 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.633439 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.633604 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.633798 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:47Z","lastTransitionTime":"2026-03-13T13:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.737018 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.737083 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.737105 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.737130 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.737149 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:47Z","lastTransitionTime":"2026-03-13T13:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.738547 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:57:47 crc kubenswrapper[4898]: E0313 13:57:47.738702 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.738713 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:47 crc kubenswrapper[4898]: E0313 13:57:47.738818 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.738561 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:57:47 crc kubenswrapper[4898]: E0313 13:57:47.738949 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.840125 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.840175 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.840192 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.840218 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.840235 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:47Z","lastTransitionTime":"2026-03-13T13:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.942749 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.942829 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.942854 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.942884 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:47 crc kubenswrapper[4898]: I0313 13:57:47.942946 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:47Z","lastTransitionTime":"2026-03-13T13:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.046034 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.046084 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.046095 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.046120 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.046131 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:48Z","lastTransitionTime":"2026-03-13T13:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.148482 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.148521 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.148530 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.148543 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.148554 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:48Z","lastTransitionTime":"2026-03-13T13:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.251174 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.251223 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.251234 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.251253 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.251265 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:48Z","lastTransitionTime":"2026-03-13T13:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.354813 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.354883 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.354940 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.354973 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.354995 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:48Z","lastTransitionTime":"2026-03-13T13:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.417042 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-xpbhb"] Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.417522 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-xpbhb" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.420352 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.420779 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.421566 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.433153 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0427af73-3ee1-4f8b-aa31-915d8ff53e94-hosts-file\") pod \"node-resolver-xpbhb\" (UID: \"0427af73-3ee1-4f8b-aa31-915d8ff53e94\") " pod="openshift-dns/node-resolver-xpbhb" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.433227 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndhq7\" (UniqueName: \"kubernetes.io/projected/0427af73-3ee1-4f8b-aa31-915d8ff53e94-kube-api-access-ndhq7\") pod \"node-resolver-xpbhb\" (UID: \"0427af73-3ee1-4f8b-aa31-915d8ff53e94\") " pod="openshift-dns/node-resolver-xpbhb" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.452578 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:48Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.457667 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.457725 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.457738 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.457758 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.457773 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:48Z","lastTransitionTime":"2026-03-13T13:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.466691 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:48Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.483186 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:57:48Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.503246 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:48Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.515952 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:48Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.534423 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0427af73-3ee1-4f8b-aa31-915d8ff53e94-hosts-file\") pod \"node-resolver-xpbhb\" (UID: \"0427af73-3ee1-4f8b-aa31-915d8ff53e94\") " pod="openshift-dns/node-resolver-xpbhb" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.534486 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndhq7\" (UniqueName: 
\"kubernetes.io/projected/0427af73-3ee1-4f8b-aa31-915d8ff53e94-kube-api-access-ndhq7\") pod \"node-resolver-xpbhb\" (UID: \"0427af73-3ee1-4f8b-aa31-915d8ff53e94\") " pod="openshift-dns/node-resolver-xpbhb" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.534741 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0427af73-3ee1-4f8b-aa31-915d8ff53e94-hosts-file\") pod \"node-resolver-xpbhb\" (UID: \"0427af73-3ee1-4f8b-aa31-915d8ff53e94\") " pod="openshift-dns/node-resolver-xpbhb" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.535592 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c885
6446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:48Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.547465 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:48Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.552476 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndhq7\" (UniqueName: \"kubernetes.io/projected/0427af73-3ee1-4f8b-aa31-915d8ff53e94-kube-api-access-ndhq7\") pod \"node-resolver-xpbhb\" (UID: \"0427af73-3ee1-4f8b-aa31-915d8ff53e94\") " pod="openshift-dns/node-resolver-xpbhb" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.560334 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.560410 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.560423 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.560446 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" 
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.560459 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:48Z","lastTransitionTime":"2026-03-13T13:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.563385 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMoun
ts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:48Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.577760 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:48Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.663034 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.663078 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.663093 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.663110 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.663123 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:48Z","lastTransitionTime":"2026-03-13T13:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.741044 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-xpbhb" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.765210 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.765262 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.765278 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.765309 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.765325 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:48Z","lastTransitionTime":"2026-03-13T13:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.803420 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-6llfs"] Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.803961 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.804142 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-8k6xj"] Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.804696 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-5qb65"] Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.805279 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.805715 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5qb65" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.809016 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.809380 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.809585 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.810860 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.810971 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.811116 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.811141 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.811410 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.811438 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.811764 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.812107 4898 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.812774 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.832936 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:48Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.837456 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-system-cni-dir\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.837492 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27j5r\" (UniqueName: \"kubernetes.io/projected/e527967a-003e-4dbe-aade-d9f882239cb0-kube-api-access-27j5r\") pod \"multus-additional-cni-plugins-5qb65\" (UID: \"e527967a-003e-4dbe-aade-d9f882239cb0\") " pod="openshift-multus/multus-additional-cni-plugins-5qb65" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.837554 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzgmp\" (UniqueName: \"kubernetes.io/projected/e521c857-9711-4f68-886f-38b233d7b05b-kube-api-access-mzgmp\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc 
kubenswrapper[4898]: I0313 13:57:48.837604 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e521c857-9711-4f68-886f-38b233d7b05b-multus-daemon-config\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.837708 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-host-var-lib-cni-multus\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.837777 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-hostroot\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.837846 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-multus-cni-dir\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.838227 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-os-release\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.838313 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/767eecef-3bc9-4db4-a0cb-5d9c8554c62d-mcd-auth-proxy-config\") pod \"machine-config-daemon-8k6xj\" (UID: \"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\") " pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.838366 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcf5f\" (UniqueName: \"kubernetes.io/projected/767eecef-3bc9-4db4-a0cb-5d9c8554c62d-kube-api-access-kcf5f\") pod \"machine-config-daemon-8k6xj\" (UID: \"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\") " pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.838409 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e521c857-9711-4f68-886f-38b233d7b05b-cni-binary-copy\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.838452 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-host-var-lib-kubelet\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.838503 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e527967a-003e-4dbe-aade-d9f882239cb0-cnibin\") pod \"multus-additional-cni-plugins-5qb65\" (UID: \"e527967a-003e-4dbe-aade-d9f882239cb0\") " pod="openshift-multus/multus-additional-cni-plugins-5qb65" Mar 13 
13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.838557 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/767eecef-3bc9-4db4-a0cb-5d9c8554c62d-rootfs\") pod \"machine-config-daemon-8k6xj\" (UID: \"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\") " pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.838603 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-host-run-netns\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.838655 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-cnibin\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.838703 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-etc-kubernetes\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.838745 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/767eecef-3bc9-4db4-a0cb-5d9c8554c62d-proxy-tls\") pod \"machine-config-daemon-8k6xj\" (UID: \"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\") " pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 13:57:48 crc kubenswrapper[4898]: 
I0313 13:57:48.838818 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-multus-socket-dir-parent\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.838890 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-host-run-k8s-cni-cncf-io\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.839038 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-host-var-lib-cni-bin\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.839104 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e527967a-003e-4dbe-aade-d9f882239cb0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5qb65\" (UID: \"e527967a-003e-4dbe-aade-d9f882239cb0\") " pod="openshift-multus/multus-additional-cni-plugins-5qb65" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.839160 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-host-run-multus-certs\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 
13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.839222 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e527967a-003e-4dbe-aade-d9f882239cb0-os-release\") pod \"multus-additional-cni-plugins-5qb65\" (UID: \"e527967a-003e-4dbe-aade-d9f882239cb0\") " pod="openshift-multus/multus-additional-cni-plugins-5qb65" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.839382 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-multus-conf-dir\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.839422 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e527967a-003e-4dbe-aade-d9f882239cb0-cni-binary-copy\") pod \"multus-additional-cni-plugins-5qb65\" (UID: \"e527967a-003e-4dbe-aade-d9f882239cb0\") " pod="openshift-multus/multus-additional-cni-plugins-5qb65" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.839487 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e527967a-003e-4dbe-aade-d9f882239cb0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5qb65\" (UID: \"e527967a-003e-4dbe-aade-d9f882239cb0\") " pod="openshift-multus/multus-additional-cni-plugins-5qb65" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.839545 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e527967a-003e-4dbe-aade-d9f882239cb0-system-cni-dir\") pod \"multus-additional-cni-plugins-5qb65\" (UID: 
\"e527967a-003e-4dbe-aade-d9f882239cb0\") " pod="openshift-multus/multus-additional-cni-plugins-5qb65" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.850863 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c885
6446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:48Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.863984 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:48Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.869369 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.869403 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.869414 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.869430 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.869444 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:48Z","lastTransitionTime":"2026-03-13T13:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.880005 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:48Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.893819 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:48Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.917702 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:48Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.934492 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:48Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940036 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-os-release\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940080 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcf5f\" (UniqueName: 
\"kubernetes.io/projected/767eecef-3bc9-4db4-a0cb-5d9c8554c62d-kube-api-access-kcf5f\") pod \"machine-config-daemon-8k6xj\" (UID: \"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\") " pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940098 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e521c857-9711-4f68-886f-38b233d7b05b-cni-binary-copy\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940117 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-host-var-lib-kubelet\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940132 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e527967a-003e-4dbe-aade-d9f882239cb0-cnibin\") pod \"multus-additional-cni-plugins-5qb65\" (UID: \"e527967a-003e-4dbe-aade-d9f882239cb0\") " pod="openshift-multus/multus-additional-cni-plugins-5qb65" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940147 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/767eecef-3bc9-4db4-a0cb-5d9c8554c62d-mcd-auth-proxy-config\") pod \"machine-config-daemon-8k6xj\" (UID: \"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\") " pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940161 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/767eecef-3bc9-4db4-a0cb-5d9c8554c62d-rootfs\") pod \"machine-config-daemon-8k6xj\" (UID: \"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\") " pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940177 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-host-run-netns\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940191 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-cnibin\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940206 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-etc-kubernetes\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940221 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-multus-socket-dir-parent\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs" Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940240 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-host-run-k8s-cni-cncf-io\") pod \"multus-6llfs\" (UID: 
\"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940256 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-host-var-lib-cni-bin\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940320 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e527967a-003e-4dbe-aade-d9f882239cb0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5qb65\" (UID: \"e527967a-003e-4dbe-aade-d9f882239cb0\") " pod="openshift-multus/multus-additional-cni-plugins-5qb65"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940355 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/767eecef-3bc9-4db4-a0cb-5d9c8554c62d-proxy-tls\") pod \"machine-config-daemon-8k6xj\" (UID: \"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\") " pod="openshift-machine-config-operator/machine-config-daemon-8k6xj"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940376 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-host-run-multus-certs\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940394 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e527967a-003e-4dbe-aade-d9f882239cb0-os-release\") pod \"multus-additional-cni-plugins-5qb65\" (UID: \"e527967a-003e-4dbe-aade-d9f882239cb0\") " pod="openshift-multus/multus-additional-cni-plugins-5qb65"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940414 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-multus-conf-dir\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940433 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e527967a-003e-4dbe-aade-d9f882239cb0-cni-binary-copy\") pod \"multus-additional-cni-plugins-5qb65\" (UID: \"e527967a-003e-4dbe-aade-d9f882239cb0\") " pod="openshift-multus/multus-additional-cni-plugins-5qb65"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940465 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e527967a-003e-4dbe-aade-d9f882239cb0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5qb65\" (UID: \"e527967a-003e-4dbe-aade-d9f882239cb0\") " pod="openshift-multus/multus-additional-cni-plugins-5qb65"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940501 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e527967a-003e-4dbe-aade-d9f882239cb0-system-cni-dir\") pod \"multus-additional-cni-plugins-5qb65\" (UID: \"e527967a-003e-4dbe-aade-d9f882239cb0\") " pod="openshift-multus/multus-additional-cni-plugins-5qb65"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940526 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27j5r\" (UniqueName: \"kubernetes.io/projected/e527967a-003e-4dbe-aade-d9f882239cb0-kube-api-access-27j5r\") pod \"multus-additional-cni-plugins-5qb65\" (UID: \"e527967a-003e-4dbe-aade-d9f882239cb0\") " pod="openshift-multus/multus-additional-cni-plugins-5qb65"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940544 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-system-cni-dir\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940567 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzgmp\" (UniqueName: \"kubernetes.io/projected/e521c857-9711-4f68-886f-38b233d7b05b-kube-api-access-mzgmp\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940590 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e521c857-9711-4f68-886f-38b233d7b05b-multus-daemon-config\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940606 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-hostroot\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940623 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-multus-cni-dir\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940638 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-host-var-lib-cni-multus\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940709 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-host-var-lib-cni-multus\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.940788 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-os-release\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.941038 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-multus-socket-dir-parent\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.941035 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-host-run-netns\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.941095 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-host-run-multus-certs\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.941111 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-cnibin\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.941143 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-etc-kubernetes\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.941169 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e527967a-003e-4dbe-aade-d9f882239cb0-os-release\") pod \"multus-additional-cni-plugins-5qb65\" (UID: \"e527967a-003e-4dbe-aade-d9f882239cb0\") " pod="openshift-multus/multus-additional-cni-plugins-5qb65"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.941170 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-host-var-lib-cni-bin\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.941190 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-host-run-k8s-cni-cncf-io\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.941218 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-multus-conf-dir\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.941630 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e521c857-9711-4f68-886f-38b233d7b05b-cni-binary-copy\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.941744 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-host-var-lib-kubelet\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.941776 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e527967a-003e-4dbe-aade-d9f882239cb0-cnibin\") pod \"multus-additional-cni-plugins-5qb65\" (UID: \"e527967a-003e-4dbe-aade-d9f882239cb0\") " pod="openshift-multus/multus-additional-cni-plugins-5qb65"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.941998 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/767eecef-3bc9-4db4-a0cb-5d9c8554c62d-rootfs\") pod \"machine-config-daemon-8k6xj\" (UID: \"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\") " pod="openshift-machine-config-operator/machine-config-daemon-8k6xj"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.942023 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e527967a-003e-4dbe-aade-d9f882239cb0-system-cni-dir\") pod \"multus-additional-cni-plugins-5qb65\" (UID: \"e527967a-003e-4dbe-aade-d9f882239cb0\") " pod="openshift-multus/multus-additional-cni-plugins-5qb65"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.942320 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-hostroot\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.942363 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e527967a-003e-4dbe-aade-d9f882239cb0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5qb65\" (UID: \"e527967a-003e-4dbe-aade-d9f882239cb0\") " pod="openshift-multus/multus-additional-cni-plugins-5qb65"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.942386 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-system-cni-dir\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.942375 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/767eecef-3bc9-4db4-a0cb-5d9c8554c62d-mcd-auth-proxy-config\") pod \"machine-config-daemon-8k6xj\" (UID: \"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\") " pod="openshift-machine-config-operator/machine-config-daemon-8k6xj"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.942433 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e521c857-9711-4f68-886f-38b233d7b05b-multus-cni-dir\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.942633 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e527967a-003e-4dbe-aade-d9f882239cb0-cni-binary-copy\") pod \"multus-additional-cni-plugins-5qb65\" (UID: \"e527967a-003e-4dbe-aade-d9f882239cb0\") " pod="openshift-multus/multus-additional-cni-plugins-5qb65"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.942646 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e521c857-9711-4f68-886f-38b233d7b05b-multus-daemon-config\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.946779 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/767eecef-3bc9-4db4-a0cb-5d9c8554c62d-proxy-tls\") pod \"machine-config-daemon-8k6xj\" (UID: \"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\") " pod="openshift-machine-config-operator/machine-config-daemon-8k6xj"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.950165 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:48Z is after 2025-08-24T17:21:41Z"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.954787 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e527967a-003e-4dbe-aade-d9f882239cb0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5qb65\" (UID: \"e527967a-003e-4dbe-aade-d9f882239cb0\") " pod="openshift-multus/multus-additional-cni-plugins-5qb65"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.959643 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcf5f\" (UniqueName: \"kubernetes.io/projected/767eecef-3bc9-4db4-a0cb-5d9c8554c62d-kube-api-access-kcf5f\") pod \"machine-config-daemon-8k6xj\" (UID: \"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\") " pod="openshift-machine-config-operator/machine-config-daemon-8k6xj"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.960267 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27j5r\" (UniqueName: \"kubernetes.io/projected/e527967a-003e-4dbe-aade-d9f882239cb0-kube-api-access-27j5r\") pod \"multus-additional-cni-plugins-5qb65\" (UID: \"e527967a-003e-4dbe-aade-d9f882239cb0\") " pod="openshift-multus/multus-additional-cni-plugins-5qb65"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.964914 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzgmp\" (UniqueName: \"kubernetes.io/projected/e521c857-9711-4f68-886f-38b233d7b05b-kube-api-access-mzgmp\") pod \"multus-6llfs\" (UID: \"e521c857-9711-4f68-886f-38b233d7b05b\") " pod="openshift-multus/multus-6llfs"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.968230 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:48Z is after 2025-08-24T17:21:41Z"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.971838 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.971924 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.971943 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.971966 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.971982 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:48Z","lastTransitionTime":"2026-03-13T13:57:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.981056 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:48Z is after 2025-08-24T17:21:41Z"
Mar 13 13:57:48 crc kubenswrapper[4898]: I0313 13:57:48.996497 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:48Z is after 2025-08-24T17:21:41Z"
Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.012691 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z"
Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.048009 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.058184 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.070240 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c885
6446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.074103 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.074144 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.074157 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.074175 4898 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.074186 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:49Z","lastTransitionTime":"2026-03-13T13:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.083102 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.094483 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.105591 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.117404 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.128637 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.133384 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.140189 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: W0313 13:57:49.143486 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod767eecef_3bc9_4db4_a0cb_5d9c8554c62d.slice/crio-7a5d54afb2c298e1aa4a9af903c6e73f9527913eda210875caefada7de43e746 WatchSource:0}: Error finding container 7a5d54afb2c298e1aa4a9af903c6e73f9527913eda210875caefada7de43e746: Status 404 returned error can't find the container with id 7a5d54afb2c298e1aa4a9af903c6e73f9527913eda210875caefada7de43e746 Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 
13:57:49.146497 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5qb65" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.153225 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.154279 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-6llfs" Mar 13 13:57:49 crc kubenswrapper[4898]: W0313 13:57:49.158918 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode527967a_003e_4dbe_aade_d9f882239cb0.slice/crio-b2e9795f336934fef1373b0978db654178a41d18411372f547a148cefc86f61f WatchSource:0}: Error finding container b2e9795f336934fef1373b0978db654178a41d18411372f547a148cefc86f61f: Status 404 returned error can't find the container with id b2e9795f336934fef1373b0978db654178a41d18411372f547a148cefc86f61f Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.176329 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.176392 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.176408 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.176432 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.176450 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:49Z","lastTransitionTime":"2026-03-13T13:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.184776 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qqqs5"] Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.186162 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.188284 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.188462 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.188491 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.188714 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.188783 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.189152 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.189244 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.200981 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.212349 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.232034 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.244215 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-run-ovn\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.244293 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-env-overrides\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.244322 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-ovn-node-metrics-cert\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.244360 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-run-openvswitch\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.244381 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-var-lib-openvswitch\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.244415 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-systemd-units\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.244437 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-slash\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.244460 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-log-socket\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.244488 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-etc-openvswitch\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.244507 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-node-log\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.244530 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-cni-netd\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.244559 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-cni-bin\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.244578 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-ovnkube-script-lib\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.244598 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-run-systemd\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.244619 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-kubelet\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.244642 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-run-netns\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.244664 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-run-ovn-kubernetes\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.244684 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-ovnkube-config\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.244705 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc944\" (UniqueName: \"kubernetes.io/projected/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-kube-api-access-tc944\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.244725 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.254396 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.281934 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.281973 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.281988 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.282006 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.282020 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:49Z","lastTransitionTime":"2026-03-13T13:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.290708 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.315734 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.318378 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-xpbhb" event={"ID":"0427af73-3ee1-4f8b-aa31-915d8ff53e94","Type":"ContainerStarted","Data":"dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab"} Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.318548 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-xpbhb" event={"ID":"0427af73-3ee1-4f8b-aa31-915d8ff53e94","Type":"ContainerStarted","Data":"e47906de971555f60254b72cc3296db77b315aa8afb69dc2bdc11926d7fe4f38"} Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.319763 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6llfs" event={"ID":"e521c857-9711-4f68-886f-38b233d7b05b","Type":"ContainerStarted","Data":"c012adc2c459677f7c64d2810bccb2824067ed9f0356d0d528ffe20e674f8d93"} Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.320730 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"7a5d54afb2c298e1aa4a9af903c6e73f9527913eda210875caefada7de43e746"} Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.321694 4898 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" event={"ID":"e527967a-003e-4dbe-aade-d9f882239cb0","Type":"ContainerStarted","Data":"b2e9795f336934fef1373b0978db654178a41d18411372f547a148cefc86f61f"} Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.335100 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"s
tatic-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb668
91ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b5
4b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{
\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345204 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-run-ovn\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345251 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-env-overrides\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345279 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-ovn-node-metrics-cert\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345314 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-run-openvswitch\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345327 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-run-ovn\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345394 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-systemd-units\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345347 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-systemd-units\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345445 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-slash\") pod 
\"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345447 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-run-openvswitch\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345465 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-var-lib-openvswitch\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345497 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-var-lib-openvswitch\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345521 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-log-socket\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345551 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-log-socket\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345557 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-etc-openvswitch\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345587 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-etc-openvswitch\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345588 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-node-log\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345622 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-cni-netd\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345529 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-slash\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345656 
4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-cni-bin\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345658 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-cni-netd\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345622 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-node-log\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345703 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-cni-bin\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345787 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-ovnkube-script-lib\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345854 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-run-systemd\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345879 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-kubelet\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345954 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-run-systemd\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345970 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-kubelet\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.345998 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-env-overrides\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.346030 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-run-netns\") pod \"ovnkube-node-qqqs5\" (UID: 
\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.346066 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-run-netns\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.346092 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-run-ovn-kubernetes\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.346110 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-ovnkube-config\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.346124 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tc944\" (UniqueName: \"kubernetes.io/projected/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-kube-api-access-tc944\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.346144 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qqqs5\" 
(UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.346209 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-run-ovn-kubernetes\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.346317 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.346388 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-ovnkube-script-lib\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.347154 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-ovnkube-config\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.347963 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.351463 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-ovn-node-metrics-cert\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.362709 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc944\" (UniqueName: \"kubernetes.io/projected/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-kube-api-access-tc944\") pod \"ovnkube-node-qqqs5\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.363786 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c885
6446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.379905 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.386586 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.386629 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.386641 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.386659 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.386673 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:49Z","lastTransitionTime":"2026-03-13T13:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.393464 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.406623 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.424112 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.489531 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.489577 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.489586 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.489600 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.489610 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:49Z","lastTransitionTime":"2026-03-13T13:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.512872 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:49 crc kubenswrapper[4898]: W0313 13:57:49.553727 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7d6afc0_d9b5_41b2_a55f_57621c300cbb.slice/crio-064d66ce778a8d0d979727a052c6e1249a726f86c9609bd927debcbbf5923b70 WatchSource:0}: Error finding container 064d66ce778a8d0d979727a052c6e1249a726f86c9609bd927debcbbf5923b70: Status 404 returned error can't find the container with id 064d66ce778a8d0d979727a052c6e1249a726f86c9609bd927debcbbf5923b70 Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.593283 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.593326 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.593336 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.593373 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.593383 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:49Z","lastTransitionTime":"2026-03-13T13:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.695462 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.695502 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.695514 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.695535 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.695544 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:49Z","lastTransitionTime":"2026-03-13T13:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.739350 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:57:49 crc kubenswrapper[4898]: E0313 13:57:49.739537 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.739599 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:57:49 crc kubenswrapper[4898]: E0313 13:57:49.739805 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.740014 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:49 crc kubenswrapper[4898]: E0313 13:57:49.740787 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.798271 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.798312 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.798321 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.798343 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.798355 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:49Z","lastTransitionTime":"2026-03-13T13:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.900999 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.901494 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.901507 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.901523 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:49 crc kubenswrapper[4898]: I0313 13:57:49.901535 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:49Z","lastTransitionTime":"2026-03-13T13:57:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.006010 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.006053 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.006064 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.006082 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.006094 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:50Z","lastTransitionTime":"2026-03-13T13:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.108356 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.108566 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.108626 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.108686 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.108769 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:50Z","lastTransitionTime":"2026-03-13T13:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.211814 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.211857 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.211869 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.211888 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.211920 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:50Z","lastTransitionTime":"2026-03-13T13:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.314234 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.314275 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.314283 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.314299 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.314309 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:50Z","lastTransitionTime":"2026-03-13T13:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.327431 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6llfs" event={"ID":"e521c857-9711-4f68-886f-38b233d7b05b","Type":"ContainerStarted","Data":"de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f"} Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.329875 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522"} Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.329962 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56"} Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.331432 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" event={"ID":"e527967a-003e-4dbe-aade-d9f882239cb0","Type":"ContainerStarted","Data":"04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8"} Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.333347 4898 generic.go:334] "Generic (PLEG): container finished" podID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerID="dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786" exitCode=0 Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.333469 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerDied","Data":"dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786"} Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.333572 4898 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerStarted","Data":"064d66ce778a8d0d979727a052c6e1249a726f86c9609bd927debcbbf5923b70"} Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.349203 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.367029 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.383092 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.397837 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.410466 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.417213 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.417254 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.417265 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.417282 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.417294 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:50Z","lastTransitionTime":"2026-03-13T13:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.426455 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.449081 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.466160 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.481535 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.499156 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 
builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f19
49572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.515729 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.520689 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.520734 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.520749 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.520772 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.520789 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:50Z","lastTransitionTime":"2026-03-13T13:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.531319 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.545840 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.558780 4898 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.570767 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.586201 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.608187 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.623727 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.623766 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.623775 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.623791 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.623801 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:50Z","lastTransitionTime":"2026-03-13T13:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.627335 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c885
6446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.645206 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.657759 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.672177 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.688611 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.706107 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.723529 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.726536 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.726607 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.726635 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.726665 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.726684 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:50Z","lastTransitionTime":"2026-03-13T13:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.741689 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.761044 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9009227
2e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\"
,\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5
c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:50Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.829990 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.830031 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.830042 4898 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.830063 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.830075 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:50Z","lastTransitionTime":"2026-03-13T13:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.933379 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.933462 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.933489 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.933526 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:50 crc kubenswrapper[4898]: I0313 13:57:50.933552 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:50Z","lastTransitionTime":"2026-03-13T13:57:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.036333 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.036415 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.036438 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.036463 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.036482 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:51Z","lastTransitionTime":"2026-03-13T13:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.139696 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.139745 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.139757 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.139781 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.139794 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:51Z","lastTransitionTime":"2026-03-13T13:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.241967 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.242027 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.242043 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.242063 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.242078 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:51Z","lastTransitionTime":"2026-03-13T13:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.338197 4898 generic.go:334] "Generic (PLEG): container finished" podID="e527967a-003e-4dbe-aade-d9f882239cb0" containerID="04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8" exitCode=0 Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.338271 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" event={"ID":"e527967a-003e-4dbe-aade-d9f882239cb0","Type":"ContainerDied","Data":"04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8"} Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.343867 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.343953 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.343967 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.343984 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.343995 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:51Z","lastTransitionTime":"2026-03-13T13:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.345764 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerStarted","Data":"d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2"} Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.345867 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerStarted","Data":"d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23"} Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.345881 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerStarted","Data":"3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61"} Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.345912 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerStarted","Data":"0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb"} Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.345926 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerStarted","Data":"7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345"} Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.345937 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerStarted","Data":"14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453"} Mar 13 13:57:51 crc kubenswrapper[4898]: 
I0313 13:57:51.358573 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":
\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:51Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.372412 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318b
deaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:51Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.387373 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:51Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.400538 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:57:51Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.423939 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:51Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.440286 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:51Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.447548 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.447578 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.447587 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.447621 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.447632 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:51Z","lastTransitionTime":"2026-03-13T13:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.452440 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:51Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.472763 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9009227
2e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\"
,\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5
c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:51Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.489525 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:51Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.501167 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:51Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.515623 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:51Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.534206 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:51Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.548423 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c885
6446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:51Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.550056 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.550091 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.550105 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.550163 4898 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.550181 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:51Z","lastTransitionTime":"2026-03-13T13:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.653444 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.653491 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.653501 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.653517 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.653528 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:51Z","lastTransitionTime":"2026-03-13T13:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.738814 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:57:51 crc kubenswrapper[4898]: E0313 13:57:51.739096 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.739695 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:57:51 crc kubenswrapper[4898]: E0313 13:57:51.739809 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.739984 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:51 crc kubenswrapper[4898]: E0313 13:57:51.740107 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.756931 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.756998 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.757023 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.757100 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.757130 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:51Z","lastTransitionTime":"2026-03-13T13:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.860262 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.860321 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.860340 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.860363 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.860380 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:51Z","lastTransitionTime":"2026-03-13T13:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.963338 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.963779 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.963795 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.963842 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:51 crc kubenswrapper[4898]: I0313 13:57:51.963856 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:51Z","lastTransitionTime":"2026-03-13T13:57:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.066730 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.066810 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.066832 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.066862 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.066880 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:52Z","lastTransitionTime":"2026-03-13T13:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.172997 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.173066 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.173087 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.173111 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.173127 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:52Z","lastTransitionTime":"2026-03-13T13:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.276222 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.276313 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.276333 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.276355 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.276370 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:52Z","lastTransitionTime":"2026-03-13T13:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.351608 4898 generic.go:334] "Generic (PLEG): container finished" podID="e527967a-003e-4dbe-aade-d9f882239cb0" containerID="52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc" exitCode=0 Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.351675 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" event={"ID":"e527967a-003e-4dbe-aade-d9f882239cb0","Type":"ContainerDied","Data":"52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc"} Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.373770 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\
\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:52Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.379055 4898 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.379097 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.379111 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.379133 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.379146 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:52Z","lastTransitionTime":"2026-03-13T13:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.398872 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:52Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.423986 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:52Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.445980 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:52Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.463380 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c885
6446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:52Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.479705 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:52Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.482145 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.482221 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.482235 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.482258 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.482272 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:52Z","lastTransitionTime":"2026-03-13T13:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.495854 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:52Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.511700 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:52Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.531507 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:52Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.546402 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:52Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.562774 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:57:52Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.578568 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:52Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.585212 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.585253 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.585264 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.585286 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.585298 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:52Z","lastTransitionTime":"2026-03-13T13:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.593524 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:52Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.688353 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.688703 4898 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.688725 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.688754 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.688770 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:52Z","lastTransitionTime":"2026-03-13T13:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.739859 4898 scope.go:117] "RemoveContainer" containerID="a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.792768 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.792821 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.792834 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.792859 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.792875 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:52Z","lastTransitionTime":"2026-03-13T13:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.898224 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.898290 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.898309 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.898346 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:52 crc kubenswrapper[4898]: I0313 13:57:52.898366 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:52Z","lastTransitionTime":"2026-03-13T13:57:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.000741 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.000778 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.000790 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.000809 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.000822 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:53Z","lastTransitionTime":"2026-03-13T13:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.103822 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.103884 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.103931 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.103958 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.103978 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:53Z","lastTransitionTime":"2026-03-13T13:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.207747 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.207810 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.207822 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.207842 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.207854 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:53Z","lastTransitionTime":"2026-03-13T13:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.310790 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.310840 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.310853 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.310872 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.310884 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:53Z","lastTransitionTime":"2026-03-13T13:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.360699 4898 generic.go:334] "Generic (PLEG): container finished" podID="e527967a-003e-4dbe-aade-d9f882239cb0" containerID="f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4" exitCode=0 Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.360846 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" event={"ID":"e527967a-003e-4dbe-aade-d9f882239cb0","Type":"ContainerDied","Data":"f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4"} Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.364563 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.370280 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9"} Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.372747 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.384025 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.407424 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.414645 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.414690 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.414702 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.414724 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.414737 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:53Z","lastTransitionTime":"2026-03-13T13:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.433832 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.452221 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.469363 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c885
6446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.484618 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.501559 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.517670 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.519932 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:53 crc 
kubenswrapper[4898]: I0313 13:57:53.519984 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.519995 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.520016 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.520029 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:53Z","lastTransitionTime":"2026-03-13T13:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.537927 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.551601 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.563055 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.574796 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.586429 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a
639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.609082 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mo
untPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.623323 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.623382 4898 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.623398 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.623424 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.623440 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:53Z","lastTransitionTime":"2026-03-13T13:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.627042 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.648052 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses
\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.664386 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.682765 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.698859 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.715667 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.726489 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.726534 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.726548 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.726569 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.726582 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:53Z","lastTransitionTime":"2026-03-13T13:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.731031 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:
57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.738533 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.738620 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:53 crc kubenswrapper[4898]: E0313 13:57:53.738661 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.738620 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:57:53 crc kubenswrapper[4898]: E0313 13:57:53.738801 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:57:53 crc kubenswrapper[4898]: E0313 13:57:53.738872 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.753021 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\
\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.774058 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.789216 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.804495 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.816891 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a
639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:53Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.829122 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.829174 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.829187 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.829210 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.829224 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:53Z","lastTransitionTime":"2026-03-13T13:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.932707 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.932755 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.932765 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.932783 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:53 crc kubenswrapper[4898]: I0313 13:57:53.932792 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:53Z","lastTransitionTime":"2026-03-13T13:57:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.036558 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.036622 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.036643 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.036667 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.036689 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:54Z","lastTransitionTime":"2026-03-13T13:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.139022 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.139075 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.139089 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.139112 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.139129 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:54Z","lastTransitionTime":"2026-03-13T13:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.240094 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.240150 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.240164 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.240199 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.240215 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:54Z","lastTransitionTime":"2026-03-13T13:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:54 crc kubenswrapper[4898]: E0313 13:57:54.254314 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:54Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.258858 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.258911 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.258925 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.258945 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.258956 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:54Z","lastTransitionTime":"2026-03-13T13:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:54 crc kubenswrapper[4898]: E0313 13:57:54.274591 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:54Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.278585 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.278631 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.278640 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.278659 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.278670 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:54Z","lastTransitionTime":"2026-03-13T13:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:54 crc kubenswrapper[4898]: E0313 13:57:54.295078 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:54Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.299087 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.299139 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.299151 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.299172 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.299186 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:54Z","lastTransitionTime":"2026-03-13T13:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:54 crc kubenswrapper[4898]: E0313 13:57:54.313625 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:54Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.318741 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.318812 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.318826 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.318850 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.318869 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:54Z","lastTransitionTime":"2026-03-13T13:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:54 crc kubenswrapper[4898]: E0313 13:57:54.334047 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:54Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:54 crc kubenswrapper[4898]: E0313 13:57:54.334288 4898 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.336172 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.336216 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.336231 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.336253 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.336273 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:54Z","lastTransitionTime":"2026-03-13T13:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.377528 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerStarted","Data":"86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0"} Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.380293 4898 generic.go:334] "Generic (PLEG): container finished" podID="e527967a-003e-4dbe-aade-d9f882239cb0" containerID="ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c" exitCode=0 Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.380336 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" event={"ID":"e527967a-003e-4dbe-aade-d9f882239cb0","Type":"ContainerDied","Data":"ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c"} Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.408052 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:54Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.439416 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.439457 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.439469 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.439489 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.439522 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:54Z","lastTransitionTime":"2026-03-13T13:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.439742 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:54Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.455086 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:54Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.467308 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:54Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.481936 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:54Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.500173 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:54Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.519268 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:54Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.532714 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:57:54Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.543475 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.543545 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.543575 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.543642 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.543658 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:54Z","lastTransitionTime":"2026-03-13T13:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.544384 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:54Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.556543 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:54Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.587999 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:54Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.618208 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:54Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.638921 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:54Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.646583 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.646619 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.646629 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.646645 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.646655 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:54Z","lastTransitionTime":"2026-03-13T13:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.750198 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.750248 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.750257 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.750276 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.750286 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:54Z","lastTransitionTime":"2026-03-13T13:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.853158 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.853215 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.853234 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.853260 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.853274 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:54Z","lastTransitionTime":"2026-03-13T13:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.956273 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.956346 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.956369 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.956398 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:54 crc kubenswrapper[4898]: I0313 13:57:54.956441 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:54Z","lastTransitionTime":"2026-03-13T13:57:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.059759 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.059855 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.059879 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.059961 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.059986 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:55Z","lastTransitionTime":"2026-03-13T13:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.163048 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.163547 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.163751 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.163991 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.164160 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:55Z","lastTransitionTime":"2026-03-13T13:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.205581 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-b46ld"] Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.206447 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-b46ld" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.209802 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.210329 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.211749 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.212129 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.224321 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.248060 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.267919 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.267975 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.267990 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.268013 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.268026 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:55Z","lastTransitionTime":"2026-03-13T13:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.270449 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.306205 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-r
esources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.318311 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgr4p\" (UniqueName: \"kubernetes.io/projected/a1f79182-c06d-47d7-bed8-109c0cc4784e-kube-api-access-fgr4p\") pod \"node-ca-b46ld\" (UID: \"a1f79182-c06d-47d7-bed8-109c0cc4784e\") " pod="openshift-image-registry/node-ca-b46ld" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.318739 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a1f79182-c06d-47d7-bed8-109c0cc4784e-host\") pod \"node-ca-b46ld\" (UID: \"a1f79182-c06d-47d7-bed8-109c0cc4784e\") " pod="openshift-image-registry/node-ca-b46ld" Mar 13 13:57:55 crc 
kubenswrapper[4898]: I0313 13:57:55.318973 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a1f79182-c06d-47d7-bed8-109c0cc4784e-serviceca\") pod \"node-ca-b46ld\" (UID: \"a1f79182-c06d-47d7-bed8-109c0cc4784e\") " pod="openshift-image-registry/node-ca-b46ld" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.329770 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.348146 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.366156 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.371730 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:55 crc 
kubenswrapper[4898]: I0313 13:57:55.371814 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.371841 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.371873 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.371895 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:55Z","lastTransitionTime":"2026-03-13T13:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.388581 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" event={"ID":"e527967a-003e-4dbe-aade-d9f882239cb0","Type":"ContainerStarted","Data":"54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5"} Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.399794 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.420670 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgr4p\" (UniqueName: \"kubernetes.io/projected/a1f79182-c06d-47d7-bed8-109c0cc4784e-kube-api-access-fgr4p\") pod \"node-ca-b46ld\" (UID: \"a1f79182-c06d-47d7-bed8-109c0cc4784e\") " pod="openshift-image-registry/node-ca-b46ld" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.421341 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a1f79182-c06d-47d7-bed8-109c0cc4784e-host\") pod \"node-ca-b46ld\" (UID: \"a1f79182-c06d-47d7-bed8-109c0cc4784e\") " pod="openshift-image-registry/node-ca-b46ld" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.421414 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a1f79182-c06d-47d7-bed8-109c0cc4784e-host\") pod \"node-ca-b46ld\" (UID: \"a1f79182-c06d-47d7-bed8-109c0cc4784e\") " pod="openshift-image-registry/node-ca-b46ld" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.421679 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/a1f79182-c06d-47d7-bed8-109c0cc4784e-serviceca\") pod \"node-ca-b46ld\" (UID: \"a1f79182-c06d-47d7-bed8-109c0cc4784e\") " pod="openshift-image-registry/node-ca-b46ld" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.423208 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a1f79182-c06d-47d7-bed8-109c0cc4784e-serviceca\") pod \"node-ca-b46ld\" (UID: \"a1f79182-c06d-47d7-bed8-109c0cc4784e\") " pod="openshift-image-registry/node-ca-b46ld" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.424753 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.444851 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.447948 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgr4p\" (UniqueName: \"kubernetes.io/projected/a1f79182-c06d-47d7-bed8-109c0cc4784e-kube-api-access-fgr4p\") pod \"node-ca-b46ld\" (UID: \"a1f79182-c06d-47d7-bed8-109c0cc4784e\") " 
pod="openshift-image-registry/node-ca-b46ld" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.460251 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-con
fig\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.474320 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.474366 4898 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.474376 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.474391 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.474403 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:55Z","lastTransitionTime":"2026-03-13T13:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.479794 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.493133 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.505456 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.522369 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:57:55 crc kubenswrapper[4898]: E0313 13:57:55.522536 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:58:11.522506026 +0000 UTC m=+126.524094315 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.524347 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-b46ld" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.538253 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCo
unt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\
\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0
7b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: W0313 13:57:55.539247 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1f79182_c06d_47d7_bed8_109c0cc4784e.slice/crio-3f25ce8be07a0a1ad63e2498c31080406671dc931ee116f64d6ba7df745afbc7 WatchSource:0}: Error finding container 3f25ce8be07a0a1ad63e2498c31080406671dc931ee116f64d6ba7df745afbc7: Status 404 returned error can't find the container with id 3f25ce8be07a0a1ad63e2498c31080406671dc931ee116f64d6ba7df745afbc7 Mar 13 13:57:55 
crc kubenswrapper[4898]: I0313 13:57:55.554983 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.571510 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.576780 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.576846 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.576856 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.576873 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.577255 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:55Z","lastTransitionTime":"2026-03-13T13:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.585208 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.600984 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.615690 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.623392 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.623461 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.623497 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.623526 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:57:55 crc kubenswrapper[4898]: E0313 13:57:55.623626 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 13:57:55 crc kubenswrapper[4898]: E0313 13:57:55.623658 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 13:57:55 crc kubenswrapper[4898]: E0313 13:57:55.623672 4898 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:55 crc kubenswrapper[4898]: E0313 13:57:55.623712 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 13:57:55 crc kubenswrapper[4898]: E0313 13:57:55.623730 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 13:58:11.623710038 +0000 UTC m=+126.625298447 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:55 crc kubenswrapper[4898]: E0313 13:57:55.623735 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 13:57:55 crc kubenswrapper[4898]: E0313 13:57:55.623756 4898 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:55 crc kubenswrapper[4898]: E0313 13:57:55.623790 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 13:58:11.6237792 +0000 UTC m=+126.625367649 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:57:55 crc kubenswrapper[4898]: E0313 13:57:55.623793 4898 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 13:57:55 crc kubenswrapper[4898]: E0313 13:57:55.623864 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 13:58:11.623841111 +0000 UTC m=+126.625429360 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 13:57:55 crc kubenswrapper[4898]: E0313 13:57:55.623938 4898 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 13:57:55 crc kubenswrapper[4898]: E0313 13:57:55.623977 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-13 13:58:11.623968684 +0000 UTC m=+126.625556943 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.637473 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.652406 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.664777 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.678385 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.680057 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.680104 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.680118 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.680141 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.680156 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:55Z","lastTransitionTime":"2026-03-13T13:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.690759 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.704364 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.719922 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc
84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.734625 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.738972 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.738976 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:57:55 crc kubenswrapper[4898]: E0313 13:57:55.739080 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.739187 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:57:55 crc kubenswrapper[4898]: E0313 13:57:55.739351 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:57:55 crc kubenswrapper[4898]: E0313 13:57:55.739522 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.754939 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://
f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.770627 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j
5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.783407 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.783444 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.783455 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.783475 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.783491 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:55Z","lastTransitionTime":"2026-03-13T13:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.784609 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.798206 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.819077 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
26-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.835229 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.851073 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.869408 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.886747 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:55 crc 
kubenswrapper[4898]: I0313 13:57:55.886789 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.886799 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.886819 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.886830 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:55Z","lastTransitionTime":"2026-03-13T13:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.888668 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.903981 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, 
/tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\
\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.916191 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.926810 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.939465 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a
639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.958603 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:55Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.989472 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.989565 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.989582 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.989612 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:55 crc kubenswrapper[4898]: I0313 13:57:55.989625 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:55Z","lastTransitionTime":"2026-03-13T13:57:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.092263 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.092296 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.092304 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.092320 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.092330 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:56Z","lastTransitionTime":"2026-03-13T13:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.194564 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.194596 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.194605 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.194619 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.194629 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:56Z","lastTransitionTime":"2026-03-13T13:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.298016 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.298106 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.298118 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.298136 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.298146 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:56Z","lastTransitionTime":"2026-03-13T13:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.397622 4898 generic.go:334] "Generic (PLEG): container finished" podID="e527967a-003e-4dbe-aade-d9f882239cb0" containerID="54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5" exitCode=0 Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.397681 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" event={"ID":"e527967a-003e-4dbe-aade-d9f882239cb0","Type":"ContainerDied","Data":"54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5"} Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.400219 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.400246 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.400258 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.400275 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.400292 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:56Z","lastTransitionTime":"2026-03-13T13:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.401084 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-b46ld" event={"ID":"a1f79182-c06d-47d7-bed8-109c0cc4784e","Type":"ContainerStarted","Data":"cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725aa1711117b"} Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.401164 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-b46ld" event={"ID":"a1f79182-c06d-47d7-bed8-109c0cc4784e","Type":"ContainerStarted","Data":"3f25ce8be07a0a1ad63e2498c31080406671dc931ee116f64d6ba7df745afbc7"} Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.409632 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerStarted","Data":"119d07b211c82506e4dbdf9d44df46ea5c76a0b52e5a4c5effe24306c70dac7e"} Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.410029 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.410048 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.410064 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.438519 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.443593 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.443707 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.461088 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.488430 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.504724 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.504762 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.504773 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.504792 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.504805 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:56Z","lastTransitionTime":"2026-03-13T13:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.511094 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:
57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.541054 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.558979 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.575465 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.591971 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd291
38349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.615204 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.615685 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.615722 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.615730 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.615744 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.615753 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:56Z","lastTransitionTime":"2026-03-13T13:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.634726 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.649413 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.663731 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.679422 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.696431 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.712441 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.719152 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.719193 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.719205 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.719227 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.719242 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:56Z","lastTransitionTime":"2026-03-13T13:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.731059 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.742711 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.764877 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13
:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.779478 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.799691 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119d07b211c82506e4dbdf9d44df46ea5c76a0b52e5a4c5effe24306c70dac7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.813448 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.822096 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.822139 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.822152 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.822170 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.822185 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:56Z","lastTransitionTime":"2026-03-13T13:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.834146 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.853761 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.873729 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.893403 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.910552 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.923072 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.924840 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.924876 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.924888 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.924923 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.924936 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:56Z","lastTransitionTime":"2026-03-13T13:57:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:56 crc kubenswrapper[4898]: I0313 13:57:56.937578 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:56Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.028595 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.028671 4898 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.028687 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.028720 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.028737 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:57Z","lastTransitionTime":"2026-03-13T13:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.131771 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.131814 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.131827 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.131844 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.131856 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:57Z","lastTransitionTime":"2026-03-13T13:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.235160 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.235267 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.235289 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.235316 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.235335 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:57Z","lastTransitionTime":"2026-03-13T13:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.338418 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.338476 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.338492 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.338517 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.338533 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:57Z","lastTransitionTime":"2026-03-13T13:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.418980 4898 generic.go:334] "Generic (PLEG): container finished" podID="e527967a-003e-4dbe-aade-d9f882239cb0" containerID="dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd" exitCode=0 Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.419095 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" event={"ID":"e527967a-003e-4dbe-aade-d9f882239cb0","Type":"ContainerDied","Data":"dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd"} Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.441767 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.441824 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.441836 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.441852 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.441862 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:57Z","lastTransitionTime":"2026-03-13T13:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.448800 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:57Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.468785 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:57Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.488660 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:57Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.506299 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:57Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.529876 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:57Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.545232 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.545281 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.545299 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.545324 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.545344 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:57Z","lastTransitionTime":"2026-03-13T13:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.553450 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:
57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:57Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.581320 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119d07b211c82506e4dbdf9d44df46ea5c76a0b52e5a4c5effe24306c70dac7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:57Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.600313 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:57Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.621029 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:57:57Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.638267 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:57Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.650568 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.650863 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.651029 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.651153 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.651272 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:57Z","lastTransitionTime":"2026-03-13T13:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.661482 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:57Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.684871 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:57Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.702575 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:57Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.721220 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:57Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.739519 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.739564 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.739609 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:57:57 crc kubenswrapper[4898]: E0313 13:57:57.739698 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:57:57 crc kubenswrapper[4898]: E0313 13:57:57.739890 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:57:57 crc kubenswrapper[4898]: E0313 13:57:57.740002 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.755209 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.755264 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.755273 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.755295 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.755306 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:57Z","lastTransitionTime":"2026-03-13T13:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.858140 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.858601 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.858611 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.858628 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.858639 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:57Z","lastTransitionTime":"2026-03-13T13:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.972350 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.972395 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.972407 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.972426 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:57 crc kubenswrapper[4898]: I0313 13:57:57.972436 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:57Z","lastTransitionTime":"2026-03-13T13:57:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.075915 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.075966 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.075976 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.075994 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.076006 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:58Z","lastTransitionTime":"2026-03-13T13:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.178521 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.178558 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.178570 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.178588 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.178598 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:58Z","lastTransitionTime":"2026-03-13T13:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.282492 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.282544 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.282560 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.282581 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.282594 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:58Z","lastTransitionTime":"2026-03-13T13:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.384964 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.385008 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.385017 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.385032 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.385043 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:58Z","lastTransitionTime":"2026-03-13T13:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.424591 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" event={"ID":"e527967a-003e-4dbe-aade-d9f882239cb0","Type":"ContainerStarted","Data":"b6e0093b9e7670d289e34bc225cf1650906fdaf57e7d3c83ab8897fd0eed7204"} Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.438286 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:58Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.458118 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a673147
31ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:58Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.470415 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:58Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.482634 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:58Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.487754 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.487786 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.487797 4898 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.487816 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.487830 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:58Z","lastTransitionTime":"2026-03-13T13:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.498853 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae
547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:58Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.524060 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-nod
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ov
n-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119d07b211c82506e4dbdf9d44df46ea5c76a0b52e5a4c5effe24306c70dac7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d77
3257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"p
hase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:58Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.549738 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:58Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.561631 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:57:58Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.571759 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:58Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.584612 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a
639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:58Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.591156 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.591198 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.591212 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.591234 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.591247 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:58Z","lastTransitionTime":"2026-03-13T13:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.596414 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:58Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.609192 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:58Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.627045 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e0093b9e7670d289e34bc225cf1650906fdaf57e7d3c83ab8897fd0eed7204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8
e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:58Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.639455 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:58Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.693633 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.693683 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.693692 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.693713 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.693723 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:58Z","lastTransitionTime":"2026-03-13T13:57:58Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.796960 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.797004 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.797014 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.797029 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.797041 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:58Z","lastTransitionTime":"2026-03-13T13:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.900495 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.900572 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.900597 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.900627 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:58 crc kubenswrapper[4898]: I0313 13:57:58.900650 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:58Z","lastTransitionTime":"2026-03-13T13:57:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.003215 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.003253 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.003266 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.003283 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.003294 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:59Z","lastTransitionTime":"2026-03-13T13:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.107041 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.107155 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.107179 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.107209 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.107231 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:59Z","lastTransitionTime":"2026-03-13T13:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.210787 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.210842 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.210857 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.210883 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.210918 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:59Z","lastTransitionTime":"2026-03-13T13:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.313189 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.313233 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.313250 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.313273 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.313290 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:59Z","lastTransitionTime":"2026-03-13T13:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.416079 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.416122 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.416137 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.416160 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.416179 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:59Z","lastTransitionTime":"2026-03-13T13:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.430827 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qqqs5_e7d6afc0-d9b5-41b2-a55f-57621c300cbb/ovnkube-controller/0.log" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.434786 4898 generic.go:334] "Generic (PLEG): container finished" podID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerID="119d07b211c82506e4dbdf9d44df46ea5c76a0b52e5a4c5effe24306c70dac7e" exitCode=1 Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.434991 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerDied","Data":"119d07b211c82506e4dbdf9d44df46ea5c76a0b52e5a4c5effe24306c70dac7e"} Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.436236 4898 scope.go:117] "RemoveContainer" containerID="119d07b211c82506e4dbdf9d44df46ea5c76a0b52e5a4c5effe24306c70dac7e" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.453455 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:59Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.474273 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e0093b9e7670d289e34bc225cf1650906fdaf57e7d3c83ab8897fd0eed7204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8
e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:59Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.487695 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:59Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.503463 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:59Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.518979 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.519065 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.519078 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.519102 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.519121 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:59Z","lastTransitionTime":"2026-03-13T13:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.522858 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:59Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.539817 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:59Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.594055 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:59Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.613177 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:59Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.621546 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:59 crc 
kubenswrapper[4898]: I0313 13:57:59.621726 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.621862 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.622026 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.622150 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:59Z","lastTransitionTime":"2026-03-13T13:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.635757 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://119d07b211c82506e4dbdf9d44df46ea5c76a0b52e5a4c5effe24306c70dac7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://119d07b211c82506e4dbdf9d44df46ea5c76a0b52e5a4c5effe24306c70dac7e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13
:57:59Z\\\",\\\"message\\\":\\\"tes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 13:57:59.029739 6719 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 13:57:59.030788 6719 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 13:57:59.030840 6719 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 13:57:59.030855 6719 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 13:57:59.030975 6719 factory.go:656] Stopping watch factory\\\\nI0313 13:57:59.031099 6719 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 13:57:59.031118 6719 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 13:57:59.031129 6719 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 13:57:59.031126 6719 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 13:57:59.031137 6719 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 13:57:59.031251 6719 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 13:57:59.031254 6719 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"
/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o
://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:59Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.655628 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, 
/tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\
\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:59Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.671635 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:57:59Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.689113 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:59Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.702261 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a
639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:59Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.716959 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:57:59Z is after 2025-08-24T17:21:41Z" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.724776 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.724841 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.724853 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.724873 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.724886 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:59Z","lastTransitionTime":"2026-03-13T13:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.738997 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.739045 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.739047 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:57:59 crc kubenswrapper[4898]: E0313 13:57:59.739141 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:57:59 crc kubenswrapper[4898]: E0313 13:57:59.739333 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:57:59 crc kubenswrapper[4898]: E0313 13:57:59.739553 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.827984 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.828031 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.828043 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.828060 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.828074 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:59Z","lastTransitionTime":"2026-03-13T13:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.930605 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.930646 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.930655 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.930671 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:57:59 crc kubenswrapper[4898]: I0313 13:57:59.930681 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:57:59Z","lastTransitionTime":"2026-03-13T13:57:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.033605 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.033650 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.033661 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.033680 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.033691 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:00Z","lastTransitionTime":"2026-03-13T13:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.136554 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.136655 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.136673 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.136702 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.137085 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:00Z","lastTransitionTime":"2026-03-13T13:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.241195 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.241273 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.241296 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.241352 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.241373 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:00Z","lastTransitionTime":"2026-03-13T13:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.344657 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.344735 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.344762 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.344792 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.344813 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:00Z","lastTransitionTime":"2026-03-13T13:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.442048 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qqqs5_e7d6afc0-d9b5-41b2-a55f-57621c300cbb/ovnkube-controller/1.log" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.442778 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qqqs5_e7d6afc0-d9b5-41b2-a55f-57621c300cbb/ovnkube-controller/0.log" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.446637 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.446683 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.446695 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.446715 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.446729 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:00Z","lastTransitionTime":"2026-03-13T13:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.448067 4898 generic.go:334] "Generic (PLEG): container finished" podID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerID="14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f" exitCode=1 Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.448137 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerDied","Data":"14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f"} Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.448253 4898 scope.go:117] "RemoveContainer" containerID="119d07b211c82506e4dbdf9d44df46ea5c76a0b52e5a4c5effe24306c70dac7e" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.449560 4898 scope.go:117] "RemoveContainer" containerID="14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f" Mar 13 13:58:00 crc kubenswrapper[4898]: E0313 13:58:00.450028 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-qqqs5_openshift-ovn-kubernetes(e7d6afc0-d9b5-41b2-a55f-57621c300cbb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.487714 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:00Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.523262 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:00Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.550035 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.550091 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.550105 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.550130 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.550147 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:00Z","lastTransitionTime":"2026-03-13T13:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.557183 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\
\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:00Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.576536 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://119d07b211c82506e4dbdf9d44df46ea5c76a0b52e5a4c5effe24306c70dac7e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:57:59Z\\\",\\\"message\\\":\\\"tes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 13:57:59.029739 6719 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 13:57:59.030788 6719 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 13:57:59.030840 6719 handler.go:190] Sending *v1.Node event 
handler 2 for removal\\\\nI0313 13:57:59.030855 6719 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 13:57:59.030975 6719 factory.go:656] Stopping watch factory\\\\nI0313 13:57:59.031099 6719 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 13:57:59.031118 6719 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 13:57:59.031129 6719 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 13:57:59.031126 6719 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 13:57:59.031137 6719 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 13:57:59.031251 6719 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 13:57:59.031254 6719 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:00Z\\\",\\\"message\\\":\\\"ection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 13:58:00.362727 6919 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-image-registry/image-registry\\\\\\\"}\\\\nI0313 
13:58:00.362778 6919 services_controller.go:360] Finished syncing service image-registry on namespace openshift-image-registry for network=default : 2.065295ms\\\\nI0313 13:58:00.362859 6919 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-cluster-version/cluster-version-operator\\\\\\\"}\\\\nI0313 13:58:00.362895 6919 services_controller.go:360] Finished syncing service cluster-version-operator on namespace openshift-cluster-version for network=default : 2.427204ms\\\\nI0313 13:58:00.362944 6919 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0313 13:58:00.363141 6919 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0313 13:58:00.363206 6919 ovnkube.go:599] Stopped ovnkube\\\\nI0313 13:58:00.363241 6919 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 13:58:00.363329 6919 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"
mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountP
ath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:00Z is after 2025-08-24T17:21:41Z" Mar 13 
13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.591160 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:00Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.605484 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:00Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.621528 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:00Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.640079 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:00Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.653485 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.653565 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.653596 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.653626 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.653646 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:00Z","lastTransitionTime":"2026-03-13T13:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.656243 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:58:00Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.667357 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:00Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.681836 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a
639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:00Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.695423 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mo
untPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:00Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.709654 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e0093b9e7670d289e34bc225cf1650906fdaf57e7d3c83ab8897fd0eed7204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8
e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:00Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.723384 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:00Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.755604 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.755635 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.755646 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.755662 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.755671 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:00Z","lastTransitionTime":"2026-03-13T13:58:00Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.858662 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.858745 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.858772 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.858800 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.858821 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:00Z","lastTransitionTime":"2026-03-13T13:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.962190 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.962249 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.962265 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.962289 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:00 crc kubenswrapper[4898]: I0313 13:58:00.962308 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:00Z","lastTransitionTime":"2026-03-13T13:58:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.065530 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.065579 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.065593 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.065611 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.065622 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:01Z","lastTransitionTime":"2026-03-13T13:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.168817 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.168855 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.168866 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.168883 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.168915 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:01Z","lastTransitionTime":"2026-03-13T13:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.226262 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt"] Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.227127 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.230769 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.231801 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.249260 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e0093b9e7670d289e34bc225cf1650906fdaf57e7d3c83ab8897fd0eed7204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819
eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8
df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b218
3f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e651
84749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:01Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.262891 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:01Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.271518 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.271564 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.271589 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.271619 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.271638 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:01Z","lastTransitionTime":"2026-03-13T13:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.276782 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wh2lt\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:01Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.290740 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:01Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.296308 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-wh2lt\" (UID: \"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.296362 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-wh2lt\" (UID: \"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.296401 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3-env-overrides\") pod \"ovnkube-control-plane-749d76644c-wh2lt\" (UID: \"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.296435 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjttd\" (UniqueName: \"kubernetes.io/projected/33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3-kube-api-access-gjttd\") pod \"ovnkube-control-plane-749d76644c-wh2lt\" (UID: \"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.305492 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:01Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.330747 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
26-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:01Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.344610 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:01Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.365234 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:01Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.374677 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.374768 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.374784 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.374805 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.374823 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:01Z","lastTransitionTime":"2026-03-13T13:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.380510 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:
57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:01Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.397554 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-wh2lt\" (UID: \"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.397602 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-wh2lt\" (UID: \"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.397657 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3-env-overrides\") pod \"ovnkube-control-plane-749d76644c-wh2lt\" (UID: \"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.397691 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjttd\" (UniqueName: \"kubernetes.io/projected/33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3-kube-api-access-gjttd\") pod \"ovnkube-control-plane-749d76644c-wh2lt\" 
(UID: \"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.398300 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3-env-overrides\") pod \"ovnkube-control-plane-749d76644c-wh2lt\" (UID: \"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.398418 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-wh2lt\" (UID: \"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.404111 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://119d07b211c82506e4dbdf9d44df46ea5c76a0b52e5a4c5effe24306c70dac7e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:57:59Z\\\",\\\"message\\\":\\\"tes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI
0313 13:57:59.029739 6719 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 13:57:59.030788 6719 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 13:57:59.030840 6719 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 13:57:59.030855 6719 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 13:57:59.030975 6719 factory.go:656] Stopping watch factory\\\\nI0313 13:57:59.031099 6719 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 13:57:59.031118 6719 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 13:57:59.031129 6719 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 13:57:59.031126 6719 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 13:57:59.031137 6719 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 13:57:59.031251 6719 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 13:57:59.031254 6719 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:00Z\\\",\\\"message\\\":\\\"ection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nI0313 13:58:00.362727 6919 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-image-registry/image-registry\\\\\\\"}\\\\nI0313 13:58:00.362778 6919 services_controller.go:360] Finished syncing service image-registry on namespace openshift-image-registry for network=default : 2.065295ms\\\\nI0313 13:58:00.362859 6919 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-cluster-version/cluster-version-operator\\\\\\\"}\\\\nI0313 13:58:00.362895 6919 services_controller.go:360] Finished syncing service cluster-version-operator on namespace openshift-cluster-version for network=default : 2.427204ms\\\\nI0313 13:58:00.362944 6919 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0313 13:58:00.363141 6919 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0313 13:58:00.363206 6919 ovnkube.go:599] Stopped ovnkube\\\\nI0313 13:58:00.363241 6919 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 13:58:00.363329 6919 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779
990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:01Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.405159 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-wh2lt\" (UID: \"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.413748 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjttd\" (UniqueName: \"kubernetes.io/projected/33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3-kube-api-access-gjttd\") pod \"ovnkube-control-plane-749d76644c-wh2lt\" (UID: \"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.420869 4898 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:01Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.434456 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:01Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.449254 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd291
38349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:01Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.453478 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qqqs5_e7d6afc0-d9b5-41b2-a55f-57621c300cbb/ovnkube-controller/1.log" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.466972 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:01Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.477925 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.477992 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.478002 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.478016 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.478027 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:01Z","lastTransitionTime":"2026-03-13T13:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.480242 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:01Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.545836 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" Mar 13 13:58:01 crc kubenswrapper[4898]: W0313 13:58:01.570452 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33fb2f7c_9abf_45e0_af55_e8f7c09c2dc3.slice/crio-54ed36a1cbac66f502e27d143ae1755d96901a5e3094426f4f3095a6102fc60d WatchSource:0}: Error finding container 54ed36a1cbac66f502e27d143ae1755d96901a5e3094426f4f3095a6102fc60d: Status 404 returned error can't find the container with id 54ed36a1cbac66f502e27d143ae1755d96901a5e3094426f4f3095a6102fc60d Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.584240 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.584299 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.584663 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.584719 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.584746 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:01Z","lastTransitionTime":"2026-03-13T13:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.688341 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.688391 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.688405 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.688432 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.688449 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:01Z","lastTransitionTime":"2026-03-13T13:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.739595 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.739609 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:01 crc kubenswrapper[4898]: E0313 13:58:01.739795 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.739745 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:01 crc kubenswrapper[4898]: E0313 13:58:01.739947 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:01 crc kubenswrapper[4898]: E0313 13:58:01.740145 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.791290 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.791344 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.791363 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.791386 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.791403 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:01Z","lastTransitionTime":"2026-03-13T13:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.895130 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.895180 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.895196 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.895219 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.895236 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:01Z","lastTransitionTime":"2026-03-13T13:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.993160 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-fwrwc"] Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.993706 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:01 crc kubenswrapper[4898]: E0313 13:58:01.993767 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.998801 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.998831 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.998841 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.998878 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:01 crc kubenswrapper[4898]: I0313 13:58:01.998911 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:01Z","lastTransitionTime":"2026-03-13T13:58:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.004463 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9k7q\" (UniqueName: \"kubernetes.io/projected/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-kube-api-access-l9k7q\") pod \"network-metrics-daemon-fwrwc\" (UID: \"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\") " pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.004508 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs\") pod \"network-metrics-daemon-fwrwc\" (UID: \"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\") " pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.013825 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.028350 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.041187 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.054365 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a
639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.071433 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wh2lt\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.094683 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fwrwc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fwrwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc 
kubenswrapper[4898]: I0313 13:58:02.101354 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.101491 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.101741 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.102014 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.102211 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:02Z","lastTransitionTime":"2026-03-13T13:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.105875 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9k7q\" (UniqueName: \"kubernetes.io/projected/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-kube-api-access-l9k7q\") pod \"network-metrics-daemon-fwrwc\" (UID: \"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\") " pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.105934 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs\") pod \"network-metrics-daemon-fwrwc\" (UID: \"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\") " pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:02 crc kubenswrapper[4898]: E0313 13:58:02.106096 4898 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 13:58:02 crc kubenswrapper[4898]: E0313 13:58:02.106156 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs podName:9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869 nodeName:}" failed. No retries permitted until 2026-03-13 13:58:02.606141123 +0000 UTC m=+117.607729362 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs") pod "network-metrics-daemon-fwrwc" (UID: "9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.113645 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.128454 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9k7q\" (UniqueName: \"kubernetes.io/projected/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-kube-api-access-l9k7q\") pod \"network-metrics-daemon-fwrwc\" (UID: \"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\") " pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 
13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.129740 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e0093b9e7670d289e34bc225cf1650906fdaf57e7d3c83ab8897fd0eed7204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\
\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a
8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\
":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"po
dIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.140919 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235
da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.165282 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.180115 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.193984 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.205172 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:02 crc 
kubenswrapper[4898]: I0313 13:58:02.205242 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.205272 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.205673 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.205873 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:02Z","lastTransitionTime":"2026-03-13T13:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.222398 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://119d07b211c82506e4dbdf9d44df46ea5c76a0b52e5a4c5effe24306c70dac7e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:57:59Z\\\",\\\"message\\\":\\\"tes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI
0313 13:57:59.029739 6719 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 13:57:59.030788 6719 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 13:57:59.030840 6719 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 13:57:59.030855 6719 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 13:57:59.030975 6719 factory.go:656] Stopping watch factory\\\\nI0313 13:57:59.031099 6719 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 13:57:59.031118 6719 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 13:57:59.031129 6719 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 13:57:59.031126 6719 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 13:57:59.031137 6719 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 13:57:59.031251 6719 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 13:57:59.031254 6719 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:00Z\\\",\\\"message\\\":\\\"ection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nI0313 13:58:00.362727 6919 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-image-registry/image-registry\\\\\\\"}\\\\nI0313 13:58:00.362778 6919 services_controller.go:360] Finished syncing service image-registry on namespace openshift-image-registry for network=default : 2.065295ms\\\\nI0313 13:58:00.362859 6919 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-cluster-version/cluster-version-operator\\\\\\\"}\\\\nI0313 13:58:00.362895 6919 services_controller.go:360] Finished syncing service cluster-version-operator on namespace openshift-cluster-version for network=default : 2.427204ms\\\\nI0313 13:58:00.362944 6919 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0313 13:58:00.363141 6919 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0313 13:58:00.363206 6919 ovnkube.go:599] Stopped ovnkube\\\\nI0313 13:58:00.363241 6919 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 13:58:00.363329 6919 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779
990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.240088 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, 
/tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\
\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.254119 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.269478 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.309486 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.309724 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.309845 4898 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.309975 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.310168 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:02Z","lastTransitionTime":"2026-03-13T13:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.414003 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.414070 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.414094 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.414125 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.414148 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:02Z","lastTransitionTime":"2026-03-13T13:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.465580 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" event={"ID":"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3","Type":"ContainerStarted","Data":"421d50fcd0a69c2b53067ae09bbea100b532174c1d76f641c79c58a5fa3f9a3f"} Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.465953 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" event={"ID":"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3","Type":"ContainerStarted","Data":"73b8846971b589f5619b239746bd2f5953d12af7f3fa6543042da89561930dae"} Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.466044 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" event={"ID":"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3","Type":"ContainerStarted","Data":"54ed36a1cbac66f502e27d143ae1755d96901a5e3094426f4f3095a6102fc60d"} Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.486286 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.506609 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.518036 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:02 crc 
kubenswrapper[4898]: I0313 13:58:02.518112 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.518137 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.518166 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.518188 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:02Z","lastTransitionTime":"2026-03-13T13:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.533958 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://119d07b211c82506e4dbdf9d44df46ea5c76a0b52e5a4c5effe24306c70dac7e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:57:59Z\\\",\\\"message\\\":\\\"tes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI
0313 13:57:59.029739 6719 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 13:57:59.030788 6719 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 13:57:59.030840 6719 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 13:57:59.030855 6719 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 13:57:59.030975 6719 factory.go:656] Stopping watch factory\\\\nI0313 13:57:59.031099 6719 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 13:57:59.031118 6719 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 13:57:59.031129 6719 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 13:57:59.031126 6719 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 13:57:59.031137 6719 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 13:57:59.031251 6719 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 13:57:59.031254 6719 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:00Z\\\",\\\"message\\\":\\\"ection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: 
UUIDName:}]\\\\nI0313 13:58:00.362727 6919 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-image-registry/image-registry\\\\\\\"}\\\\nI0313 13:58:00.362778 6919 services_controller.go:360] Finished syncing service image-registry on namespace openshift-image-registry for network=default : 2.065295ms\\\\nI0313 13:58:00.362859 6919 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-cluster-version/cluster-version-operator\\\\\\\"}\\\\nI0313 13:58:00.362895 6919 services_controller.go:360] Finished syncing service cluster-version-operator on namespace openshift-cluster-version for network=default : 2.427204ms\\\\nI0313 13:58:00.362944 6919 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0313 13:58:00.363141 6919 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0313 13:58:00.363206 6919 ovnkube.go:599] Stopped ovnkube\\\\nI0313 13:58:00.363241 6919 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 13:58:00.363329 6919 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779
990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.549477 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, 
/tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\
\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.566718 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.578630 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd291
38349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.591220 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.602712 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.611621 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs\") pod \"network-metrics-daemon-fwrwc\" (UID: \"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\") " pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:02 crc kubenswrapper[4898]: E0313 13:58:02.611877 4898 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 13:58:02 crc kubenswrapper[4898]: E0313 13:58:02.612017 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs podName:9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869 nodeName:}" failed. No retries permitted until 2026-03-13 13:58:03.611999929 +0000 UTC m=+118.613588168 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs") pod "network-metrics-daemon-fwrwc" (UID: "9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.616103 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"st
artedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.620087 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.620149 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.620168 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.620193 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.620212 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:02Z","lastTransitionTime":"2026-03-13T13:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.625970 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},
{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.637235 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73b8846971b589f5619b239746bd2f5953d12af7f3fa6543042da89561930dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421d50fcd0a69c2b53067ae09bbea100b5321
74c1d76f641c79c58a5fa3f9a3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wh2lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.651813 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fwrwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fwrwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc 
kubenswrapper[4898]: I0313 13:58:02.664695 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.678883 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e0093b9e7670d289e34bc225cf1650906fdaf57e7d3c83ab8897fd0eed7204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8
e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.706565 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0ee
b157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.723169 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.723211 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.723222 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.723240 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.723257 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:02Z","lastTransitionTime":"2026-03-13T13:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.727299 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:02Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.825511 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.825570 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.825590 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.825616 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.825635 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:02Z","lastTransitionTime":"2026-03-13T13:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.928692 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.928755 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.928774 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.928797 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:02 crc kubenswrapper[4898]: I0313 13:58:02.928811 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:02Z","lastTransitionTime":"2026-03-13T13:58:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.031876 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.031979 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.032004 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.032035 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.032058 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:03Z","lastTransitionTime":"2026-03-13T13:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.134452 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.134516 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.134533 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.134556 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.134575 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:03Z","lastTransitionTime":"2026-03-13T13:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.237660 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.237718 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.237730 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.237750 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.237763 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:03Z","lastTransitionTime":"2026-03-13T13:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.340143 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.340183 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.340192 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.340205 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.340214 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:03Z","lastTransitionTime":"2026-03-13T13:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.443072 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.443136 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.443147 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.443164 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.443176 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:03Z","lastTransitionTime":"2026-03-13T13:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.545791 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.545823 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.545832 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.545847 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.545856 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:03Z","lastTransitionTime":"2026-03-13T13:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.630421 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs\") pod \"network-metrics-daemon-fwrwc\" (UID: \"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\") " pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:03 crc kubenswrapper[4898]: E0313 13:58:03.630697 4898 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 13:58:03 crc kubenswrapper[4898]: E0313 13:58:03.630822 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs podName:9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869 nodeName:}" failed. No retries permitted until 2026-03-13 13:58:05.630756901 +0000 UTC m=+120.632345170 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs") pod "network-metrics-daemon-fwrwc" (UID: "9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.649176 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.649267 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.649290 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.649324 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.649348 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:03Z","lastTransitionTime":"2026-03-13T13:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.738837 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:03 crc kubenswrapper[4898]: E0313 13:58:03.739116 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.739176 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.739263 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.739262 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:03 crc kubenswrapper[4898]: E0313 13:58:03.741203 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:03 crc kubenswrapper[4898]: E0313 13:58:03.741290 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:03 crc kubenswrapper[4898]: E0313 13:58:03.741787 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.753277 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.753327 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.753344 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.753368 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.753385 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:03Z","lastTransitionTime":"2026-03-13T13:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.754286 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.856651 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.857026 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.857142 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.857229 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.857300 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:03Z","lastTransitionTime":"2026-03-13T13:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.960737 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.961098 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.961231 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.961404 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:03 crc kubenswrapper[4898]: I0313 13:58:03.961602 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:03Z","lastTransitionTime":"2026-03-13T13:58:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.064889 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.065218 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.065306 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.065398 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.065476 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:04Z","lastTransitionTime":"2026-03-13T13:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.169099 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.169165 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.169179 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.169206 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.169227 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:04Z","lastTransitionTime":"2026-03-13T13:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.272130 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.272167 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.272183 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.272204 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.272219 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:04Z","lastTransitionTime":"2026-03-13T13:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.376725 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.376757 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.376765 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.376781 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.376791 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:04Z","lastTransitionTime":"2026-03-13T13:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.479684 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.480040 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.480207 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.480302 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.480381 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:04Z","lastTransitionTime":"2026-03-13T13:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.584214 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.584552 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.584573 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.584599 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.584617 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:04Z","lastTransitionTime":"2026-03-13T13:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.646002 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.646068 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.646085 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.646110 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.646127 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:04Z","lastTransitionTime":"2026-03-13T13:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:04 crc kubenswrapper[4898]: E0313 13:58:04.667867 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:04Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.673656 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.673726 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.673739 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.673772 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.673827 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:04Z","lastTransitionTime":"2026-03-13T13:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:04 crc kubenswrapper[4898]: E0313 13:58:04.694138 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:04Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.699910 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.699972 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.699984 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.700005 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.700018 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:04Z","lastTransitionTime":"2026-03-13T13:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:04 crc kubenswrapper[4898]: E0313 13:58:04.720979 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:04Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.727293 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.727343 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.727357 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.727377 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.727394 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:04Z","lastTransitionTime":"2026-03-13T13:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:04 crc kubenswrapper[4898]: E0313 13:58:04.745063 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:04Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.749914 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.749969 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.749984 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.750007 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.750020 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:04Z","lastTransitionTime":"2026-03-13T13:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:04 crc kubenswrapper[4898]: E0313 13:58:04.765793 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:04Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:04 crc kubenswrapper[4898]: E0313 13:58:04.766053 4898 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.768675 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.768741 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.768757 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.768784 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.768804 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:04Z","lastTransitionTime":"2026-03-13T13:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.871555 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.871621 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.871633 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.871653 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.871666 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:04Z","lastTransitionTime":"2026-03-13T13:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.977989 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.978055 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.978085 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.978115 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:04 crc kubenswrapper[4898]: I0313 13:58:04.978136 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:04Z","lastTransitionTime":"2026-03-13T13:58:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.081375 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.081477 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.081501 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.081534 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.081559 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:05Z","lastTransitionTime":"2026-03-13T13:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.184330 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.184420 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.184439 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.184465 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.184484 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:05Z","lastTransitionTime":"2026-03-13T13:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.288328 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.288390 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.288403 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.288422 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.288435 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:05Z","lastTransitionTime":"2026-03-13T13:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.391879 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.391968 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.391989 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.392016 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.392034 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:05Z","lastTransitionTime":"2026-03-13T13:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.494429 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.494456 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.494466 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.494483 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.494496 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:05Z","lastTransitionTime":"2026-03-13T13:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.597554 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.597618 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.597637 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.597665 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.597684 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:05Z","lastTransitionTime":"2026-03-13T13:58:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.654474 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs\") pod \"network-metrics-daemon-fwrwc\" (UID: \"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\") " pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:05 crc kubenswrapper[4898]: E0313 13:58:05.654675 4898 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 13:58:05 crc kubenswrapper[4898]: E0313 13:58:05.654770 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs podName:9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869 nodeName:}" failed. No retries permitted until 2026-03-13 13:58:09.654752497 +0000 UTC m=+124.656340746 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs") pod "network-metrics-daemon-fwrwc" (UID: "9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 13:58:05 crc kubenswrapper[4898]: E0313 13:58:05.698129 4898 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.739332 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.739449 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.739800 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:05 crc kubenswrapper[4898]: E0313 13:58:05.739952 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.740083 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:05 crc kubenswrapper[4898]: E0313 13:58:05.740261 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:05 crc kubenswrapper[4898]: E0313 13:58:05.740462 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:05 crc kubenswrapper[4898]: E0313 13:58:05.740984 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.759857 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:05Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.775887 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:58:05Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.791613 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:05Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.805805 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a
639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:05Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.827687 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mo
untPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:05Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.853608 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e0093b9e7670d289e34bc225cf1650906fdaf57e7d3c83ab8897fd0eed7204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8
e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:05Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:05 crc kubenswrapper[4898]: E0313 13:58:05.867793 4898 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.874451 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63
f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:05Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.895565 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73b8846971b589f5619b239746bd2f5953d12af7f3fa6543042da89561930dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421d50fcd0a69c2b53067ae09bbea100b5321
74c1d76f641c79c58a5fa3f9a3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wh2lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:05Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.917719 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fwrwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fwrwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:05Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:05 crc 
kubenswrapper[4898]: I0313 13:58:05.943771 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:05Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.957210 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3a9fb7c-9705-43b2-a2d8-663d19d50cda\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be461eabdbb6c2414f8a9805c9537cea8595a509f241a72015802159baaa9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://612d879ff268a2130e16ebb42a2e402a0e2d7ef04248322761b873bc6fe026c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f7a5f6c4c1f3ccd5b10a77ca72055844d63966f39e14fbba46d8b6074f4d7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:05Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.973648 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:05Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:05 crc kubenswrapper[4898]: I0313 13:58:05.988169 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:05Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:06 crc kubenswrapper[4898]: I0313 13:58:06.003497 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:06Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:06 crc kubenswrapper[4898]: I0313 13:58:06.018160 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:06Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:06 crc kubenswrapper[4898]: I0313 13:58:06.033385 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:06Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:06 crc kubenswrapper[4898]: I0313 13:58:06.054256 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://119d07b211c82506e4dbdf9d44df46ea5c76a0b52e5a4c5effe24306c70dac7e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:57:59Z\\\",\\\"message\\\":\\\"tes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 13:57:59.029739 6719 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 13:57:59.030788 6719 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 13:57:59.030840 6719 handler.go:190] Sending *v1.Node event 
handler 2 for removal\\\\nI0313 13:57:59.030855 6719 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 13:57:59.030975 6719 factory.go:656] Stopping watch factory\\\\nI0313 13:57:59.031099 6719 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 13:57:59.031118 6719 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 13:57:59.031129 6719 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 13:57:59.031126 6719 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 13:57:59.031137 6719 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 13:57:59.031251 6719 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 13:57:59.031254 6719 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:00Z\\\",\\\"message\\\":\\\"ection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 13:58:00.362727 6919 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-image-registry/image-registry\\\\\\\"}\\\\nI0313 
13:58:00.362778 6919 services_controller.go:360] Finished syncing service image-registry on namespace openshift-image-registry for network=default : 2.065295ms\\\\nI0313 13:58:00.362859 6919 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-cluster-version/cluster-version-operator\\\\\\\"}\\\\nI0313 13:58:00.362895 6919 services_controller.go:360] Finished syncing service cluster-version-operator on namespace openshift-cluster-version for network=default : 2.427204ms\\\\nI0313 13:58:00.362944 6919 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0313 13:58:00.363141 6919 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0313 13:58:00.363206 6919 ovnkube.go:599] Stopped ovnkube\\\\nI0313 13:58:00.363241 6919 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 13:58:00.363329 6919 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"
mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountP
ath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:06Z is after 2025-08-24T17:21:41Z" Mar 13 
13:58:06 crc kubenswrapper[4898]: I0313 13:58:06.401848 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 13:58:06 crc kubenswrapper[4898]: I0313 13:58:06.411200 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":
\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:06Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:06 crc kubenswrapper[4898]: I0313 13:58:06.420202 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73b8846971b589f5619b239746bd2f5953d12af7f3fa6543042da89561930dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421d50fcd0a69c2b53067ae09bbea100b5321
74c1d76f641c79c58a5fa3f9a3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wh2lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:06Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:06 crc kubenswrapper[4898]: I0313 13:58:06.429028 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fwrwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fwrwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:06Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:06 crc 
kubenswrapper[4898]: I0313 13:58:06.439543 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:06Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:06 crc kubenswrapper[4898]: I0313 13:58:06.452623 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e0093b9e7670d289e34bc225cf1650906fdaf57e7d3c83ab8897fd0eed7204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8
e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:06Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:06 crc kubenswrapper[4898]: I0313 13:58:06.484298 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0ee
b157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:06Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:06 crc kubenswrapper[4898]: I0313 13:58:06.496724 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3a9fb7c-9705-43b2-a2d8-663d19d50cda\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be461eabdbb6c2414f8a9805c9537cea8595a509f241a72015802159baaa9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://612d879ff268a2130e16ebb42a2e402a0e2d7ef04248322761b873bc6fe026c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f7a5f6c4c1f3ccd5b10a77ca72055844d63966f39e14fbba46d8b6074f4d7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:06Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:06 crc kubenswrapper[4898]: I0313 13:58:06.512620 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:06Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:06 crc kubenswrapper[4898]: I0313 13:58:06.523111 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:06Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:06 crc kubenswrapper[4898]: I0313 13:58:06.536926 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:06Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:06 crc kubenswrapper[4898]: I0313 13:58:06.558093 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://119d07b211c82506e4dbdf9d44df46ea5c76a0b52e5a4c5effe24306c70dac7e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:57:59Z\\\",\\\"message\\\":\\\"tes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0313 13:57:59.029739 6719 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 13:57:59.030788 6719 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 13:57:59.030840 6719 handler.go:190] Sending *v1.Node event 
handler 2 for removal\\\\nI0313 13:57:59.030855 6719 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 13:57:59.030975 6719 factory.go:656] Stopping watch factory\\\\nI0313 13:57:59.031099 6719 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 13:57:59.031118 6719 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 13:57:59.031129 6719 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 13:57:59.031126 6719 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0313 13:57:59.031137 6719 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 13:57:59.031251 6719 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0313 13:57:59.031254 6719 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:00Z\\\",\\\"message\\\":\\\"ection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 13:58:00.362727 6919 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-image-registry/image-registry\\\\\\\"}\\\\nI0313 
13:58:00.362778 6919 services_controller.go:360] Finished syncing service image-registry on namespace openshift-image-registry for network=default : 2.065295ms\\\\nI0313 13:58:00.362859 6919 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-cluster-version/cluster-version-operator\\\\\\\"}\\\\nI0313 13:58:00.362895 6919 services_controller.go:360] Finished syncing service cluster-version-operator on namespace openshift-cluster-version for network=default : 2.427204ms\\\\nI0313 13:58:00.362944 6919 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0313 13:58:00.363141 6919 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0313 13:58:00.363206 6919 ovnkube.go:599] Stopped ovnkube\\\\nI0313 13:58:00.363241 6919 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 13:58:00.363329 6919 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"
mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountP
ath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:06Z is after 2025-08-24T17:21:41Z" Mar 13 
13:58:06 crc kubenswrapper[4898]: I0313 13:58:06.571388 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4
8e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastStat
e\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:06Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:06 crc kubenswrapper[4898]: I0313 13:58:06.584072 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:06Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:06 crc kubenswrapper[4898]: I0313 13:58:06.595087 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd291
38349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:06Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:06 crc kubenswrapper[4898]: I0313 13:58:06.606911 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:06Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:06 crc kubenswrapper[4898]: I0313 13:58:06.618441 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:58:06Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:06 crc kubenswrapper[4898]: I0313 13:58:06.629252 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:06Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:07 crc kubenswrapper[4898]: I0313 13:58:07.739098 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:07 crc kubenswrapper[4898]: I0313 13:58:07.739154 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:07 crc kubenswrapper[4898]: I0313 13:58:07.739209 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:07 crc kubenswrapper[4898]: E0313 13:58:07.739375 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:07 crc kubenswrapper[4898]: I0313 13:58:07.739417 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:07 crc kubenswrapper[4898]: E0313 13:58:07.739546 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:07 crc kubenswrapper[4898]: E0313 13:58:07.739637 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:07 crc kubenswrapper[4898]: E0313 13:58:07.739961 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:09 crc kubenswrapper[4898]: I0313 13:58:09.703152 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs\") pod \"network-metrics-daemon-fwrwc\" (UID: \"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\") " pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:09 crc kubenswrapper[4898]: E0313 13:58:09.703391 4898 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 13:58:09 crc kubenswrapper[4898]: E0313 13:58:09.703495 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs podName:9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869 nodeName:}" failed. No retries permitted until 2026-03-13 13:58:17.703469547 +0000 UTC m=+132.705057826 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs") pod "network-metrics-daemon-fwrwc" (UID: "9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 13:58:09 crc kubenswrapper[4898]: I0313 13:58:09.739092 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:09 crc kubenswrapper[4898]: I0313 13:58:09.739167 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:09 crc kubenswrapper[4898]: I0313 13:58:09.739175 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:09 crc kubenswrapper[4898]: I0313 13:58:09.739123 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:09 crc kubenswrapper[4898]: E0313 13:58:09.739338 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:09 crc kubenswrapper[4898]: E0313 13:58:09.739449 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:09 crc kubenswrapper[4898]: E0313 13:58:09.739736 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:09 crc kubenswrapper[4898]: E0313 13:58:09.739837 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:10 crc kubenswrapper[4898]: E0313 13:58:10.869944 4898 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 13:58:11 crc kubenswrapper[4898]: I0313 13:58:11.622638 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:58:11 crc kubenswrapper[4898]: E0313 13:58:11.622952 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:58:43.622872765 +0000 UTC m=+158.624461044 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:58:11 crc kubenswrapper[4898]: I0313 13:58:11.723695 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:11 crc kubenswrapper[4898]: I0313 13:58:11.723768 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:11 crc kubenswrapper[4898]: I0313 13:58:11.723813 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:11 crc kubenswrapper[4898]: I0313 13:58:11.723855 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:11 crc kubenswrapper[4898]: E0313 13:58:11.723984 4898 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 13:58:11 crc kubenswrapper[4898]: E0313 13:58:11.724082 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 13:58:43.724062076 +0000 UTC m=+158.725650325 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 13:58:11 crc kubenswrapper[4898]: E0313 13:58:11.723983 4898 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 13:58:11 crc kubenswrapper[4898]: E0313 13:58:11.724119 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 13:58:11 crc kubenswrapper[4898]: E0313 13:58:11.724157 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 13:58:11 crc kubenswrapper[4898]: E0313 13:58:11.724176 4898 projected.go:194] 
Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:58:11 crc kubenswrapper[4898]: E0313 13:58:11.724130 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 13:58:43.724122128 +0000 UTC m=+158.725710377 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 13:58:11 crc kubenswrapper[4898]: E0313 13:58:11.723997 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 13:58:11 crc kubenswrapper[4898]: E0313 13:58:11.724293 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 13:58:43.724270221 +0000 UTC m=+158.725858490 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:58:11 crc kubenswrapper[4898]: E0313 13:58:11.724320 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 13:58:11 crc kubenswrapper[4898]: E0313 13:58:11.724340 4898 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:58:11 crc kubenswrapper[4898]: E0313 13:58:11.724395 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 13:58:43.724381314 +0000 UTC m=+158.725969653 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:58:11 crc kubenswrapper[4898]: I0313 13:58:11.739575 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:11 crc kubenswrapper[4898]: I0313 13:58:11.739614 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:11 crc kubenswrapper[4898]: I0313 13:58:11.739575 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:11 crc kubenswrapper[4898]: E0313 13:58:11.739739 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:11 crc kubenswrapper[4898]: I0313 13:58:11.739759 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:11 crc kubenswrapper[4898]: E0313 13:58:11.739945 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:11 crc kubenswrapper[4898]: E0313 13:58:11.740074 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:11 crc kubenswrapper[4898]: E0313 13:58:11.740140 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:13 crc kubenswrapper[4898]: I0313 13:58:13.739224 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:13 crc kubenswrapper[4898]: I0313 13:58:13.739225 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:13 crc kubenswrapper[4898]: E0313 13:58:13.740222 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:13 crc kubenswrapper[4898]: I0313 13:58:13.739518 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:13 crc kubenswrapper[4898]: I0313 13:58:13.739298 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:13 crc kubenswrapper[4898]: E0313 13:58:13.740370 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:13 crc kubenswrapper[4898]: E0313 13:58:13.740306 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:13 crc kubenswrapper[4898]: E0313 13:58:13.740561 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:13 crc kubenswrapper[4898]: I0313 13:58:13.742708 4898 scope.go:117] "RemoveContainer" containerID="14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f" Mar 13 13:58:13 crc kubenswrapper[4898]: I0313 13:58:13.777219 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kuber
netes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15
7541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441
ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fal
se,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:13Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:13 crc kubenswrapper[4898]: I0313 13:58:13.795783 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3a9fb7c-9705-43b2-a2d8-663d19d50cda\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be461eabdbb6c2414f8a9805c9537cea8595a509f241a72015802159baaa9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://612d879ff268a2130e16ebb42a2e402a0e2d7ef04248322761b873bc6fe026c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f7a5f6c4c1f3ccd5b10a77ca72055844d63966f39e14fbba46d8b6074f4d7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:13Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:13 crc kubenswrapper[4898]: I0313 13:58:13.817400 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:13Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:13 crc kubenswrapper[4898]: I0313 13:58:13.831424 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:13Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:13 crc kubenswrapper[4898]: I0313 13:58:13.846943 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:13Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:13 crc kubenswrapper[4898]: I0313 13:58:13.881343 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:00Z\\\",\\\"message\\\":\\\"ection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 13:58:00.362727 6919 
loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-image-registry/image-registry\\\\\\\"}\\\\nI0313 13:58:00.362778 6919 services_controller.go:360] Finished syncing service image-registry on namespace openshift-image-registry for network=default : 2.065295ms\\\\nI0313 13:58:00.362859 6919 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-cluster-version/cluster-version-operator\\\\\\\"}\\\\nI0313 13:58:00.362895 6919 services_controller.go:360] Finished syncing service cluster-version-operator on namespace openshift-cluster-version for network=default : 2.427204ms\\\\nI0313 13:58:00.362944 6919 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0313 13:58:00.363141 6919 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0313 13:58:00.363206 6919 ovnkube.go:599] Stopped ovnkube\\\\nI0313 13:58:00.363241 6919 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 13:58:00.363329 6919 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qqqs5_openshift-ovn-kubernetes(e7d6afc0-d9b5-41b2-a55f-57621c300cbb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c
0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:13Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:13 crc kubenswrapper[4898]: I0313 13:58:13.904981 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\"
,\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:13Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:13 crc kubenswrapper[4898]: I0313 13:58:13.919605 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:13Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:13 crc kubenswrapper[4898]: I0313 13:58:13.933709 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd291
38349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:13Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:13 crc kubenswrapper[4898]: I0313 13:58:13.951698 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:13Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:13 crc kubenswrapper[4898]: I0313 13:58:13.970189 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:58:13Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:13 crc kubenswrapper[4898]: I0313 13:58:13.985634 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:13Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:13 crc kubenswrapper[4898]: I0313 13:58:13.997511 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725
aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:13Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.008640 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73b8846971b589f5619b239746bd2f5953d12af7f3fa6543042da89561930dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421d50fcd0a69c2b53067ae09bbea100b5321
74c1d76f641c79c58a5fa3f9a3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wh2lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.019378 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fwrwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fwrwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc 
kubenswrapper[4898]: I0313 13:58:14.032006 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.045828 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e0093b9e7670d289e34bc225cf1650906fdaf57e7d3c83ab8897fd0eed7204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8
e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.515748 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qqqs5_e7d6afc0-d9b5-41b2-a55f-57621c300cbb/ovnkube-controller/1.log" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.518973 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerStarted","Data":"c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc"} Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.519507 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.537588 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.552418 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.565243 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.577456 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a
639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.592482 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mo
untPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.612026 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e0093b9e7670d289e34bc225cf1650906fdaf57e7d3c83ab8897fd0eed7204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8
e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.625351 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.640388 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73b8846971b589f5619b239746bd2f5953d12af7f3fa6543042da89561930dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421d50fcd0a69c2b53067ae09bbea100b5321
74c1d76f641c79c58a5fa3f9a3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wh2lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.655560 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fwrwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fwrwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc 
kubenswrapper[4898]: I0313 13:58:14.681297 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.696083 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3a9fb7c-9705-43b2-a2d8-663d19d50cda\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be461eabdbb6c2414f8a9805c9537cea8595a509f241a72015802159baaa9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://612d879ff268a2130e16ebb42a2e402a0e2d7ef04248322761b873bc6fe026c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f7a5f6c4c1f3ccd5b10a77ca72055844d63966f39e14fbba46d8b6074f4d7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.709542 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.726944 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\
\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-c
rc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.741792 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.760128 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.782193 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.800889 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:00Z\\\",\\\"message\\\":\\\"ection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 13:58:00.362727 6919 
loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-image-registry/image-registry\\\\\\\"}\\\\nI0313 13:58:00.362778 6919 services_controller.go:360] Finished syncing service image-registry on namespace openshift-image-registry for network=default : 2.065295ms\\\\nI0313 13:58:00.362859 6919 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-cluster-version/cluster-version-operator\\\\\\\"}\\\\nI0313 13:58:00.362895 6919 services_controller.go:360] Finished syncing service cluster-version-operator on namespace openshift-cluster-version for network=default : 2.427204ms\\\\nI0313 13:58:00.362944 6919 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0313 13:58:00.363141 6919 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0313 13:58:00.363206 6919 ovnkube.go:599] Stopped ovnkube\\\\nI0313 13:58:00.363241 6919 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 13:58:00.363329 6919 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.879039 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.879097 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.879113 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.879133 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.879147 4898 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:14Z","lastTransitionTime":"2026-03-13T13:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:58:14 crc kubenswrapper[4898]: E0313 13:58:14.893054 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.898689 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.898739 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.898758 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.898782 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.898800 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:14Z","lastTransitionTime":"2026-03-13T13:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:14 crc kubenswrapper[4898]: E0313 13:58:14.919759 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.925006 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.925058 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.925072 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.925092 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.925106 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:14Z","lastTransitionTime":"2026-03-13T13:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:14 crc kubenswrapper[4898]: E0313 13:58:14.939423 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.943191 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.943235 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.943247 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.943264 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.943278 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:14Z","lastTransitionTime":"2026-03-13T13:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:14 crc kubenswrapper[4898]: E0313 13:58:14.957210 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.961613 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.961646 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.961655 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.961671 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:14 crc kubenswrapper[4898]: I0313 13:58:14.961682 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:14Z","lastTransitionTime":"2026-03-13T13:58:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:14 crc kubenswrapper[4898]: E0313 13:58:14.975607 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:14Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:14 crc kubenswrapper[4898]: E0313 13:58:14.975834 4898 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.525318 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qqqs5_e7d6afc0-d9b5-41b2-a55f-57621c300cbb/ovnkube-controller/2.log" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.527092 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qqqs5_e7d6afc0-d9b5-41b2-a55f-57621c300cbb/ovnkube-controller/1.log" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.531583 4898 generic.go:334] "Generic (PLEG): container finished" podID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerID="c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc" exitCode=1 Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.531631 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerDied","Data":"c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc"} Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.531670 4898 scope.go:117] "RemoveContainer" containerID="14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.532645 4898 scope.go:117] "RemoveContainer" containerID="c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc" Mar 13 13:58:15 crc kubenswrapper[4898]: E0313 13:58:15.532940 4898 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-qqqs5_openshift-ovn-kubernetes(e7d6afc0-d9b5-41b2-a55f-57621c300cbb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.552221 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d
5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.566840 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.581507 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.595461 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.616333 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:00Z\\\",\\\"message\\\":\\\"ection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 13:58:00.362727 6919 
loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-image-registry/image-registry\\\\\\\"}\\\\nI0313 13:58:00.362778 6919 services_controller.go:360] Finished syncing service image-registry on namespace openshift-image-registry for network=default : 2.065295ms\\\\nI0313 13:58:00.362859 6919 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-cluster-version/cluster-version-operator\\\\\\\"}\\\\nI0313 13:58:00.362895 6919 services_controller.go:360] Finished syncing service cluster-version-operator on namespace openshift-cluster-version for network=default : 2.427204ms\\\\nI0313 13:58:00.362944 6919 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0313 13:58:00.363141 6919 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0313 13:58:00.363206 6919 ovnkube.go:599] Stopped ovnkube\\\\nI0313 13:58:00.363241 6919 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 13:58:00.363329 6919 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"Template:(*services.Template)(nil)}}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string(nil), Groups:[]string(nil)}}\\\\nI0313 13:58:14.697412 7170 services_controller.go:452] Built service openshift-console-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697420 7170 
services_controller.go:453] Built service default/kubernetes template LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697429 7170 services_controller.go:454] Service default/kubernetes for network=default has 0 cluster-wide, 1 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 (template) load balancers\\\\nI0313 13:58:14.697434 7170 services_controller.go:453] Built service openshift-console-operator/metrics template LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697450 7170 lb_config.go:1031] Cluster endpoints for openshift-authentication/oauth-openshift for network=default are: map[]\\\\nI0313 13:58:14.697454 7170 services_controller.go:454] Service openshift-console-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0313 13:58:14.697478 7170 services_controller.go:443] Built service openshift-authentication/oauth-openshift LB cluster-wide configs for 
netw\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d978
6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.629212 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.642410 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.652328 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.664122 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a
639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.678283 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mo
untPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.696599 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e0093b9e7670d289e34bc225cf1650906fdaf57e7d3c83ab8897fd0eed7204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8
e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.707067 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.720188 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73b8846971b589f5619b239746bd2f5953d12af7f3fa6543042da89561930dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421d50fcd0a69c2b53067ae09bbea100b5321
74c1d76f641c79c58a5fa3f9a3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wh2lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.734743 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fwrwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fwrwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc 
kubenswrapper[4898]: I0313 13:58:15.738472 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.738496 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.738548 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.738549 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:15 crc kubenswrapper[4898]: E0313 13:58:15.738595 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:15 crc kubenswrapper[4898]: E0313 13:58:15.738721 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:15 crc kubenswrapper[4898]: E0313 13:58:15.738811 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:15 crc kubenswrapper[4898]: E0313 13:58:15.738997 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.769724 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.781356 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3a9fb7c-9705-43b2-a2d8-663d19d50cda\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be461eabdbb6c2414f8a9805c9537cea8595a509f241a72015802159baaa9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://612d879ff268a2130e16ebb42a2e402a0e2d7ef04248322761b873bc6fe026c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f7a5f6c4c1f3ccd5b10a77ca72055844d63966f39e14fbba46d8b6074f4d7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.794345 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.806396 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.816703 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73b8846971b589f5619b239746bd2f5953d12af7f3fa6543042da89561930dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421d50fcd0a69c2b53067ae09bbea100b5321
74c1d76f641c79c58a5fa3f9a3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wh2lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.828552 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fwrwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fwrwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc 
kubenswrapper[4898]: I0313 13:58:15.841148 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.866393 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e0093b9e7670d289e34bc225cf1650906fdaf57e7d3c83ab8897fd0eed7204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8
e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: E0313 13:58:15.872448 4898 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.920325 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\
\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"host
IP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b
90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.936552 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3a9fb7c-9705-43b2-a2d8-663d19d50cda\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be461eabdbb6c2414f8a9805c9537cea8595a509f241a72015802159baaa9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://612d879ff268a2130e16ebb42a2e402a0e2d7ef04248322761b873bc6fe026c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f7a5f6c4c1f3ccd5b10a77ca72055844d63966f39e14fbba46d8b6074f4d7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.948971 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.962212 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:15 crc kubenswrapper[4898]: I0313 13:58:15.975007 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:15Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.003263 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14175f8dff64bc803d0edb3738c3c781867aa64407c7b7b8be708ab08e220d4f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:00Z\\\",\\\"message\\\":\\\"ection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 13:58:00.362727 6919 
loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-image-registry/image-registry\\\\\\\"}\\\\nI0313 13:58:00.362778 6919 services_controller.go:360] Finished syncing service image-registry on namespace openshift-image-registry for network=default : 2.065295ms\\\\nI0313 13:58:00.362859 6919 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-cluster-version/cluster-version-operator\\\\\\\"}\\\\nI0313 13:58:00.362895 6919 services_controller.go:360] Finished syncing service cluster-version-operator on namespace openshift-cluster-version for network=default : 2.427204ms\\\\nI0313 13:58:00.362944 6919 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0313 13:58:00.363141 6919 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0313 13:58:00.363206 6919 ovnkube.go:599] Stopped ovnkube\\\\nI0313 13:58:00.363241 6919 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 13:58:00.363329 6919 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"Template:(*services.Template)(nil)}}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string(nil), Groups:[]string(nil)}}\\\\nI0313 13:58:14.697412 7170 services_controller.go:452] Built service openshift-console-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697420 7170 
services_controller.go:453] Built service default/kubernetes template LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697429 7170 services_controller.go:454] Service default/kubernetes for network=default has 0 cluster-wide, 1 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 (template) load balancers\\\\nI0313 13:58:14.697434 7170 services_controller.go:453] Built service openshift-console-operator/metrics template LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697450 7170 lb_config.go:1031] Cluster endpoints for openshift-authentication/oauth-openshift for network=default are: map[]\\\\nI0313 13:58:14.697454 7170 services_controller.go:454] Service openshift-console-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0313 13:58:14.697478 7170 services_controller.go:443] Built service openshift-authentication/oauth-openshift LB cluster-wide configs for 
netw\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:58:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d978
6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.021719 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\"
,\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.036684 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.051528 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd291
38349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.063794 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.075003 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.084813 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.537455 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qqqs5_e7d6afc0-d9b5-41b2-a55f-57621c300cbb/ovnkube-controller/2.log" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.541449 4898 scope.go:117] "RemoveContainer" containerID="c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc" Mar 13 13:58:16 crc kubenswrapper[4898]: E0313 13:58:16.541691 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-qqqs5_openshift-ovn-kubernetes(e7d6afc0-d9b5-41b2-a55f-57621c300cbb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.564362 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.579737 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3a9fb7c-9705-43b2-a2d8-663d19d50cda\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be461eabdbb6c2414f8a9805c9537cea8595a509f241a72015802159baaa9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://612d879ff268a2130e16ebb42a2e402a0e2d7ef04248322761b873bc6fe026c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f7a5f6c4c1f3ccd5b10a77ca72055844d63966f39e14fbba46d8b6074f4d7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.597745 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.611996 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.626215 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.657504 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"Template:(*services.Template)(nil)}}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string(nil), Groups:[]string(nil)}}\\\\nI0313 13:58:14.697412 7170 services_controller.go:452] Built service openshift-console-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697420 
7170 services_controller.go:453] Built service default/kubernetes template LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697429 7170 services_controller.go:454] Service default/kubernetes for network=default has 0 cluster-wide, 1 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 (template) load balancers\\\\nI0313 13:58:14.697434 7170 services_controller.go:453] Built service openshift-console-operator/metrics template LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697450 7170 lb_config.go:1031] Cluster endpoints for openshift-authentication/oauth-openshift for network=default are: map[]\\\\nI0313 13:58:14.697454 7170 services_controller.go:454] Service openshift-console-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0313 13:58:14.697478 7170 services_controller.go:443] Built service openshift-authentication/oauth-openshift LB cluster-wide configs for netw\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:58:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qqqs5_openshift-ovn-kubernetes(e7d6afc0-d9b5-41b2-a55f-57621c300cbb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c
0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.679018 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\"
,\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.698935 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.719962 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd291
38349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.735803 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.750191 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.763138 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.779554 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725
aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.796471 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73b8846971b589f5619b239746bd2f5953d12af7f3fa6543042da89561930dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421d50fcd0a69c2b53067ae09bbea100b5321
74c1d76f641c79c58a5fa3f9a3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wh2lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.812225 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fwrwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fwrwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc 
kubenswrapper[4898]: I0313 13:58:16.834011 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:16 crc kubenswrapper[4898]: I0313 13:58:16.858558 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e0093b9e7670d289e34bc225cf1650906fdaf57e7d3c83ab8897fd0eed7204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8
e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:16Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:17 crc kubenswrapper[4898]: I0313 13:58:17.708184 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs\") pod \"network-metrics-daemon-fwrwc\" (UID: \"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\") " pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:17 crc kubenswrapper[4898]: E0313 13:58:17.708393 4898 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 13:58:17 crc kubenswrapper[4898]: E0313 13:58:17.708523 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs podName:9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869 nodeName:}" failed. No retries permitted until 2026-03-13 13:58:33.708491293 +0000 UTC m=+148.710079572 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs") pod "network-metrics-daemon-fwrwc" (UID: "9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 13:58:17 crc kubenswrapper[4898]: I0313 13:58:17.739228 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:17 crc kubenswrapper[4898]: I0313 13:58:17.739277 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:17 crc kubenswrapper[4898]: I0313 13:58:17.739316 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:17 crc kubenswrapper[4898]: I0313 13:58:17.739253 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:17 crc kubenswrapper[4898]: E0313 13:58:17.739369 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:17 crc kubenswrapper[4898]: E0313 13:58:17.739462 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:17 crc kubenswrapper[4898]: E0313 13:58:17.739714 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:17 crc kubenswrapper[4898]: E0313 13:58:17.739808 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:19 crc kubenswrapper[4898]: I0313 13:58:19.739338 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:19 crc kubenswrapper[4898]: I0313 13:58:19.739416 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:19 crc kubenswrapper[4898]: I0313 13:58:19.739348 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:19 crc kubenswrapper[4898]: E0313 13:58:19.739529 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:19 crc kubenswrapper[4898]: I0313 13:58:19.739588 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:19 crc kubenswrapper[4898]: E0313 13:58:19.739724 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:19 crc kubenswrapper[4898]: E0313 13:58:19.739828 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:19 crc kubenswrapper[4898]: E0313 13:58:19.740004 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:20 crc kubenswrapper[4898]: E0313 13:58:20.874013 4898 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 13:58:21 crc kubenswrapper[4898]: I0313 13:58:21.739160 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:21 crc kubenswrapper[4898]: I0313 13:58:21.739229 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:21 crc kubenswrapper[4898]: I0313 13:58:21.739229 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:21 crc kubenswrapper[4898]: E0313 13:58:21.739364 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:21 crc kubenswrapper[4898]: I0313 13:58:21.739419 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:21 crc kubenswrapper[4898]: E0313 13:58:21.739504 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:21 crc kubenswrapper[4898]: E0313 13:58:21.739560 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:21 crc kubenswrapper[4898]: E0313 13:58:21.739589 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:23 crc kubenswrapper[4898]: I0313 13:58:23.739452 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:23 crc kubenswrapper[4898]: I0313 13:58:23.739504 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:23 crc kubenswrapper[4898]: I0313 13:58:23.739585 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:23 crc kubenswrapper[4898]: E0313 13:58:23.739791 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:23 crc kubenswrapper[4898]: I0313 13:58:23.739832 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:23 crc kubenswrapper[4898]: E0313 13:58:23.740005 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:23 crc kubenswrapper[4898]: E0313 13:58:23.740161 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:23 crc kubenswrapper[4898]: E0313 13:58:23.740316 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.111526 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.111666 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.111691 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.111719 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.111742 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:25Z","lastTransitionTime":"2026-03-13T13:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:25 crc kubenswrapper[4898]: E0313 13:58:25.132681 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:25Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.137853 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.137942 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.137982 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.138008 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.138026 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:25Z","lastTransitionTime":"2026-03-13T13:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:25 crc kubenswrapper[4898]: E0313 13:58:25.156231 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:25Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.160557 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.160632 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.160649 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.160671 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.160687 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:25Z","lastTransitionTime":"2026-03-13T13:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:25 crc kubenswrapper[4898]: E0313 13:58:25.173742 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:25Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.177752 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.177810 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.177821 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.177839 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.177853 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:25Z","lastTransitionTime":"2026-03-13T13:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:25 crc kubenswrapper[4898]: E0313 13:58:25.196877 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:25Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.201654 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.201703 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.201720 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.201740 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.201754 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:25Z","lastTransitionTime":"2026-03-13T13:58:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:25 crc kubenswrapper[4898]: E0313 13:58:25.220692 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:25Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:25 crc kubenswrapper[4898]: E0313 13:58:25.221018 4898 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.738988 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.739038 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.739175 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:25 crc kubenswrapper[4898]: E0313 13:58:25.739296 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.739550 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:25 crc kubenswrapper[4898]: E0313 13:58:25.739724 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:25 crc kubenswrapper[4898]: E0313 13:58:25.739806 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:25 crc kubenswrapper[4898]: E0313 13:58:25.739891 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.765309 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":
\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:25Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.782690 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:25Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.798142 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:58:25Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.815648 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:25Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.829919 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725
aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:25Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.843238 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73b8846971b589f5619b239746bd2f5953d12af7f3fa6543042da89561930dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421d50fcd0a69c2b53067ae09bbea100b5321
74c1d76f641c79c58a5fa3f9a3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wh2lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:25Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.859371 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fwrwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fwrwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:25Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:25 crc 
kubenswrapper[4898]: E0313 13:58:25.874458 4898 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.878232 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:25Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.901551 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e0093b9e7670d289e34bc225cf1650906fdaf57e7d3c83ab8897fd0eed7204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8
e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:25Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.935828 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0ee
b157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:25Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.954043 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3a9fb7c-9705-43b2-a2d8-663d19d50cda\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be461eabdbb6c2414f8a9805c9537cea8595a509f241a72015802159baaa9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://612d879ff268a2130e16ebb42a2e402a0e2d7ef04248322761b873bc6fe026c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f7a5f6c4c1f3ccd5b10a77ca72055844d63966f39e14fbba46d8b6074f4d7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:25Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.974242 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:25Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:25 crc kubenswrapper[4898]: I0313 13:58:25.993091 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:25Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:26 crc kubenswrapper[4898]: I0313 13:58:26.013010 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:26Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:26 crc kubenswrapper[4898]: I0313 13:58:26.044383 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"Template:(*services.Template)(nil)}}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string(nil), Groups:[]string(nil)}}\\\\nI0313 13:58:14.697412 7170 services_controller.go:452] Built service openshift-console-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697420 
7170 services_controller.go:453] Built service default/kubernetes template LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697429 7170 services_controller.go:454] Service default/kubernetes for network=default has 0 cluster-wide, 1 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 (template) load balancers\\\\nI0313 13:58:14.697434 7170 services_controller.go:453] Built service openshift-console-operator/metrics template LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697450 7170 lb_config.go:1031] Cluster endpoints for openshift-authentication/oauth-openshift for network=default are: map[]\\\\nI0313 13:58:14.697454 7170 services_controller.go:454] Service openshift-console-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0313 13:58:14.697478 7170 services_controller.go:443] Built service openshift-authentication/oauth-openshift LB cluster-wide configs for netw\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:58:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qqqs5_openshift-ovn-kubernetes(e7d6afc0-d9b5-41b2-a55f-57621c300cbb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c
0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:26Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:26 crc kubenswrapper[4898]: I0313 13:58:26.065076 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\"
,\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:26Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:26 crc kubenswrapper[4898]: I0313 13:58:26.084798 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:26Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:26 crc kubenswrapper[4898]: I0313 13:58:26.754463 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 13 13:58:27 crc kubenswrapper[4898]: I0313 13:58:27.739434 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:27 crc kubenswrapper[4898]: I0313 13:58:27.739487 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:27 crc kubenswrapper[4898]: I0313 13:58:27.739434 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:27 crc kubenswrapper[4898]: I0313 13:58:27.739619 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:27 crc kubenswrapper[4898]: E0313 13:58:27.739871 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:27 crc kubenswrapper[4898]: E0313 13:58:27.740028 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:27 crc kubenswrapper[4898]: E0313 13:58:27.740161 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:27 crc kubenswrapper[4898]: E0313 13:58:27.740343 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:28 crc kubenswrapper[4898]: I0313 13:58:28.741071 4898 scope.go:117] "RemoveContainer" containerID="c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc" Mar 13 13:58:28 crc kubenswrapper[4898]: E0313 13:58:28.741417 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-qqqs5_openshift-ovn-kubernetes(e7d6afc0-d9b5-41b2-a55f-57621c300cbb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" Mar 13 13:58:29 crc kubenswrapper[4898]: I0313 13:58:29.739204 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:29 crc kubenswrapper[4898]: I0313 13:58:29.739374 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:29 crc kubenswrapper[4898]: I0313 13:58:29.739449 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:29 crc kubenswrapper[4898]: I0313 13:58:29.739506 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:29 crc kubenswrapper[4898]: E0313 13:58:29.740000 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:29 crc kubenswrapper[4898]: E0313 13:58:29.740398 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:29 crc kubenswrapper[4898]: E0313 13:58:29.740232 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:29 crc kubenswrapper[4898]: E0313 13:58:29.740569 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:29 crc kubenswrapper[4898]: I0313 13:58:29.756805 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 13 13:58:30 crc kubenswrapper[4898]: E0313 13:58:30.876123 4898 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 13:58:31 crc kubenswrapper[4898]: I0313 13:58:31.739231 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:31 crc kubenswrapper[4898]: I0313 13:58:31.739280 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:31 crc kubenswrapper[4898]: I0313 13:58:31.739273 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:31 crc kubenswrapper[4898]: E0313 13:58:31.739401 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:31 crc kubenswrapper[4898]: I0313 13:58:31.739523 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:31 crc kubenswrapper[4898]: E0313 13:58:31.739518 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:31 crc kubenswrapper[4898]: E0313 13:58:31.739565 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:31 crc kubenswrapper[4898]: E0313 13:58:31.739619 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:33 crc kubenswrapper[4898]: I0313 13:58:33.739148 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:33 crc kubenswrapper[4898]: I0313 13:58:33.739256 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:33 crc kubenswrapper[4898]: I0313 13:58:33.739188 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:33 crc kubenswrapper[4898]: E0313 13:58:33.739332 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:33 crc kubenswrapper[4898]: E0313 13:58:33.739387 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:33 crc kubenswrapper[4898]: I0313 13:58:33.739449 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:33 crc kubenswrapper[4898]: E0313 13:58:33.739674 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:33 crc kubenswrapper[4898]: E0313 13:58:33.739740 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:33 crc kubenswrapper[4898]: I0313 13:58:33.783187 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs\") pod \"network-metrics-daemon-fwrwc\" (UID: \"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\") " pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:33 crc kubenswrapper[4898]: E0313 13:58:33.783407 4898 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 13:58:33 crc kubenswrapper[4898]: E0313 13:58:33.783520 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs podName:9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869 nodeName:}" failed. No retries permitted until 2026-03-13 13:59:05.783495481 +0000 UTC m=+180.785083730 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs") pod "network-metrics-daemon-fwrwc" (UID: "9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.266087 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.266148 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.266167 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.266191 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.266208 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:35Z","lastTransitionTime":"2026-03-13T13:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:35 crc kubenswrapper[4898]: E0313 13:58:35.286991 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:35Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.290775 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.290837 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.290847 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.290860 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.290869 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:35Z","lastTransitionTime":"2026-03-13T13:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:35 crc kubenswrapper[4898]: E0313 13:58:35.303052 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:35Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.306475 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.306532 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.306546 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.306564 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.306597 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:35Z","lastTransitionTime":"2026-03-13T13:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:35 crc kubenswrapper[4898]: E0313 13:58:35.318969 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:35Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.322503 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.322571 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.322585 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.322603 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.322615 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:35Z","lastTransitionTime":"2026-03-13T13:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:35 crc kubenswrapper[4898]: E0313 13:58:35.335508 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:35Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.339252 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.339279 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.339288 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.339301 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.339311 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:35Z","lastTransitionTime":"2026-03-13T13:58:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:35 crc kubenswrapper[4898]: E0313 13:58:35.351609 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:35Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:35 crc kubenswrapper[4898]: E0313 13:58:35.351717 4898 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.739034 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.739098 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:35 crc kubenswrapper[4898]: E0313 13:58:35.739201 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.739227 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.739261 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:35 crc kubenswrapper[4898]: E0313 13:58:35.739353 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:35 crc kubenswrapper[4898]: E0313 13:58:35.739467 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:35 crc kubenswrapper[4898]: E0313 13:58:35.739550 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.755384 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73b8846971b589f5619b239746bd2f5953d12af7f3fa6543042da89561930dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metri
cs-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421d50fcd0a69c2b53067ae09bbea100b532174c1d76f641c79c58a5fa3f9a3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wh2lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:35Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.772524 4898 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-multus/network-metrics-daemon-fwrwc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fwrwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:35Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:35 crc 
kubenswrapper[4898]: I0313 13:58:35.790508 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5bd6119-d7a3-4363-add5-7eb62180e1ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00526d41ebb304bac57e24a91007919427bb623e8cbce6cc25d7b1a5195871a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac7ea5df99bb3004d7582910739297c1ee1913de5399a5be9391bbf4c96be32f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:04Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0313 13:56:35.870763 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0313 13:56:35.872344 1 observer_polling.go:159] Starting file observer\\\\nI0313 13:56:35.873967 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 13:56:35.875073 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 13:57:02.288879 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0313 13:57:04.933123 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 13:57:04.933351 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:35Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c20ea4299e1b913b2675ac14f7ec3a4c80a0f6f2428af0d030aff64eb89ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2524f0ae02e7d3c2569744846e2d99e8c808c9e14afeab6415910ee731794609\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://511cb3e4e8a16bcdfdf5840917a0a9a5a2e2a81419278449f613c9a41d164c1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:35Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.808784 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:35Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.831340 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e0093b9e7670d289e34bc225cf1650906fdaf57e7d3c83ab8897fd0eed7204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8
e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:35Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.847500 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:35Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:35 crc kubenswrapper[4898]: E0313 13:58:35.876647 4898 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.883287 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\
\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee
1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:35Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.903252 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3a9fb7c-9705-43b2-a2d8-663d19d50cda\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be461eabdbb6c2414f8a9805c9537cea8595a509f241a72015802159baaa9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://612d879ff268a2130e16ebb42a2e402a0e2d7ef04248322761b873bc6fe026c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f7a5f6c4c1f3ccd5b10a77ca72055844d63966f39e14fbba46d8b6074f4d7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:35Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.919629 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:35Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.934607 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-13T13:58:35Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.965333 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"Template:(*services.Template)(nil)}}}}, Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string(nil), Groups:[]string(nil)}}\\\\nI0313 13:58:14.697412 7170 services_controller.go:452] Built service openshift-console-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697420 
7170 services_controller.go:453] Built service default/kubernetes template LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697429 7170 services_controller.go:454] Service default/kubernetes for network=default has 0 cluster-wide, 1 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 (template) load balancers\\\\nI0313 13:58:14.697434 7170 services_controller.go:453] Built service openshift-console-operator/metrics template LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697450 7170 lb_config.go:1031] Cluster endpoints for openshift-authentication/oauth-openshift for network=default are: map[]\\\\nI0313 13:58:14.697454 7170 services_controller.go:454] Service openshift-console-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0313 13:58:14.697478 7170 services_controller.go:443] Built service openshift-authentication/oauth-openshift LB cluster-wide configs for netw\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:58:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qqqs5_openshift-ovn-kubernetes(e7d6afc0-d9b5-41b2-a55f-57621c300cbb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c
0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:35Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:35 crc kubenswrapper[4898]: I0313 13:58:35.993830 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\"
,\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:35Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:36 crc kubenswrapper[4898]: I0313 13:58:36.008626 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c55972-53a1-4a22-85ab-c38bffbaf629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cc01a31ae51aefc4cc7ea6a77eea7d0885ebf0f9fdcbdfbeebfcc7e69a33755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745
f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4971276739c6cece5f7d6ad57da8c9e6d67d9ffe7c85e22d1d78344045c08d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4971276739c6cece5f7d6ad57da8c9e6d67d9ffe7c85e22d1d78344045c08d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:36Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:36 crc kubenswrapper[4898]: I0313 13:58:36.025187 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:36Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:36 crc kubenswrapper[4898]: I0313 13:58:36.045197 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:36Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:36 crc kubenswrapper[4898]: I0313 13:58:36.061161 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:36Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:36 crc kubenswrapper[4898]: I0313 13:58:36.078953 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:58:36Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:36 crc kubenswrapper[4898]: I0313 13:58:36.095562 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:36Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:36 crc kubenswrapper[4898]: I0313 13:58:36.110794 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a
639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
6-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:36Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.618107 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6llfs_e521c857-9711-4f68-886f-38b233d7b05b/kube-multus/0.log" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.618183 4898 generic.go:334] "Generic (PLEG): container finished" podID="e521c857-9711-4f68-886f-38b233d7b05b" containerID="de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f" exitCode=1 Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.618217 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6llfs" event={"ID":"e521c857-9711-4f68-886f-38b233d7b05b","Type":"ContainerDied","Data":"de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f"} Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.618680 4898 scope.go:117] "RemoveContainer" containerID="de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.636649 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:37Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.655601 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:37Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.677979 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:37Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.695737 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:58:37Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.710263 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e0093b9e7670d289e34bc225cf1650906fdaf57e7d3c83ab8897fd0eed7204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:37Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.719986 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:37Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.734034 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73b8846971b589f5619b239746bd2f5953d12af7f3fa6543042da89561930dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421d50fcd0a69c2b53067ae09bbea100b5321
74c1d76f641c79c58a5fa3f9a3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wh2lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:37Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.739684 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.739707 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.739780 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.740041 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:37 crc kubenswrapper[4898]: E0313 13:58:37.740267 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:37 crc kubenswrapper[4898]: E0313 13:58:37.740515 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:37 crc kubenswrapper[4898]: E0313 13:58:37.740628 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:37 crc kubenswrapper[4898]: E0313 13:58:37.740772 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.749762 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fwrwc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fwrwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:37Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:37 crc 
kubenswrapper[4898]: I0313 13:58:37.763039 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5bd6119-d7a3-4363-add5-7eb62180e1ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00526d41ebb304bac57e24a91007919427bb623e8cbce6cc25d7b1a5195871a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac7ea5df99bb3004d7582910739297c1ee1913de5399a5be9391bbf4c96be32f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:04Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0313 13:56:35.870763 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0313 13:56:35.872344 1 observer_polling.go:159] Starting file observer\\\\nI0313 13:56:35.873967 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 13:56:35.875073 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 13:57:02.288879 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0313 13:57:04.933123 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 13:57:04.933351 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:35Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c20ea4299e1b913b2675ac14f7ec3a4c80a0f6f2428af0d030aff64eb89ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2524f0ae02e7d3c2569744846e2d99e8c808c9e14afeab6415910ee731794609\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://511cb3e4e8a16bcdfdf5840917a0a9a5a2e2a81419278449f613c9a41d164c1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:37Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.776377 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:37Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.790873 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:37Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.822381 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
26-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:37Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.887943 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3a9fb7c-9705-43b2-a2d8-663d19d50cda\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be461eabdbb6c2414f8a9805c9537cea8595a509f241a72015802159baaa9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://612d879ff268a2130e16ebb42a2e402a0e2d7ef04248322761b873bc6fe026c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f7a5f6c4c1f3ccd5b10a77ca72055844d63966f39e14fbba46d8b6074f4d7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:37Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.908327 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:37Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.925182 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:37Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.947541 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:36Z\\\",\\\"message\\\":\\\"2026-03-13T13:57:51+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c0887954-cd2c-445a-b95c-6f7339f19f61\\\\n2026-03-13T13:57:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c0887954-cd2c-445a-b95c-6f7339f19f61 to /host/opt/cni/bin/\\\\n2026-03-13T13:57:51Z [verbose] multus-daemon started\\\\n2026-03-13T13:57:51Z [verbose] Readiness Indicator file check\\\\n2026-03-13T13:58:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:37Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.972708 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"Template:(*services.Template)(nil)}}}}, Templates:services.TemplateMap(nil), 
Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string(nil), Groups:[]string(nil)}}\\\\nI0313 13:58:14.697412 7170 services_controller.go:452] Built service openshift-console-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697420 7170 services_controller.go:453] Built service default/kubernetes template LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697429 7170 services_controller.go:454] Service default/kubernetes for network=default has 0 cluster-wide, 1 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 (template) load balancers\\\\nI0313 13:58:14.697434 7170 services_controller.go:453] Built service openshift-console-operator/metrics template LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697450 7170 lb_config.go:1031] Cluster endpoints for openshift-authentication/oauth-openshift for network=default are: map[]\\\\nI0313 13:58:14.697454 7170 services_controller.go:454] Service openshift-console-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0313 13:58:14.697478 7170 services_controller.go:443] Built service openshift-authentication/oauth-openshift LB cluster-wide configs for netw\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:58:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qqqs5_openshift-ovn-kubernetes(e7d6afc0-d9b5-41b2-a55f-57621c300cbb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c
0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:37Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:37 crc kubenswrapper[4898]: I0313 13:58:37.997329 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\"
,\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:37Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:38 crc kubenswrapper[4898]: I0313 13:58:38.015691 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c55972-53a1-4a22-85ab-c38bffbaf629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cc01a31ae51aefc4cc7ea6a77eea7d0885ebf0f9fdcbdfbeebfcc7e69a33755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745
f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4971276739c6cece5f7d6ad57da8c9e6d67d9ffe7c85e22d1d78344045c08d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4971276739c6cece5f7d6ad57da8c9e6d67d9ffe7c85e22d1d78344045c08d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:38Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:38 crc kubenswrapper[4898]: I0313 13:58:38.622816 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6llfs_e521c857-9711-4f68-886f-38b233d7b05b/kube-multus/0.log" Mar 13 13:58:38 crc kubenswrapper[4898]: I0313 13:58:38.622933 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6llfs" event={"ID":"e521c857-9711-4f68-886f-38b233d7b05b","Type":"ContainerStarted","Data":"04f48cdfeeb82223cb0cab3fb50d3338225f39b1d78eadc3c18a46350ae28770"} Mar 13 13:58:38 crc kubenswrapper[4898]: I0313 13:58:38.639468 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:38Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:38 crc kubenswrapper[4898]: I0313 13:58:38.656723 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd291
38349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:38Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:38 crc kubenswrapper[4898]: I0313 13:58:38.682878 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:38Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:38 crc kubenswrapper[4898]: I0313 13:58:38.699043 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:58:38Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:38 crc kubenswrapper[4898]: I0313 13:58:38.722250 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e0093b9e7670d289e34bc225cf1650906fdaf57e7d3c83ab8897fd0eed7204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:38Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:38 crc kubenswrapper[4898]: I0313 13:58:38.737937 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:38Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:38 crc kubenswrapper[4898]: I0313 13:58:38.754076 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73b8846971b589f5619b239746bd2f5953d12af7f3fa6543042da89561930dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421d50fcd0a69c2b53067ae09bbea100b5321
74c1d76f641c79c58a5fa3f9a3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wh2lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:38Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:38 crc kubenswrapper[4898]: I0313 13:58:38.773741 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fwrwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fwrwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:38Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:38 crc 
kubenswrapper[4898]: I0313 13:58:38.794079 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5bd6119-d7a3-4363-add5-7eb62180e1ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00526d41ebb304bac57e24a91007919427bb623e8cbce6cc25d7b1a5195871a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac7ea5df99bb3004d7582910739297c1ee1913de5399a5be9391bbf4c96be32f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:04Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0313 13:56:35.870763 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0313 13:56:35.872344 1 observer_polling.go:159] Starting file observer\\\\nI0313 13:56:35.873967 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 13:56:35.875073 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 13:57:02.288879 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0313 13:57:04.933123 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 13:57:04.933351 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:35Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c20ea4299e1b913b2675ac14f7ec3a4c80a0f6f2428af0d030aff64eb89ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2524f0ae02e7d3c2569744846e2d99e8c808c9e14afeab6415910ee731794609\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://511cb3e4e8a16bcdfdf5840917a0a9a5a2e2a81419278449f613c9a41d164c1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:38Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:38 crc kubenswrapper[4898]: I0313 13:58:38.815739 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:38Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:38 crc kubenswrapper[4898]: I0313 13:58:38.839344 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:38Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:38 crc kubenswrapper[4898]: I0313 13:58:38.871445 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
26-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:38Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:38 crc kubenswrapper[4898]: I0313 13:58:38.890873 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3a9fb7c-9705-43b2-a2d8-663d19d50cda\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be461eabdbb6c2414f8a9805c9537cea8595a509f241a72015802159baaa9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://612d879ff268a2130e16ebb42a2e402a0e2d7ef04248322761b873bc6fe026c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f7a5f6c4c1f3ccd5b10a77ca72055844d63966f39e14fbba46d8b6074f4d7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:38Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:38 crc kubenswrapper[4898]: I0313 13:58:38.910237 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:38Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:38 crc kubenswrapper[4898]: I0313 13:58:38.928757 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:38Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:38 crc kubenswrapper[4898]: I0313 13:58:38.943128 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f48cdfeeb82223cb0cab3fb50d3338225f39b1d78eadc3c18a46350ae28770\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:36Z\\\",\\\"message\\\":\\\"2026-03-13T13:57:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c0887954-cd2c-445a-b95c-6f7339f19f61\\\\n2026-03-13T13:57:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c0887954-cd2c-445a-b95c-6f7339f19f61 to /host/opt/cni/bin/\\\\n2026-03-13T13:57:51Z [verbose] multus-daemon started\\\\n2026-03-13T13:57:51Z [verbose] 
Readiness Indicator file check\\\\n2026-03-13T13:58:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:38Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:38 crc kubenswrapper[4898]: I0313 13:58:38.967296 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"Template:(*services.Template)(nil)}}}}, Templates:services.TemplateMap(nil), 
Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string(nil), Groups:[]string(nil)}}\\\\nI0313 13:58:14.697412 7170 services_controller.go:452] Built service openshift-console-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697420 7170 services_controller.go:453] Built service default/kubernetes template LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697429 7170 services_controller.go:454] Service default/kubernetes for network=default has 0 cluster-wide, 1 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 (template) load balancers\\\\nI0313 13:58:14.697434 7170 services_controller.go:453] Built service openshift-console-operator/metrics template LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697450 7170 lb_config.go:1031] Cluster endpoints for openshift-authentication/oauth-openshift for network=default are: map[]\\\\nI0313 13:58:14.697454 7170 services_controller.go:454] Service openshift-console-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0313 13:58:14.697478 7170 services_controller.go:443] Built service openshift-authentication/oauth-openshift LB cluster-wide configs for netw\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:58:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qqqs5_openshift-ovn-kubernetes(e7d6afc0-d9b5-41b2-a55f-57621c300cbb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c
0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:38Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:38 crc kubenswrapper[4898]: I0313 13:58:38.981551 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\"
,\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:38Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:38 crc kubenswrapper[4898]: I0313 13:58:38.994213 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c55972-53a1-4a22-85ab-c38bffbaf629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cc01a31ae51aefc4cc7ea6a77eea7d0885ebf0f9fdcbdfbeebfcc7e69a33755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745
f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4971276739c6cece5f7d6ad57da8c9e6d67d9ffe7c85e22d1d78344045c08d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4971276739c6cece5f7d6ad57da8c9e6d67d9ffe7c85e22d1d78344045c08d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:38Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:39 crc kubenswrapper[4898]: I0313 13:58:39.739257 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:39 crc kubenswrapper[4898]: I0313 13:58:39.739309 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:39 crc kubenswrapper[4898]: I0313 13:58:39.739369 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:39 crc kubenswrapper[4898]: E0313 13:58:39.739446 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:39 crc kubenswrapper[4898]: I0313 13:58:39.739459 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:39 crc kubenswrapper[4898]: E0313 13:58:39.739595 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:39 crc kubenswrapper[4898]: E0313 13:58:39.739691 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:39 crc kubenswrapper[4898]: E0313 13:58:39.739775 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:40 crc kubenswrapper[4898]: E0313 13:58:40.877831 4898 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 13:58:41 crc kubenswrapper[4898]: I0313 13:58:41.739009 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:41 crc kubenswrapper[4898]: I0313 13:58:41.739075 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:41 crc kubenswrapper[4898]: E0313 13:58:41.739179 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:41 crc kubenswrapper[4898]: I0313 13:58:41.739282 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:41 crc kubenswrapper[4898]: E0313 13:58:41.739522 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:41 crc kubenswrapper[4898]: I0313 13:58:41.739569 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:41 crc kubenswrapper[4898]: E0313 13:58:41.739717 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:41 crc kubenswrapper[4898]: E0313 13:58:41.739869 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:43 crc kubenswrapper[4898]: I0313 13:58:43.693058 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:58:43 crc kubenswrapper[4898]: E0313 13:58:43.693246 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:47.693212076 +0000 UTC m=+222.694800355 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:58:43 crc kubenswrapper[4898]: I0313 13:58:43.739379 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:43 crc kubenswrapper[4898]: I0313 13:58:43.739500 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:43 crc kubenswrapper[4898]: I0313 13:58:43.739543 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:43 crc kubenswrapper[4898]: E0313 13:58:43.739703 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:43 crc kubenswrapper[4898]: I0313 13:58:43.739817 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:43 crc kubenswrapper[4898]: E0313 13:58:43.739845 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:43 crc kubenswrapper[4898]: E0313 13:58:43.740285 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:43 crc kubenswrapper[4898]: E0313 13:58:43.740448 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:43 crc kubenswrapper[4898]: I0313 13:58:43.741074 4898 scope.go:117] "RemoveContainer" containerID="c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc" Mar 13 13:58:43 crc kubenswrapper[4898]: I0313 13:58:43.795141 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:43 crc kubenswrapper[4898]: I0313 13:58:43.795284 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:43 crc kubenswrapper[4898]: I0313 13:58:43.795345 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:43 crc kubenswrapper[4898]: E0313 13:58:43.795374 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 13:58:43 crc kubenswrapper[4898]: E0313 13:58:43.795407 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 13:58:43 crc kubenswrapper[4898]: E0313 13:58:43.795422 4898 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:58:43 crc kubenswrapper[4898]: E0313 13:58:43.795478 4898 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 13:58:43 crc kubenswrapper[4898]: E0313 13:58:43.795491 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 13:59:47.795474156 +0000 UTC m=+222.797062405 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:58:43 crc kubenswrapper[4898]: E0313 13:58:43.795495 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 13:58:43 crc kubenswrapper[4898]: E0313 13:58:43.795542 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 13:59:47.795523107 +0000 UTC m=+222.797111386 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 13:58:43 crc kubenswrapper[4898]: E0313 13:58:43.795518 4898 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 13:58:43 crc kubenswrapper[4898]: E0313 13:58:43.795574 4898 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:58:43 crc kubenswrapper[4898]: E0313 13:58:43.795622 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 13:59:47.795606429 +0000 UTC m=+222.797194708 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 13:58:43 crc kubenswrapper[4898]: I0313 13:58:43.795398 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:43 crc kubenswrapper[4898]: E0313 13:58:43.795678 4898 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 13:58:43 crc kubenswrapper[4898]: E0313 13:58:43.795846 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 13:59:47.795803544 +0000 UTC m=+222.797391893 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.644431 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qqqs5_e7d6afc0-d9b5-41b2-a55f-57621c300cbb/ovnkube-controller/3.log" Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.645357 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qqqs5_e7d6afc0-d9b5-41b2-a55f-57621c300cbb/ovnkube-controller/2.log" Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.649195 4898 generic.go:334] "Generic (PLEG): container finished" podID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerID="5ca8f8a8a536aca56f73dd6928361e5dd5f98f66d3bc35762461d5d87c0c3022" exitCode=1 Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.649237 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerDied","Data":"5ca8f8a8a536aca56f73dd6928361e5dd5f98f66d3bc35762461d5d87c0c3022"} Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.649267 4898 scope.go:117] "RemoveContainer" containerID="c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc" Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.650436 4898 scope.go:117] "RemoveContainer" containerID="5ca8f8a8a536aca56f73dd6928361e5dd5f98f66d3bc35762461d5d87c0c3022" Mar 13 13:58:44 crc kubenswrapper[4898]: E0313 13:58:44.650705 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed 
container=ovnkube-controller pod=ovnkube-node-qqqs5_openshift-ovn-kubernetes(e7d6afc0-d9b5-41b2-a55f-57621c300cbb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.667469 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c55972-53a1-4a22-85ab-c38bffbaf629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cc01a31ae51aefc4cc7ea6a77eea7d0885ebf0f9fdcbdfbeebfcc7e69a33755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kub
e\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4971276739c6cece5f7d6ad57da8c9e6d67d9ffe7c85e22d1d78344045c08d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4971276739c6cece5f7d6ad57da8c9e6d67d9ffe7c85e22d1d78344045c08d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.681865 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.694811 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.710824 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f48cdfeeb82223cb0cab3fb50d3338225f39b1d78eadc3c18a46350ae28770\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:36Z\\\",\\\"message\\\":\\\"2026-03-13T13:57:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c0887954-cd2c-445a-b95c-6f7339f19f61\\\\n2026-03-13T13:57:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c0887954-cd2c-445a-b95c-6f7339f19f61 to /host/opt/cni/bin/\\\\n2026-03-13T13:57:51Z [verbose] multus-daemon started\\\\n2026-03-13T13:57:51Z [verbose] 
Readiness Indicator file check\\\\n2026-03-13T13:58:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.736343 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ca8f8a8a536aca56f73dd6928361e5dd5f98f66d3bc35762461d5d87c0c3022\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"Template:(*services.Template)(nil)}}}}, Templates:services.TemplateMap(nil), 
Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string(nil), Groups:[]string(nil)}}\\\\nI0313 13:58:14.697412 7170 services_controller.go:452] Built service openshift-console-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697420 7170 services_controller.go:453] Built service default/kubernetes template LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697429 7170 services_controller.go:454] Service default/kubernetes for network=default has 0 cluster-wide, 1 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 (template) load balancers\\\\nI0313 13:58:14.697434 7170 services_controller.go:453] Built service openshift-console-operator/metrics template LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697450 7170 lb_config.go:1031] Cluster endpoints for openshift-authentication/oauth-openshift for network=default are: map[]\\\\nI0313 13:58:14.697454 7170 services_controller.go:454] Service openshift-console-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0313 13:58:14.697478 7170 services_controller.go:443] Built service openshift-authentication/oauth-openshift LB cluster-wide configs for netw\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:58:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ca8f8a8a536aca56f73dd6928361e5dd5f98f66d3bc35762461d5d87c0c3022\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:44Z\\\",\\\"message\\\":\\\"94 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/network-metrics-daemon-fwrwc\\\\nI0313 13:58:44.575642 7494 ovn.go:134] Ensuring zone local for Pod openshift-multus/network-metrics-daemon-fwrwc in node crc\\\\nI0313 13:58:44.575664 7494 
base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-fwrwc] creating logical port openshift-multus_network-metrics-daemon-fwrwc for pod on switch crc\\\\nI0313 13:58:44.575715 7494 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}\\\\nI0313 13:58:44.575730 7494 services_controller.go:360] Finished syncing service dns-default on namespace openshift-dns for network=default : 2.852375ms\\\\nI0313 13:58:44.575713 7494 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nF0313 13:58:44.574595 7494 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: 
failed\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9
786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.752811 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\"
,\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.772456 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.788455 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.805264 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.820627 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.834291 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.852022 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e0093b9e7670d289e34bc225cf1650906fdaf57e7d3c83ab8897fd0eed7204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8
e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.864047 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.875382 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73b8846971b589f5619b239746bd2f5953d12af7f3fa6543042da89561930dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421d50fcd0a69c2b53067ae09bbea100b5321
74c1d76f641c79c58a5fa3f9a3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wh2lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.886667 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fwrwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fwrwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:44 crc 
kubenswrapper[4898]: I0313 13:58:44.903458 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5bd6119-d7a3-4363-add5-7eb62180e1ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00526d41ebb304bac57e24a91007919427bb623e8cbce6cc25d7b1a5195871a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac7ea5df99bb3004d7582910739297c1ee1913de5399a5be9391bbf4c96be32f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:04Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0313 13:56:35.870763 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0313 13:56:35.872344 1 observer_polling.go:159] Starting file observer\\\\nI0313 13:56:35.873967 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 13:56:35.875073 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 13:57:02.288879 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0313 13:57:04.933123 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 13:57:04.933351 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:35Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c20ea4299e1b913b2675ac14f7ec3a4c80a0f6f2428af0d030aff64eb89ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2524f0ae02e7d3c2569744846e2d99e8c808c9e14afeab6415910ee731794609\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://511cb3e4e8a16bcdfdf5840917a0a9a5a2e2a81419278449f613c9a41d164c1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.923761 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3a9fb7c-9705-43b2-a2d8-663d19d50cda\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be461eabdbb6c2414f8a9805c9537cea8595a509f241a72015802159baaa9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://612d879ff268a2130e16ebb42a2e402a0e2d7ef04248322761b873bc6fe026c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f7a5f6c4c1f3ccd5b10a77ca72055844d63966f39e14fbba46d8b6074f4d7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.943310 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:44 crc kubenswrapper[4898]: I0313 13:58:44.964363 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
26-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:44Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.557203 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.557240 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.557248 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.557260 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.557270 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:45Z","lastTransitionTime":"2026-03-13T13:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:58:45 crc kubenswrapper[4898]: E0313 13:58:45.576160 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.579779 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.579813 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.579822 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.579837 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.579856 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:45Z","lastTransitionTime":"2026-03-13T13:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:45 crc kubenswrapper[4898]: E0313 13:58:45.597661 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.602188 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.602251 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.602274 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.602306 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.602332 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:45Z","lastTransitionTime":"2026-03-13T13:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:45 crc kubenswrapper[4898]: E0313 13:58:45.620881 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.625304 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.625340 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.625350 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.625362 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.625372 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:45Z","lastTransitionTime":"2026-03-13T13:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:45 crc kubenswrapper[4898]: E0313 13:58:45.637270 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.640878 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.640933 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.640946 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.640964 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.640972 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:45Z","lastTransitionTime":"2026-03-13T13:58:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 13:58:45 crc kubenswrapper[4898]: E0313 13:58:45.651834 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6587b7f7-4682-47cc-be02-888912bc905d\\\",\\\"systemUUID\\\":\\\"5908a3c1-cceb-4f4a-af76-6b5ef150f486\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:45 crc kubenswrapper[4898]: E0313 13:58:45.652020 4898 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.655468 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qqqs5_e7d6afc0-d9b5-41b2-a55f-57621c300cbb/ovnkube-controller/3.log" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.739480 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.739538 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.739594 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:45 crc kubenswrapper[4898]: E0313 13:58:45.739708 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.739809 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:45 crc kubenswrapper[4898]: E0313 13:58:45.740101 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:45 crc kubenswrapper[4898]: E0313 13:58:45.740437 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:45 crc kubenswrapper[4898]: E0313 13:58:45.740860 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.757882 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshi
ft-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\",\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.769023 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c55972-53a1-4a22-85ab-c38bffbaf629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cc01a31ae51aefc4cc7ea6a77eea7d0885ebf0f9fdcbdfbeebfcc7e69a33755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745
f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4971276739c6cece5f7d6ad57da8c9e6d67d9ffe7c85e22d1d78344045c08d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4971276739c6cece5f7d6ad57da8c9e6d67d9ffe7c85e22d1d78344045c08d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.783090 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.799138 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.816833 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f48cdfeeb82223cb0cab3fb50d3338225f39b1d78eadc3c18a46350ae28770\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:36Z\\\",\\\"message\\\":\\\"2026-03-13T13:57:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c0887954-cd2c-445a-b95c-6f7339f19f61\\\\n2026-03-13T13:57:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c0887954-cd2c-445a-b95c-6f7339f19f61 to /host/opt/cni/bin/\\\\n2026-03-13T13:57:51Z [verbose] multus-daemon started\\\\n2026-03-13T13:57:51Z [verbose] 
Readiness Indicator file check\\\\n2026-03-13T13:58:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.838324 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ca8f8a8a536aca56f73dd6928361e5dd5f98f66d3bc35762461d5d87c0c3022\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c8122355233f3be13689264ea41f8ede2b7e4d07e88bd6a2a2d5d6fd1a166bcc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:14Z\\\",\\\"message\\\":\\\"Template:(*services.Template)(nil)}}}}, Templates:services.TemplateMap(nil), 
Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string(nil), Groups:[]string(nil)}}\\\\nI0313 13:58:14.697412 7170 services_controller.go:452] Built service openshift-console-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697420 7170 services_controller.go:453] Built service default/kubernetes template LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697429 7170 services_controller.go:454] Service default/kubernetes for network=default has 0 cluster-wide, 1 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 (template) load balancers\\\\nI0313 13:58:14.697434 7170 services_controller.go:453] Built service openshift-console-operator/metrics template LB for network=default: []services.LB{}\\\\nI0313 13:58:14.697450 7170 lb_config.go:1031] Cluster endpoints for openshift-authentication/oauth-openshift for network=default are: map[]\\\\nI0313 13:58:14.697454 7170 services_controller.go:454] Service openshift-console-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0313 13:58:14.697478 7170 services_controller.go:443] Built service openshift-authentication/oauth-openshift LB cluster-wide configs for netw\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:58:13Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ca8f8a8a536aca56f73dd6928361e5dd5f98f66d3bc35762461d5d87c0c3022\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:44Z\\\",\\\"message\\\":\\\"94 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/network-metrics-daemon-fwrwc\\\\nI0313 13:58:44.575642 7494 ovn.go:134] Ensuring zone local for Pod openshift-multus/network-metrics-daemon-fwrwc in node crc\\\\nI0313 13:58:44.575664 7494 
base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-fwrwc] creating logical port openshift-multus_network-metrics-daemon-fwrwc for pod on switch crc\\\\nI0313 13:58:44.575715 7494 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}\\\\nI0313 13:58:44.575730 7494 services_controller.go:360] Finished syncing service dns-default on namespace openshift-dns for network=default : 2.852375ms\\\\nI0313 13:58:44.575713 7494 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nF0313 13:58:44.574595 7494 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: 
failed\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:58:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9
786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.858779 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.876374 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:58:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:45 crc kubenswrapper[4898]: E0313 13:58:45.878491 4898 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.891059 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node
-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.904851 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd291
38349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.917470 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5bd6119-d7a3-4363-add5-7eb62180e1ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00526d41ebb304bac57e24a91007919427bb623e8cbce6cc25d7b1a5195871a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac7ea5df99bb3004d7582910739297c1ee1913de5399a5be9391bbf4c96be32f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:04Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0313 13:56:35.870763 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0313 13:56:35.872344 1 observer_polling.go:159] Starting file observer\\\\nI0313 13:56:35.873967 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 13:56:35.875073 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 13:57:02.288879 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0313 13:57:04.933123 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 13:57:04.933351 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:35Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c20ea4299e1b913b2675ac14f7ec3a4c80a0f6f2428af0d030aff64eb89ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2524f0ae02e7d3c2569744846e2d99e8c808c9e14afeab6415910ee731794609\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://511cb3e4e8a16bcdfdf5840917a0a9a5a2e2a81419278449f613c9a41d164c1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.936231 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.954404 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e0093b9e7670d289e34bc225cf1650906fdaf57e7d3c83ab8897fd0eed7204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8
e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.968161 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.984261 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73b8846971b589f5619b239746bd2f5953d12af7f3fa6543042da89561930dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421d50fcd0a69c2b53067ae09bbea100b5321
74c1d76f641c79c58a5fa3f9a3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wh2lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:45 crc kubenswrapper[4898]: I0313 13:58:45.996293 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fwrwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fwrwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:45Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:46 crc 
kubenswrapper[4898]: I0313 13:58:46.022603 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:46Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:46 crc kubenswrapper[4898]: I0313 13:58:46.039654 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3a9fb7c-9705-43b2-a2d8-663d19d50cda\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be461eabdbb6c2414f8a9805c9537cea8595a509f241a72015802159baaa9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://612d879ff268a2130e16ebb42a2e402a0e2d7ef04248322761b873bc6fe026c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f7a5f6c4c1f3ccd5b10a77ca72055844d63966f39e14fbba46d8b6074f4d7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:46Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:46 crc kubenswrapper[4898]: I0313 13:58:46.056815 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:46Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:47 crc kubenswrapper[4898]: I0313 13:58:47.739004 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:47 crc kubenswrapper[4898]: I0313 13:58:47.739106 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:47 crc kubenswrapper[4898]: I0313 13:58:47.739121 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:47 crc kubenswrapper[4898]: E0313 13:58:47.739261 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:47 crc kubenswrapper[4898]: I0313 13:58:47.739531 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:47 crc kubenswrapper[4898]: E0313 13:58:47.739643 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:47 crc kubenswrapper[4898]: E0313 13:58:47.739853 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:47 crc kubenswrapper[4898]: E0313 13:58:47.740151 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.513213 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.514627 4898 scope.go:117] "RemoveContainer" containerID="5ca8f8a8a536aca56f73dd6928361e5dd5f98f66d3bc35762461d5d87c0c3022" Mar 13 13:58:49 crc kubenswrapper[4898]: E0313 13:58:49.514999 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qqqs5_openshift-ovn-kubernetes(e7d6afc0-d9b5-41b2-a55f-57621c300cbb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.543143 4898 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-5qb65" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e527967a-003e-4dbe-aade-d9f882239cb0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6e0093b9e7670d289e34bc225cf1650906fdaf57e7d3c83ab8897fd0eed7204\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04812419069b903d624f82aed2bf728efd770e67fc07cc6a0d71c13b9be9a2b8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52b052e24adedd6a8908d009b73a28486844b3f6ce033583d5e0c580d8df8bdc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f70e1c4819a418bb37bdc10f4f0296f6d7429bf52ff7c3aae5ad72b2183f50f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae3d8e1f00ae9e1549f37d69ca676f5c80a159bcc7b6946859fa63037c20e25c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://54e65184749c8766383535e9641c0ae122776e88aec3906197a0fbc3a1b65ac5\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dc01f3c81e1d1f177cc35911b98b7ff71c40082dfcf9a23d45ae7b6aaffa29cd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27j5r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-5qb65\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.562250 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b46ld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1f79182-c06d-47d7-bed8-109c0cc4784e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfa39cb1a5f792a575519b3616cff170f5e303a1bf05b207578725aa1711117b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b46ld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.579176 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33fb2f7c-9abf-45e0-af55-e8f7c09c2dc3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73b8846971b589f5619b239746bd2f5953d12af7f3fa6543042da89561930dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://421d50fcd0a69c2b53067ae09bbea100b5321
74c1d76f641c79c58a5fa3f9a3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjttd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wh2lt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.592492 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fwrwc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l9k7q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:58:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fwrwc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:49 crc 
kubenswrapper[4898]: I0313 13:58:49.605174 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d5bd6119-d7a3-4363-add5-7eb62180e1ab\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://00526d41ebb304bac57e24a91007919427bb623e8cbce6cc25d7b1a5195871a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac7ea5df99bb3004d7582910739297c1ee1913de5399a5be9391bbf4c96be32f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:04Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0313 13:56:35.870763 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0313 13:56:35.872344 1 observer_polling.go:159] Starting file observer\\\\nI0313 13:56:35.873967 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 13:56:35.875073 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 13:57:02.288879 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0313 13:57:04.933123 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 13:57:04.933351 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:35Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://172c20ea4299e1b913b2675ac14f7ec3a4c80a0f6f2428af0d030aff64eb89ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2524f0ae02e7d3c2569744846e2d99e8c808c9e14afeab6415910ee731794609\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://511cb3e4e8a16bcdfdf5840917a0a9a5a2e2a81419278449f613c9a41d164c1b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.619301 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddf6378f2d97ece521fb3cb0dccd8ce161d6e00f634fa76560cf8c0b090e4c22\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8d4cb87c509a5d6fdc02c50d3282376024898f6fd6cfc94c81c53763ac968ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.631792 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6ccb77cf4fc033c7377094ae8b85b14135043e4c9ef12e85a7be5ee9e56b8fac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.657459 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc4f3def-518e-420d-95b2-e9d7f3caf3b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://088f42f6f19b31579419481228562c142aa1480795b6ffa4bd1cba05cb35cb84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad2720007bcc3ac54674e20042cdbbe64be6cd24b9556c841e609d541065f883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d062f36316872a1ce1d211650b661771b3ef7ff25e420a9db22c3191160fa9a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://157541b623c7dc0eeb157191f93fb07af61fb66891ac3231f90f14690179cb32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f
58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffea82d1334de10deacc44ca75008d2dc4f59ca58355bd166f2034d24a0f5cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b
4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a978acef46180888ee2cfa53a17e4a8e06c392d2fa21bf0190afb50716e871e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ecf25e1952ff7af13964501aa48d34e5883ac9891b4297b619b7dd5a633bfd6f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ea6d880a8f20e45904ee3cb8f012dab91ab486e598a4b432f9d60624f5c4416\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
26-03-13T13:56:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.675549 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a3a9fb7c-9705-43b2-a2d8-663d19d50cda\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0be461eabdbb6c2414f8a9805c9537cea8595a509f241a72015802159baaa9b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://612d879ff268a2130e16ebb42a2e402a0e2d7ef04248322761b873bc6fe026c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f7a5f6c4c1f3ccd5b10a77ca72055844d63966f39e14fbba46d8b6074f4d7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://8772cfcf785db35cec05c4acb35d750d63feee0a1b192a79fd2f9d3f9b33c433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.692876 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.708001 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.729673 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6llfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e521c857-9711-4f68-886f-38b233d7b05b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04f48cdfeeb82223cb0cab3fb50d3338225f39b1d78eadc3c18a46350ae28770\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:36Z\\\",\\\"message\\\":\\\"2026-03-13T13:57:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c0887954-cd2c-445a-b95c-6f7339f19f61\\\\n2026-03-13T13:57:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c0887954-cd2c-445a-b95c-6f7339f19f61 to /host/opt/cni/bin/\\\\n2026-03-13T13:57:51Z [verbose] multus-daemon started\\\\n2026-03-13T13:57:51Z [verbose] 
Readiness Indicator file check\\\\n2026-03-13T13:58:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:58:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mzgmp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6llfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.740270 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.740303 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:49 crc kubenswrapper[4898]: E0313 13:58:49.740386 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.740273 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.740413 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:49 crc kubenswrapper[4898]: E0313 13:58:49.740541 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:49 crc kubenswrapper[4898]: E0313 13:58:49.740578 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:49 crc kubenswrapper[4898]: E0313 13:58:49.740652 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.754637 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ca8f8a8a536aca56f73dd6928361e5dd5f98f66d3bc35762461d5d87c0c3022\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ca8f8a8a536aca56f73dd6928361e5dd5f98f66d3bc35762461d5d87c0c3022\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T13:58:44Z\\\",\\\"message\\\":\\\"94 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/network-metrics-daemon-fwrwc\\\\nI0313 13:58:44.575642 7494 ovn.go:134] Ensuring zone local for Pod openshift-multus/network-metrics-daemon-fwrwc in node crc\\\\nI0313 13:58:44.575664 7494 base_network_controller_pods.go:477] [default/openshift-multus/network-metrics-daemon-fwrwc] 
creating logical port openshift-multus_network-metrics-daemon-fwrwc for pod on switch crc\\\\nI0313 13:58:44.575715 7494 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-dns/dns-default\\\\\\\"}\\\\nI0313 13:58:44.575730 7494 services_controller.go:360] Finished syncing service dns-default on namespace openshift-dns for network=default : 2.852375ms\\\\nI0313 13:58:44.575713 7494 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nF0313 13:58:44.574595 7494 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:58:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qqqs5_openshift-ovn-kubernetes(e7d6afc0-d9b5-41b2-a55f-57621c300cbb)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dea58a789d57a7e73c
0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tc944\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qqqs5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.772358 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3dea03f2-846c-4fe2-91f3-67269c416048\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:58:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T13:57:02Z\\\"
,\\\"message\\\":\\\"W0313 13:57:02.050410 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0313 13:57:02.050746 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773410222 cert, and key in /tmp/serving-cert-1986146747/serving-signer.crt, /tmp/serving-cert-1986146747/serving-signer.key\\\\nI0313 13:57:02.414022 1 observer_polling.go:159] Starting file observer\\\\nW0313 13:57:02.427000 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0313 13:57:02.427106 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 13:57:02.427590 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1986146747/tls.crt::/tmp/serving-cert-1986146747/tls.key\\\\\\\"\\\\nF0313 13:57:02.783455 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T13:57:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.789204 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71c55972-53a1-4a22-85ab-c38bffbaf629\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3cc01a31ae51aefc4cc7ea6a77eea7d0885ebf0f9fdcbdfbeebfcc7e69a33755\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745
f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:56:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c4971276739c6cece5f7d6ad57da8c9e6d67d9ffe7c85e22d1d78344045c08d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c4971276739c6cece5f7d6ad57da8c9e6d67d9ffe7c85e22d1d78344045c08d1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T13:56:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T13:56:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:56:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.800710 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-xpbhb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0427af73-3ee1-4f8b-aa31-915d8ff53e94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd8b394eb4b9a2d5b8a55dfe9073e48789bb72fb2dab87f76f41c48c1714feab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ndhq7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-xpbhb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.813219 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"767eecef-3bc9-4db4-a0cb-5d9c8554c62d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a81ec4e1d92c8ebd60a1e31e39fc5a639e7561e63cdd9ec99ada92fe3896522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8568b0d4122e606a851ae23a97395b29784dd291
38349c86f59191196e7b0f56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kcf5f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T13:57:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-8k6xj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.829739 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T13:58:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:49 crc kubenswrapper[4898]: I0313 13:58:49.843443 4898 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T13:57:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5c10c6f45eca4a8ca925bec7ac016afa547a48c7c6d20852bed5f399db8351a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T13:57:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-13T13:58:49Z is after 2025-08-24T17:21:41Z" Mar 13 13:58:50 crc kubenswrapper[4898]: E0313 13:58:50.879612 4898 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 13:58:51 crc kubenswrapper[4898]: I0313 13:58:51.739180 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:51 crc kubenswrapper[4898]: I0313 13:58:51.739194 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:51 crc kubenswrapper[4898]: I0313 13:58:51.739225 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:51 crc kubenswrapper[4898]: I0313 13:58:51.739290 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:51 crc kubenswrapper[4898]: E0313 13:58:51.739413 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:51 crc kubenswrapper[4898]: E0313 13:58:51.739772 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:51 crc kubenswrapper[4898]: E0313 13:58:51.740329 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:51 crc kubenswrapper[4898]: E0313 13:58:51.740195 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:53 crc kubenswrapper[4898]: I0313 13:58:53.738962 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:53 crc kubenswrapper[4898]: I0313 13:58:53.739008 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:53 crc kubenswrapper[4898]: E0313 13:58:53.739621 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:53 crc kubenswrapper[4898]: I0313 13:58:53.739119 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:53 crc kubenswrapper[4898]: I0313 13:58:53.739064 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:53 crc kubenswrapper[4898]: E0313 13:58:53.739759 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:53 crc kubenswrapper[4898]: E0313 13:58:53.739848 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:53 crc kubenswrapper[4898]: E0313 13:58:53.740050 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.739068 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.739121 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:55 crc kubenswrapper[4898]: E0313 13:58:55.739196 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.739207 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.739254 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:55 crc kubenswrapper[4898]: E0313 13:58:55.739362 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:55 crc kubenswrapper[4898]: E0313 13:58:55.739408 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:55 crc kubenswrapper[4898]: E0313 13:58:55.739731 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.774956 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=75.774940353 podStartE2EDuration="1m15.774940353s" podCreationTimestamp="2026-03-13 13:57:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:58:55.774518494 +0000 UTC m=+170.776106753" watchObservedRunningTime="2026-03-13 13:58:55.774940353 +0000 UTC m=+170.776528592" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.793290 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=29.793268955 podStartE2EDuration="29.793268955s" podCreationTimestamp="2026-03-13 13:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:58:55.792947707 +0000 UTC m=+170.794536016" watchObservedRunningTime="2026-03-13 13:58:55.793268955 +0000 UTC m=+170.794857224" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.840065 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.840104 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.840115 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.840133 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 13:58:55 crc kubenswrapper[4898]: 
I0313 13:58:55.840144 4898 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T13:58:55Z","lastTransitionTime":"2026-03-13T13:58:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 13:58:55 crc kubenswrapper[4898]: E0313 13:58:55.881363 4898 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.895726 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-6llfs" podStartSLOduration=104.895702889 podStartE2EDuration="1m44.895702889s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:58:55.848157416 +0000 UTC m=+170.849745665" watchObservedRunningTime="2026-03-13 13:58:55.895702889 +0000 UTC m=+170.897291128" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.895973 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-fthwv"] Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.896345 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fthwv" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.898360 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.898525 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.899857 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.900344 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.914494 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fb107ebf-df14-44ee-8c21-06fd3c080f7b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-fthwv\" (UID: \"fb107ebf-df14-44ee-8c21-06fd3c080f7b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fthwv" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.914545 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb107ebf-df14-44ee-8c21-06fd3c080f7b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-fthwv\" (UID: \"fb107ebf-df14-44ee-8c21-06fd3c080f7b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fthwv" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.914622 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/fb107ebf-df14-44ee-8c21-06fd3c080f7b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-fthwv\" (UID: \"fb107ebf-df14-44ee-8c21-06fd3c080f7b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fthwv" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.914664 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fb107ebf-df14-44ee-8c21-06fd3c080f7b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-fthwv\" (UID: \"fb107ebf-df14-44ee-8c21-06fd3c080f7b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fthwv" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.914689 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/fb107ebf-df14-44ee-8c21-06fd3c080f7b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-fthwv\" (UID: \"fb107ebf-df14-44ee-8c21-06fd3c080f7b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fthwv" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.974998 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podStartSLOduration=104.97498271 podStartE2EDuration="1m44.97498271s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:58:55.97452888 +0000 UTC m=+170.976117129" watchObservedRunningTime="2026-03-13 13:58:55.97498271 +0000 UTC m=+170.976570949" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.975328 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-xpbhb" podStartSLOduration=104.975323408 
podStartE2EDuration="1m44.975323408s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:58:55.957170601 +0000 UTC m=+170.958758840" watchObservedRunningTime="2026-03-13 13:58:55.975323408 +0000 UTC m=+170.976911637" Mar 13 13:58:55 crc kubenswrapper[4898]: I0313 13:58:55.987260 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=26.987250022 podStartE2EDuration="26.987250022s" podCreationTimestamp="2026-03-13 13:58:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:58:55.987133989 +0000 UTC m=+170.988722248" watchObservedRunningTime="2026-03-13 13:58:55.987250022 +0000 UTC m=+170.988838261" Mar 13 13:58:56 crc kubenswrapper[4898]: I0313 13:58:56.015338 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb107ebf-df14-44ee-8c21-06fd3c080f7b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-fthwv\" (UID: \"fb107ebf-df14-44ee-8c21-06fd3c080f7b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fthwv" Mar 13 13:58:56 crc kubenswrapper[4898]: I0313 13:58:56.015419 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/fb107ebf-df14-44ee-8c21-06fd3c080f7b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-fthwv\" (UID: \"fb107ebf-df14-44ee-8c21-06fd3c080f7b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fthwv" Mar 13 13:58:56 crc kubenswrapper[4898]: I0313 13:58:56.015450 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/fb107ebf-df14-44ee-8c21-06fd3c080f7b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-fthwv\" (UID: \"fb107ebf-df14-44ee-8c21-06fd3c080f7b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fthwv" Mar 13 13:58:56 crc kubenswrapper[4898]: I0313 13:58:56.015477 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/fb107ebf-df14-44ee-8c21-06fd3c080f7b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-fthwv\" (UID: \"fb107ebf-df14-44ee-8c21-06fd3c080f7b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fthwv" Mar 13 13:58:56 crc kubenswrapper[4898]: I0313 13:58:56.015525 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fb107ebf-df14-44ee-8c21-06fd3c080f7b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-fthwv\" (UID: \"fb107ebf-df14-44ee-8c21-06fd3c080f7b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fthwv" Mar 13 13:58:56 crc kubenswrapper[4898]: I0313 13:58:56.015758 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/fb107ebf-df14-44ee-8c21-06fd3c080f7b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-fthwv\" (UID: \"fb107ebf-df14-44ee-8c21-06fd3c080f7b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fthwv" Mar 13 13:58:56 crc kubenswrapper[4898]: I0313 13:58:56.015885 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/fb107ebf-df14-44ee-8c21-06fd3c080f7b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-fthwv\" (UID: \"fb107ebf-df14-44ee-8c21-06fd3c080f7b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fthwv" Mar 13 13:58:56 crc 
kubenswrapper[4898]: I0313 13:58:56.016479 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fb107ebf-df14-44ee-8c21-06fd3c080f7b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-fthwv\" (UID: \"fb107ebf-df14-44ee-8c21-06fd3c080f7b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fthwv" Mar 13 13:58:56 crc kubenswrapper[4898]: I0313 13:58:56.024049 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb107ebf-df14-44ee-8c21-06fd3c080f7b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-fthwv\" (UID: \"fb107ebf-df14-44ee-8c21-06fd3c080f7b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fthwv" Mar 13 13:58:56 crc kubenswrapper[4898]: I0313 13:58:56.035019 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fb107ebf-df14-44ee-8c21-06fd3c080f7b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-fthwv\" (UID: \"fb107ebf-df14-44ee-8c21-06fd3c080f7b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fthwv" Mar 13 13:58:56 crc kubenswrapper[4898]: I0313 13:58:56.042525 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-5qb65" podStartSLOduration=105.042503462 podStartE2EDuration="1m45.042503462s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:58:56.04156458 +0000 UTC m=+171.043152859" watchObservedRunningTime="2026-03-13 13:58:56.042503462 +0000 UTC m=+171.044091721" Mar 13 13:58:56 crc kubenswrapper[4898]: I0313 13:58:56.065844 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/node-ca-b46ld" podStartSLOduration=105.065826228 podStartE2EDuration="1m45.065826228s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:58:56.055531361 +0000 UTC m=+171.057119640" watchObservedRunningTime="2026-03-13 13:58:56.065826228 +0000 UTC m=+171.067414467" Mar 13 13:58:56 crc kubenswrapper[4898]: I0313 13:58:56.076338 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wh2lt" podStartSLOduration=105.076319249 podStartE2EDuration="1m45.076319249s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:58:56.065980071 +0000 UTC m=+171.067568320" watchObservedRunningTime="2026-03-13 13:58:56.076319249 +0000 UTC m=+171.077907488" Mar 13 13:58:56 crc kubenswrapper[4898]: I0313 13:58:56.102615 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=75.102598923 podStartE2EDuration="1m15.102598923s" podCreationTimestamp="2026-03-13 13:57:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:58:56.102018099 +0000 UTC m=+171.103606348" watchObservedRunningTime="2026-03-13 13:58:56.102598923 +0000 UTC m=+171.104187162" Mar 13 13:58:56 crc kubenswrapper[4898]: I0313 13:58:56.114736 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=53.114709171 podStartE2EDuration="53.114709171s" podCreationTimestamp="2026-03-13 13:58:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:58:56.113532214 +0000 UTC m=+171.115120473" watchObservedRunningTime="2026-03-13 13:58:56.114709171 +0000 UTC m=+171.116297440" Mar 13 13:58:56 crc kubenswrapper[4898]: I0313 13:58:56.212269 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fthwv" Mar 13 13:58:56 crc kubenswrapper[4898]: W0313 13:58:56.231072 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb107ebf_df14_44ee_8c21_06fd3c080f7b.slice/crio-392fbd59682633d8443f3dc4a7913fe45ecb995750f8d1d40f2b7688c91443e0 WatchSource:0}: Error finding container 392fbd59682633d8443f3dc4a7913fe45ecb995750f8d1d40f2b7688c91443e0: Status 404 returned error can't find the container with id 392fbd59682633d8443f3dc4a7913fe45ecb995750f8d1d40f2b7688c91443e0 Mar 13 13:58:56 crc kubenswrapper[4898]: I0313 13:58:56.703576 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fthwv" event={"ID":"fb107ebf-df14-44ee-8c21-06fd3c080f7b","Type":"ContainerStarted","Data":"bb93c63d790b280fdde81d552598d5784527fe6696948be6b926c2c0aeceb7e5"} Mar 13 13:58:56 crc kubenswrapper[4898]: I0313 13:58:56.703662 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fthwv" event={"ID":"fb107ebf-df14-44ee-8c21-06fd3c080f7b","Type":"ContainerStarted","Data":"392fbd59682633d8443f3dc4a7913fe45ecb995750f8d1d40f2b7688c91443e0"} Mar 13 13:58:56 crc kubenswrapper[4898]: I0313 13:58:56.777354 4898 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 13 13:58:56 crc kubenswrapper[4898]: I0313 13:58:56.790263 4898 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 13 13:58:57 crc 
kubenswrapper[4898]: I0313 13:58:57.725699 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fthwv" podStartSLOduration=106.725677058 podStartE2EDuration="1m46.725677058s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:58:57.724644194 +0000 UTC m=+172.726232463" watchObservedRunningTime="2026-03-13 13:58:57.725677058 +0000 UTC m=+172.727265307" Mar 13 13:58:57 crc kubenswrapper[4898]: I0313 13:58:57.739124 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:57 crc kubenswrapper[4898]: I0313 13:58:57.739258 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:57 crc kubenswrapper[4898]: I0313 13:58:57.739383 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:57 crc kubenswrapper[4898]: E0313 13:58:57.739313 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:57 crc kubenswrapper[4898]: E0313 13:58:57.739540 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:57 crc kubenswrapper[4898]: E0313 13:58:57.739616 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:58:57 crc kubenswrapper[4898]: I0313 13:58:57.739738 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:57 crc kubenswrapper[4898]: E0313 13:58:57.739841 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:59 crc kubenswrapper[4898]: I0313 13:58:59.739503 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:58:59 crc kubenswrapper[4898]: I0313 13:58:59.739530 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:58:59 crc kubenswrapper[4898]: I0313 13:58:59.739586 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:58:59 crc kubenswrapper[4898]: I0313 13:58:59.739665 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:58:59 crc kubenswrapper[4898]: E0313 13:58:59.740644 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:58:59 crc kubenswrapper[4898]: E0313 13:58:59.740634 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:58:59 crc kubenswrapper[4898]: E0313 13:58:59.740706 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:58:59 crc kubenswrapper[4898]: E0313 13:58:59.740540 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:59:00 crc kubenswrapper[4898]: E0313 13:59:00.882287 4898 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 13:59:01 crc kubenswrapper[4898]: I0313 13:59:01.739573 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:59:01 crc kubenswrapper[4898]: I0313 13:59:01.739652 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:59:01 crc kubenswrapper[4898]: I0313 13:59:01.739581 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:59:01 crc kubenswrapper[4898]: E0313 13:59:01.739754 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:59:01 crc kubenswrapper[4898]: I0313 13:59:01.739830 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:59:01 crc kubenswrapper[4898]: E0313 13:59:01.740132 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:59:01 crc kubenswrapper[4898]: E0313 13:59:01.740209 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:59:01 crc kubenswrapper[4898]: E0313 13:59:01.740261 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:59:02 crc kubenswrapper[4898]: I0313 13:59:02.740345 4898 scope.go:117] "RemoveContainer" containerID="5ca8f8a8a536aca56f73dd6928361e5dd5f98f66d3bc35762461d5d87c0c3022" Mar 13 13:59:02 crc kubenswrapper[4898]: E0313 13:59:02.740636 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qqqs5_openshift-ovn-kubernetes(e7d6afc0-d9b5-41b2-a55f-57621c300cbb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" Mar 13 13:59:03 crc kubenswrapper[4898]: I0313 13:59:03.738926 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:59:03 crc kubenswrapper[4898]: I0313 13:59:03.738966 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:59:03 crc kubenswrapper[4898]: E0313 13:59:03.739150 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:59:03 crc kubenswrapper[4898]: I0313 13:59:03.739182 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:59:03 crc kubenswrapper[4898]: I0313 13:59:03.739240 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:59:03 crc kubenswrapper[4898]: E0313 13:59:03.739364 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:59:03 crc kubenswrapper[4898]: E0313 13:59:03.739728 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:59:03 crc kubenswrapper[4898]: E0313 13:59:03.739830 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:59:05 crc kubenswrapper[4898]: I0313 13:59:05.739252 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:59:05 crc kubenswrapper[4898]: I0313 13:59:05.739311 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:59:05 crc kubenswrapper[4898]: I0313 13:59:05.739311 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:59:05 crc kubenswrapper[4898]: I0313 13:59:05.739276 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:59:05 crc kubenswrapper[4898]: E0313 13:59:05.740593 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:59:05 crc kubenswrapper[4898]: E0313 13:59:05.740720 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:59:05 crc kubenswrapper[4898]: E0313 13:59:05.740840 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:59:05 crc kubenswrapper[4898]: E0313 13:59:05.740921 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:59:05 crc kubenswrapper[4898]: I0313 13:59:05.875474 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs\") pod \"network-metrics-daemon-fwrwc\" (UID: \"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\") " pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:59:05 crc kubenswrapper[4898]: E0313 13:59:05.875700 4898 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 13:59:05 crc kubenswrapper[4898]: E0313 13:59:05.875789 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs podName:9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869 nodeName:}" failed. No retries permitted until 2026-03-13 14:00:09.875767585 +0000 UTC m=+244.877355834 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs") pod "network-metrics-daemon-fwrwc" (UID: "9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 13:59:05 crc kubenswrapper[4898]: E0313 13:59:05.882956 4898 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 13:59:07 crc kubenswrapper[4898]: I0313 13:59:07.738895 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:59:07 crc kubenswrapper[4898]: I0313 13:59:07.739037 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:59:07 crc kubenswrapper[4898]: I0313 13:59:07.738937 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:59:07 crc kubenswrapper[4898]: E0313 13:59:07.739142 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:59:07 crc kubenswrapper[4898]: I0313 13:59:07.739040 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:59:07 crc kubenswrapper[4898]: E0313 13:59:07.739252 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:59:07 crc kubenswrapper[4898]: E0313 13:59:07.739382 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:59:07 crc kubenswrapper[4898]: E0313 13:59:07.739511 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:59:09 crc kubenswrapper[4898]: I0313 13:59:09.739116 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:59:09 crc kubenswrapper[4898]: I0313 13:59:09.739213 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:59:09 crc kubenswrapper[4898]: I0313 13:59:09.739242 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:59:09 crc kubenswrapper[4898]: E0313 13:59:09.739358 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:59:09 crc kubenswrapper[4898]: I0313 13:59:09.739385 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:59:09 crc kubenswrapper[4898]: E0313 13:59:09.739517 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:59:09 crc kubenswrapper[4898]: E0313 13:59:09.739678 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:59:09 crc kubenswrapper[4898]: E0313 13:59:09.739786 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:59:10 crc kubenswrapper[4898]: E0313 13:59:10.884762 4898 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 13:59:11 crc kubenswrapper[4898]: I0313 13:59:11.739444 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:59:11 crc kubenswrapper[4898]: I0313 13:59:11.739641 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:59:11 crc kubenswrapper[4898]: E0313 13:59:11.739818 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:59:11 crc kubenswrapper[4898]: I0313 13:59:11.739867 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:59:11 crc kubenswrapper[4898]: I0313 13:59:11.739846 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:59:11 crc kubenswrapper[4898]: E0313 13:59:11.740000 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:59:11 crc kubenswrapper[4898]: E0313 13:59:11.740163 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:59:11 crc kubenswrapper[4898]: E0313 13:59:11.740340 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:59:13 crc kubenswrapper[4898]: I0313 13:59:13.739303 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:59:13 crc kubenswrapper[4898]: I0313 13:59:13.739418 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:59:13 crc kubenswrapper[4898]: I0313 13:59:13.739303 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:59:13 crc kubenswrapper[4898]: E0313 13:59:13.739501 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:59:13 crc kubenswrapper[4898]: E0313 13:59:13.740071 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:59:13 crc kubenswrapper[4898]: I0313 13:59:13.740087 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:59:13 crc kubenswrapper[4898]: E0313 13:59:13.740259 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:59:13 crc kubenswrapper[4898]: E0313 13:59:13.740396 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:59:13 crc kubenswrapper[4898]: I0313 13:59:13.740619 4898 scope.go:117] "RemoveContainer" containerID="5ca8f8a8a536aca56f73dd6928361e5dd5f98f66d3bc35762461d5d87c0c3022" Mar 13 13:59:13 crc kubenswrapper[4898]: E0313 13:59:13.740882 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qqqs5_openshift-ovn-kubernetes(e7d6afc0-d9b5-41b2-a55f-57621c300cbb)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" Mar 13 13:59:15 crc kubenswrapper[4898]: I0313 13:59:15.738582 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:59:15 crc kubenswrapper[4898]: E0313 13:59:15.740451 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:59:15 crc kubenswrapper[4898]: I0313 13:59:15.740479 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:59:15 crc kubenswrapper[4898]: I0313 13:59:15.740544 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:59:15 crc kubenswrapper[4898]: I0313 13:59:15.740572 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:59:15 crc kubenswrapper[4898]: E0313 13:59:15.740955 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:59:15 crc kubenswrapper[4898]: E0313 13:59:15.741189 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:59:15 crc kubenswrapper[4898]: E0313 13:59:15.741372 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:59:15 crc kubenswrapper[4898]: E0313 13:59:15.885571 4898 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 13:59:17 crc kubenswrapper[4898]: I0313 13:59:17.739316 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:59:17 crc kubenswrapper[4898]: I0313 13:59:17.739350 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:59:17 crc kubenswrapper[4898]: I0313 13:59:17.739436 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:59:17 crc kubenswrapper[4898]: E0313 13:59:17.739625 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:59:17 crc kubenswrapper[4898]: I0313 13:59:17.739648 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:59:17 crc kubenswrapper[4898]: E0313 13:59:17.739760 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:59:17 crc kubenswrapper[4898]: E0313 13:59:17.739984 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:59:17 crc kubenswrapper[4898]: E0313 13:59:17.740051 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:59:19 crc kubenswrapper[4898]: I0313 13:59:19.738732 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:59:19 crc kubenswrapper[4898]: I0313 13:59:19.738827 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:59:19 crc kubenswrapper[4898]: E0313 13:59:19.738987 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:59:19 crc kubenswrapper[4898]: I0313 13:59:19.739081 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:59:19 crc kubenswrapper[4898]: E0313 13:59:19.739246 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:59:19 crc kubenswrapper[4898]: E0313 13:59:19.739316 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:59:19 crc kubenswrapper[4898]: I0313 13:59:19.740022 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:59:19 crc kubenswrapper[4898]: E0313 13:59:19.740234 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:59:20 crc kubenswrapper[4898]: E0313 13:59:20.887406 4898 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 13:59:21 crc kubenswrapper[4898]: I0313 13:59:21.738825 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:59:21 crc kubenswrapper[4898]: I0313 13:59:21.738983 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:59:21 crc kubenswrapper[4898]: I0313 13:59:21.739069 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:59:21 crc kubenswrapper[4898]: E0313 13:59:21.739005 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:59:21 crc kubenswrapper[4898]: I0313 13:59:21.739100 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:59:21 crc kubenswrapper[4898]: E0313 13:59:21.739177 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:59:21 crc kubenswrapper[4898]: E0313 13:59:21.739254 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:59:21 crc kubenswrapper[4898]: E0313 13:59:21.739313 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:59:23 crc kubenswrapper[4898]: I0313 13:59:23.738526 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:59:23 crc kubenswrapper[4898]: I0313 13:59:23.738601 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:59:23 crc kubenswrapper[4898]: I0313 13:59:23.738645 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:59:23 crc kubenswrapper[4898]: I0313 13:59:23.738704 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:59:23 crc kubenswrapper[4898]: E0313 13:59:23.739302 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:59:23 crc kubenswrapper[4898]: E0313 13:59:23.739475 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:59:23 crc kubenswrapper[4898]: E0313 13:59:23.739588 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:59:23 crc kubenswrapper[4898]: E0313 13:59:23.739752 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:59:23 crc kubenswrapper[4898]: I0313 13:59:23.794179 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6llfs_e521c857-9711-4f68-886f-38b233d7b05b/kube-multus/1.log" Mar 13 13:59:23 crc kubenswrapper[4898]: I0313 13:59:23.795068 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6llfs_e521c857-9711-4f68-886f-38b233d7b05b/kube-multus/0.log" Mar 13 13:59:23 crc kubenswrapper[4898]: I0313 13:59:23.795137 4898 generic.go:334] "Generic (PLEG): container finished" podID="e521c857-9711-4f68-886f-38b233d7b05b" containerID="04f48cdfeeb82223cb0cab3fb50d3338225f39b1d78eadc3c18a46350ae28770" exitCode=1 Mar 13 13:59:23 crc kubenswrapper[4898]: I0313 13:59:23.795186 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6llfs" event={"ID":"e521c857-9711-4f68-886f-38b233d7b05b","Type":"ContainerDied","Data":"04f48cdfeeb82223cb0cab3fb50d3338225f39b1d78eadc3c18a46350ae28770"} Mar 13 13:59:23 crc kubenswrapper[4898]: I0313 13:59:23.795263 4898 scope.go:117] "RemoveContainer" containerID="de3eb6944d1978b4471d9968724d0312047079b10ae547df513246554e8a705f" Mar 13 13:59:23 crc kubenswrapper[4898]: I0313 13:59:23.795681 4898 scope.go:117] "RemoveContainer" containerID="04f48cdfeeb82223cb0cab3fb50d3338225f39b1d78eadc3c18a46350ae28770" Mar 13 13:59:23 crc 
kubenswrapper[4898]: E0313 13:59:23.795899 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-6llfs_openshift-multus(e521c857-9711-4f68-886f-38b233d7b05b)\"" pod="openshift-multus/multus-6llfs" podUID="e521c857-9711-4f68-886f-38b233d7b05b" Mar 13 13:59:24 crc kubenswrapper[4898]: I0313 13:59:24.801689 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6llfs_e521c857-9711-4f68-886f-38b233d7b05b/kube-multus/1.log" Mar 13 13:59:25 crc kubenswrapper[4898]: I0313 13:59:25.739480 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:59:25 crc kubenswrapper[4898]: I0313 13:59:25.739567 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:59:25 crc kubenswrapper[4898]: E0313 13:59:25.739655 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:59:25 crc kubenswrapper[4898]: I0313 13:59:25.739668 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:59:25 crc kubenswrapper[4898]: I0313 13:59:25.739727 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:59:25 crc kubenswrapper[4898]: E0313 13:59:25.739972 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:59:25 crc kubenswrapper[4898]: E0313 13:59:25.740093 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:59:25 crc kubenswrapper[4898]: E0313 13:59:25.742025 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:59:25 crc kubenswrapper[4898]: E0313 13:59:25.888099 4898 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 13:59:27 crc kubenswrapper[4898]: I0313 13:59:27.738938 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:59:27 crc kubenswrapper[4898]: I0313 13:59:27.739001 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:59:27 crc kubenswrapper[4898]: I0313 13:59:27.739168 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:59:27 crc kubenswrapper[4898]: E0313 13:59:27.739318 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:59:27 crc kubenswrapper[4898]: I0313 13:59:27.739365 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:59:27 crc kubenswrapper[4898]: E0313 13:59:27.739630 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:59:27 crc kubenswrapper[4898]: E0313 13:59:27.739738 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:59:27 crc kubenswrapper[4898]: E0313 13:59:27.740294 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:59:27 crc kubenswrapper[4898]: I0313 13:59:27.741050 4898 scope.go:117] "RemoveContainer" containerID="5ca8f8a8a536aca56f73dd6928361e5dd5f98f66d3bc35762461d5d87c0c3022" Mar 13 13:59:28 crc kubenswrapper[4898]: I0313 13:59:28.599043 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fwrwc"] Mar 13 13:59:28 crc kubenswrapper[4898]: I0313 13:59:28.599144 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:59:28 crc kubenswrapper[4898]: E0313 13:59:28.599239 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:59:28 crc kubenswrapper[4898]: I0313 13:59:28.817287 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qqqs5_e7d6afc0-d9b5-41b2-a55f-57621c300cbb/ovnkube-controller/3.log" Mar 13 13:59:28 crc kubenswrapper[4898]: I0313 13:59:28.820953 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerStarted","Data":"16ce106ae8a28f129efde86037be3a1f3a7bbf53e4df0b306b61a11eff910aed"} Mar 13 13:59:28 crc kubenswrapper[4898]: I0313 13:59:28.821456 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:59:29 crc kubenswrapper[4898]: I0313 13:59:29.738793 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:59:29 crc kubenswrapper[4898]: I0313 13:59:29.738859 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:59:29 crc kubenswrapper[4898]: E0313 13:59:29.738957 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:59:29 crc kubenswrapper[4898]: I0313 13:59:29.738996 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:59:29 crc kubenswrapper[4898]: E0313 13:59:29.739077 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:59:29 crc kubenswrapper[4898]: I0313 13:59:29.739138 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:59:29 crc kubenswrapper[4898]: E0313 13:59:29.739152 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:59:29 crc kubenswrapper[4898]: E0313 13:59:29.739347 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:59:30 crc kubenswrapper[4898]: E0313 13:59:30.890480 4898 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Mar 13 13:59:31 crc kubenswrapper[4898]: I0313 13:59:31.739415 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:59:31 crc kubenswrapper[4898]: I0313 13:59:31.739456 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:59:31 crc kubenswrapper[4898]: I0313 13:59:31.739573 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:59:31 crc kubenswrapper[4898]: E0313 13:59:31.739731 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:59:31 crc kubenswrapper[4898]: I0313 13:59:31.739772 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:59:31 crc kubenswrapper[4898]: E0313 13:59:31.739978 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:59:31 crc kubenswrapper[4898]: E0313 13:59:31.740089 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:59:31 crc kubenswrapper[4898]: E0313 13:59:31.740292 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:59:33 crc kubenswrapper[4898]: I0313 13:59:33.739539 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:59:33 crc kubenswrapper[4898]: I0313 13:59:33.739563 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:59:33 crc kubenswrapper[4898]: I0313 13:59:33.739892 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:59:33 crc kubenswrapper[4898]: I0313 13:59:33.739895 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:59:33 crc kubenswrapper[4898]: E0313 13:59:33.740087 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:59:33 crc kubenswrapper[4898]: E0313 13:59:33.740266 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:59:33 crc kubenswrapper[4898]: E0313 13:59:33.740433 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:59:33 crc kubenswrapper[4898]: E0313 13:59:33.740542 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:59:34 crc kubenswrapper[4898]: I0313 13:59:34.739695 4898 scope.go:117] "RemoveContainer" containerID="04f48cdfeeb82223cb0cab3fb50d3338225f39b1d78eadc3c18a46350ae28770" Mar 13 13:59:34 crc kubenswrapper[4898]: I0313 13:59:34.767157 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" podStartSLOduration=143.767130112 podStartE2EDuration="2m23.767130112s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:28.85283923 +0000 UTC m=+203.854427499" watchObservedRunningTime="2026-03-13 13:59:34.767130112 +0000 UTC m=+209.768718391" Mar 13 13:59:35 crc kubenswrapper[4898]: I0313 13:59:35.738777 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:59:35 crc kubenswrapper[4898]: I0313 13:59:35.738850 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:59:35 crc kubenswrapper[4898]: I0313 13:59:35.738937 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:59:35 crc kubenswrapper[4898]: E0313 13:59:35.739041 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:59:35 crc kubenswrapper[4898]: I0313 13:59:35.739058 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:59:35 crc kubenswrapper[4898]: E0313 13:59:35.741883 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:59:35 crc kubenswrapper[4898]: E0313 13:59:35.742069 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:59:35 crc kubenswrapper[4898]: E0313 13:59:35.742099 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:59:35 crc kubenswrapper[4898]: I0313 13:59:35.851652 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6llfs_e521c857-9711-4f68-886f-38b233d7b05b/kube-multus/1.log" Mar 13 13:59:35 crc kubenswrapper[4898]: I0313 13:59:35.851731 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6llfs" event={"ID":"e521c857-9711-4f68-886f-38b233d7b05b","Type":"ContainerStarted","Data":"725f30c48676665ebc628a8b35e81161dc13d717e27cad14806022f5ad267e0e"} Mar 13 13:59:35 crc kubenswrapper[4898]: E0313 13:59:35.891265 4898 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 13:59:37 crc kubenswrapper[4898]: I0313 13:59:37.738492 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:59:37 crc kubenswrapper[4898]: I0313 13:59:37.738546 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:59:37 crc kubenswrapper[4898]: I0313 13:59:37.738582 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:59:37 crc kubenswrapper[4898]: I0313 13:59:37.738648 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:59:37 crc kubenswrapper[4898]: E0313 13:59:37.739120 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:59:37 crc kubenswrapper[4898]: E0313 13:59:37.739240 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:59:37 crc kubenswrapper[4898]: E0313 13:59:37.739394 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:59:37 crc kubenswrapper[4898]: E0313 13:59:37.739568 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:59:39 crc kubenswrapper[4898]: I0313 13:59:39.739576 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:59:39 crc kubenswrapper[4898]: I0313 13:59:39.739595 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:59:39 crc kubenswrapper[4898]: I0313 13:59:39.739700 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:59:39 crc kubenswrapper[4898]: E0313 13:59:39.739712 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 13:59:39 crc kubenswrapper[4898]: E0313 13:59:39.739863 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 13:59:39 crc kubenswrapper[4898]: E0313 13:59:39.739950 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fwrwc" podUID="9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869" Mar 13 13:59:39 crc kubenswrapper[4898]: I0313 13:59:39.740087 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:59:39 crc kubenswrapper[4898]: E0313 13:59:39.740150 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 13:59:41 crc kubenswrapper[4898]: I0313 13:59:41.738593 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:59:41 crc kubenswrapper[4898]: I0313 13:59:41.738667 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc" Mar 13 13:59:41 crc kubenswrapper[4898]: I0313 13:59:41.738686 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:59:41 crc kubenswrapper[4898]: I0313 13:59:41.738694 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:59:41 crc kubenswrapper[4898]: I0313 13:59:41.741614 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 13 13:59:41 crc kubenswrapper[4898]: I0313 13:59:41.742186 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 13 13:59:41 crc kubenswrapper[4898]: I0313 13:59:41.742270 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 13 13:59:41 crc kubenswrapper[4898]: I0313 13:59:41.742290 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 13 13:59:41 crc kubenswrapper[4898]: I0313 13:59:41.742357 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 13 13:59:41 crc kubenswrapper[4898]: I0313 13:59:41.745503 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.217813 4898 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.267888 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-v9lxv"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.268390 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-9vdwm"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.268949 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9vdwm" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.269472 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.279416 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-whtgq"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.280153 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-whtgq" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.282590 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rx7x"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.283820 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rx7x" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.284498 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-cx59b"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.285291 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-cx59b" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.288649 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5jp6r"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.289408 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5jp6r" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.292565 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.292838 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.293066 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.293970 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t4j9v"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.294747 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t4j9v" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.310322 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.310689 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.311140 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.311563 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-7l2pm"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.311748 4898 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"authentication-operator-config" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.312097 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.312272 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.312370 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.313723 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.313464 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.324776 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.325386 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.325547 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.325674 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.325577 4898 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"service-ca-bundle" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.325770 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.325821 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.325929 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.326134 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.326709 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.328110 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.328601 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.329088 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.329245 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.329376 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.329591 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.329777 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.329871 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.330055 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.330473 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.330508 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.330608 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" 
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.330677 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pvbpt"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.330696 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.331023 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.333122 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.334456 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-6plhg"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.334962 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.335249 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.335473 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.335708 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-6plhg" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.339773 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-vk7r4"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.340412 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vk7r4" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.348207 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.348436 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xxjrs"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.348757 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.348827 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-kgnxj"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.349120 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-kgnxj" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.349188 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xxjrs" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.350104 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/ade74420-c7a1-4b89-b6c8-7970d7b6c17c-machine-approver-tls\") pod \"machine-approver-56656f9798-9vdwm\" (UID: \"ade74420-c7a1-4b89-b6c8-7970d7b6c17c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9vdwm" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.350137 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f12557e-02f5-4445-988f-b19f16672e3b-config\") pod \"authentication-operator-69f744f599-v9lxv\" (UID: \"6f12557e-02f5-4445-988f-b19f16672e3b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.350163 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/096d3786-85e8-4fe5-82b3-57cd1be251a1-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-whtgq\" (UID: \"096d3786-85e8-4fe5-82b3-57cd1be251a1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-whtgq" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.350190 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/096d3786-85e8-4fe5-82b3-57cd1be251a1-images\") pod \"machine-api-operator-5694c8668f-whtgq\" (UID: \"096d3786-85e8-4fe5-82b3-57cd1be251a1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-whtgq" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.350211 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4cvb\" (UniqueName: \"kubernetes.io/projected/ade74420-c7a1-4b89-b6c8-7970d7b6c17c-kube-api-access-j4cvb\") pod \"machine-approver-56656f9798-9vdwm\" (UID: \"ade74420-c7a1-4b89-b6c8-7970d7b6c17c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9vdwm" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.350236 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f12557e-02f5-4445-988f-b19f16672e3b-serving-cert\") pod \"authentication-operator-69f744f599-v9lxv\" (UID: \"6f12557e-02f5-4445-988f-b19f16672e3b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.350255 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f12557e-02f5-4445-988f-b19f16672e3b-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-v9lxv\" (UID: \"6f12557e-02f5-4445-988f-b19f16672e3b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.350305 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ade74420-c7a1-4b89-b6c8-7970d7b6c17c-config\") pod \"machine-approver-56656f9798-9vdwm\" (UID: \"ade74420-c7a1-4b89-b6c8-7970d7b6c17c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9vdwm" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.350334 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ade74420-c7a1-4b89-b6c8-7970d7b6c17c-auth-proxy-config\") pod 
\"machine-approver-56656f9798-9vdwm\" (UID: \"ade74420-c7a1-4b89-b6c8-7970d7b6c17c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9vdwm" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.350355 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a2ea84ca-5ca6-432a-aa7d-c6350e0e52e8-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8rx7x\" (UID: \"a2ea84ca-5ca6-432a-aa7d-c6350e0e52e8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rx7x" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.350376 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f12557e-02f5-4445-988f-b19f16672e3b-service-ca-bundle\") pod \"authentication-operator-69f744f599-v9lxv\" (UID: \"6f12557e-02f5-4445-988f-b19f16672e3b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.350400 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4zh6\" (UniqueName: \"kubernetes.io/projected/a2ea84ca-5ca6-432a-aa7d-c6350e0e52e8-kube-api-access-x4zh6\") pod \"cluster-samples-operator-665b6dd947-8rx7x\" (UID: \"a2ea84ca-5ca6-432a-aa7d-c6350e0e52e8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rx7x" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.350431 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75725\" (UniqueName: \"kubernetes.io/projected/096d3786-85e8-4fe5-82b3-57cd1be251a1-kube-api-access-75725\") pod \"machine-api-operator-5694c8668f-whtgq\" (UID: \"096d3786-85e8-4fe5-82b3-57cd1be251a1\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-whtgq" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.350453 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/096d3786-85e8-4fe5-82b3-57cd1be251a1-config\") pod \"machine-api-operator-5694c8668f-whtgq\" (UID: \"096d3786-85e8-4fe5-82b3-57cd1be251a1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-whtgq" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.350475 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjc96\" (UniqueName: \"kubernetes.io/projected/6f12557e-02f5-4445-988f-b19f16672e3b-kube-api-access-pjc96\") pod \"authentication-operator-69f744f599-v9lxv\" (UID: \"6f12557e-02f5-4445-988f-b19f16672e3b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.352487 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.352625 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.352917 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.353046 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.353155 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.353288 4898 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.353384 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.353671 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.353776 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.353801 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.353851 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.353884 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.353934 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.353941 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.353983 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.354012 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.354140 4898 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.354175 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.354384 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.354832 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.354957 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.354976 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.355071 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.355194 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.355322 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.355421 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.355533 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.355541 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.355570 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.355604 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.355634 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.355660 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.355700 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.355733 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.355819 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.356051 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.366521 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-t2s2h"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.367000 4898 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-z5vf2"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.367635 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.367682 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-m2ntx"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.368220 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.368408 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-m2ntx" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.371022 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6n228"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.371586 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.372780 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.373022 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.373045 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.373172 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.373465 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.374097 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.374376 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.374388 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.374468 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.374494 4898 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-route-controller-manager"/"config" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.374503 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.374568 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.374569 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-djn5q"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.379423 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.380090 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.381884 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.382159 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.382194 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.382162 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.382452 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.382803 4898 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.383039 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.383052 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.383197 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.383285 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.384192 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.384275 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.384588 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.385806 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.386943 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lwzpn"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.387026 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.396743 
4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.398842 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjx7v"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.399280 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjx7v"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.399734 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lwzpn"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.399810 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xwdnn"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.400290 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xwdnn"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.401501 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-c6rz7"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.401650 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.402211 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c6rz7"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.403306 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.403487 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.405491 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qt7gm"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.404209 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.404385 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.406373 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qt7gm"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.408523 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-nqrmd"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.409191 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nqrmd"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.410515 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-98qz7"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.411791 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.412578 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p8r99"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.412891 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-98qz7"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.415220 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-xglpf"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.415330 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-p8r99"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.416578 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.416684 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-xglpf"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.417453 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x4nxn"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.418805 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.419712 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.419877 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x4nxn"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.420302 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.420382 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.420969 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-z5r8j"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.421048 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.421460 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-cvbms"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.421541 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-z5r8j"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.422471 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mrr5j"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.422786 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-7l2pm"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.422807 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556838-h7pkr"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.422961 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cvbms"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.423236 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mrr5j"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.423624 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556838-h7pkr"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.423892 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.424304 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.424731 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rx7x"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.425617 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t4j9v"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.426521 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-whtgq"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.427492 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-v9lxv"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.428480 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-rd22p"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.429653 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-rd22p"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.431255 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-kgnxj"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.431407 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.433087 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-vk7r4"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.434679 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-t2s2h"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.435779 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xxjrs"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.438954 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-cx59b"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.441991 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-c6rz7"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.443422 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjx7v"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.444708 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pvbpt"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.445753 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.446792 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.448285 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lwzpn"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.449591 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.450541 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw"]
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.450974 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/096d3786-85e8-4fe5-82b3-57cd1be251a1-images\") pod \"machine-api-operator-5694c8668f-whtgq\" (UID: \"096d3786-85e8-4fe5-82b3-57cd1be251a1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-whtgq"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451001 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/096d3786-85e8-4fe5-82b3-57cd1be251a1-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-whtgq\" (UID: \"096d3786-85e8-4fe5-82b3-57cd1be251a1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-whtgq"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451022 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f446713d-03e3-461f-989f-eb6bdef32b30-serving-cert\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451039 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42j8n\" (UniqueName: \"kubernetes.io/projected/a402522c-e891-477d-a2cc-5aa7c6944e06-kube-api-access-42j8n\") pod \"openshift-apiserver-operator-796bbdcf4f-5jp6r\" (UID: \"a402522c-e891-477d-a2cc-5aa7c6944e06\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5jp6r"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451057 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4cvb\" (UniqueName: \"kubernetes.io/projected/ade74420-c7a1-4b89-b6c8-7970d7b6c17c-kube-api-access-j4cvb\") pod \"machine-approver-56656f9798-9vdwm\" (UID: \"ade74420-c7a1-4b89-b6c8-7970d7b6c17c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9vdwm"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451075 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451090 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9f5a2d7c-1d38-4e82-89e0-039d0f515ac6-etcd-service-ca\") pod \"etcd-operator-b45778765-t2s2h\" (UID: \"9f5a2d7c-1d38-4e82-89e0-039d0f515ac6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451104 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fp6m\" (UniqueName: \"kubernetes.io/projected/1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8-kube-api-access-8fp6m\") pod \"ingress-operator-5b745b69d9-vk7r4\" (UID: \"1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vk7r4"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451120 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451137 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f446713d-03e3-461f-989f-eb6bdef32b30-node-pullsecrets\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451152 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqfg5\" (UniqueName: \"kubernetes.io/projected/eedd2260-f339-4e2f-83e8-13a56cee2ce6-kube-api-access-gqfg5\") pod \"console-operator-58897d9998-kgnxj\" (UID: \"eedd2260-f339-4e2f-83e8-13a56cee2ce6\") " pod="openshift-console-operator/console-operator-58897d9998-kgnxj"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451166 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451186 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f12557e-02f5-4445-988f-b19f16672e3b-serving-cert\") pod \"authentication-operator-69f744f599-v9lxv\" (UID: \"6f12557e-02f5-4445-988f-b19f16672e3b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451200 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksjvx\" (UniqueName: \"kubernetes.io/projected/9f5a2d7c-1d38-4e82-89e0-039d0f515ac6-kube-api-access-ksjvx\") pod \"etcd-operator-b45778765-t2s2h\" (UID: \"9f5a2d7c-1d38-4e82-89e0-039d0f515ac6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451214 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a402522c-e891-477d-a2cc-5aa7c6944e06-config\") pod \"openshift-apiserver-operator-796bbdcf4f-5jp6r\" (UID: \"a402522c-e891-477d-a2cc-5aa7c6944e06\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5jp6r"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451229 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f12557e-02f5-4445-988f-b19f16672e3b-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-v9lxv\" (UID: \"6f12557e-02f5-4445-988f-b19f16672e3b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451244 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451257 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9f5a2d7c-1d38-4e82-89e0-039d0f515ac6-etcd-ca\") pod \"etcd-operator-b45778765-t2s2h\" (UID: \"9f5a2d7c-1d38-4e82-89e0-039d0f515ac6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451273 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451287 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f446713d-03e3-461f-989f-eb6bdef32b30-audit\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451310 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a402522c-e891-477d-a2cc-5aa7c6944e06-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-5jp6r\" (UID: \"a402522c-e891-477d-a2cc-5aa7c6944e06\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5jp6r"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451325 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1607f924-1e24-4848-b811-21ac3a7f8999-client-ca\") pod \"route-controller-manager-6576b87f9c-gwvk4\" (UID: \"1607f924-1e24-4848-b811-21ac3a7f8999\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451342 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/18e5c8bf-9fe0-465e-af8f-9e7ec7400be8-default-certificate\") pod \"router-default-5444994796-6plhg\" (UID: \"18e5c8bf-9fe0-465e-af8f-9e7ec7400be8\") " pod="openshift-ingress/router-default-5444994796-6plhg"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451364 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b26a4d77-f170-467e-ad96-4741cc5a8f23-audit-dir\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451385 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ade74420-c7a1-4b89-b6c8-7970d7b6c17c-config\") pod \"machine-approver-56656f9798-9vdwm\" (UID: \"ade74420-c7a1-4b89-b6c8-7970d7b6c17c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9vdwm"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451402 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8w4m\" (UniqueName: \"kubernetes.io/projected/cfd8810f-79f1-4634-9e4d-245348fba016-kube-api-access-h8w4m\") pod \"openshift-controller-manager-operator-756b6f6bc6-xxjrs\" (UID: \"cfd8810f-79f1-4634-9e4d-245348fba016\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xxjrs"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451424 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451440 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ade74420-c7a1-4b89-b6c8-7970d7b6c17c-auth-proxy-config\") pod \"machine-approver-56656f9798-9vdwm\" (UID: \"ade74420-c7a1-4b89-b6c8-7970d7b6c17c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9vdwm"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451457 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfd8810f-79f1-4634-9e4d-245348fba016-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-xxjrs\" (UID: \"cfd8810f-79f1-4634-9e4d-245348fba016\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xxjrs"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451475 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f446713d-03e3-461f-989f-eb6bdef32b30-encryption-config\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451492 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a2ea84ca-5ca6-432a-aa7d-c6350e0e52e8-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8rx7x\" (UID: \"a2ea84ca-5ca6-432a-aa7d-c6350e0e52e8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rx7x"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451508 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-audit-policies\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451524 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f446713d-03e3-461f-989f-eb6bdef32b30-trusted-ca-bundle\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451541 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18e5c8bf-9fe0-465e-af8f-9e7ec7400be8-service-ca-bundle\") pod \"router-default-5444994796-6plhg\" (UID: \"18e5c8bf-9fe0-465e-af8f-9e7ec7400be8\") " pod="openshift-ingress/router-default-5444994796-6plhg"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451557 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzr6p\" (UniqueName: \"kubernetes.io/projected/18e5c8bf-9fe0-465e-af8f-9e7ec7400be8-kube-api-access-lzr6p\") pod \"router-default-5444994796-6plhg\" (UID: \"18e5c8bf-9fe0-465e-af8f-9e7ec7400be8\") " pod="openshift-ingress/router-default-5444994796-6plhg"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451575 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/22f99dde-8f14-4e43-af7d-fe6e5ec2a908-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-t4j9v\" (UID: \"22f99dde-8f14-4e43-af7d-fe6e5ec2a908\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t4j9v"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451613 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzgzk\" (UniqueName: \"kubernetes.io/projected/b26a4d77-f170-467e-ad96-4741cc5a8f23-kube-api-access-vzgzk\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451632 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f446713d-03e3-461f-989f-eb6bdef32b30-config\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451649 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkqfk\" (UniqueName: \"kubernetes.io/projected/1607f924-1e24-4848-b811-21ac3a7f8999-kube-api-access-bkqfk\") pod \"route-controller-manager-6576b87f9c-gwvk4\" (UID: \"1607f924-1e24-4848-b811-21ac3a7f8999\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451663 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/22f99dde-8f14-4e43-af7d-fe6e5ec2a908-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-t4j9v\" (UID: \"22f99dde-8f14-4e43-af7d-fe6e5ec2a908\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t4j9v"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451679 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f12557e-02f5-4445-988f-b19f16672e3b-service-ca-bundle\") pod \"authentication-operator-69f744f599-v9lxv\" (UID: \"6f12557e-02f5-4445-988f-b19f16672e3b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451695 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9f5a2d7c-1d38-4e82-89e0-039d0f515ac6-etcd-client\") pod \"etcd-operator-b45778765-t2s2h\" (UID: \"9f5a2d7c-1d38-4e82-89e0-039d0f515ac6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451709 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1607f924-1e24-4848-b811-21ac3a7f8999-serving-cert\") pod \"route-controller-manager-6576b87f9c-gwvk4\" (UID: \"1607f924-1e24-4848-b811-21ac3a7f8999\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451724 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4zh6\" (UniqueName: \"kubernetes.io/projected/a2ea84ca-5ca6-432a-aa7d-c6350e0e52e8-kube-api-access-x4zh6\") pod \"cluster-samples-operator-665b6dd947-8rx7x\" (UID: \"a2ea84ca-5ca6-432a-aa7d-c6350e0e52e8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rx7x"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451740 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f5a2d7c-1d38-4e82-89e0-039d0f515ac6-serving-cert\") pod \"etcd-operator-b45778765-t2s2h\" (UID: \"9f5a2d7c-1d38-4e82-89e0-039d0f515ac6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451756 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451772 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/071d8651-2a2d-4eed-9023-cfe636be09a0-metrics-tls\") pod \"dns-operator-744455d44c-m2ntx\" (UID: \"071d8651-2a2d-4eed-9023-cfe636be09a0\") " pod="openshift-dns-operator/dns-operator-744455d44c-m2ntx"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451785 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/18e5c8bf-9fe0-465e-af8f-9e7ec7400be8-stats-auth\") pod \"router-default-5444994796-6plhg\" (UID: \"18e5c8bf-9fe0-465e-af8f-9e7ec7400be8\") " pod="openshift-ingress/router-default-5444994796-6plhg"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451806 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eedd2260-f339-4e2f-83e8-13a56cee2ce6-config\") pod \"console-operator-58897d9998-kgnxj\" (UID: \"eedd2260-f339-4e2f-83e8-13a56cee2ce6\") " pod="openshift-console-operator/console-operator-58897d9998-kgnxj"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451821 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75725\" (UniqueName: \"kubernetes.io/projected/096d3786-85e8-4fe5-82b3-57cd1be251a1-kube-api-access-75725\") pod \"machine-api-operator-5694c8668f-whtgq\" (UID: \"096d3786-85e8-4fe5-82b3-57cd1be251a1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-whtgq"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451837 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451852 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgrh2\" (UniqueName: \"kubernetes.io/projected/071d8651-2a2d-4eed-9023-cfe636be09a0-kube-api-access-kgrh2\") pod \"dns-operator-744455d44c-m2ntx\" (UID: \"071d8651-2a2d-4eed-9023-cfe636be09a0\") " pod="openshift-dns-operator/dns-operator-744455d44c-m2ntx"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451869 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451883 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6bvm\" (UniqueName: \"kubernetes.io/projected/f446713d-03e3-461f-989f-eb6bdef32b30-kube-api-access-r6bvm\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451912 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/18e5c8bf-9fe0-465e-af8f-9e7ec7400be8-metrics-certs\") pod \"router-default-5444994796-6plhg\" (UID: \"18e5c8bf-9fe0-465e-af8f-9e7ec7400be8\") " pod="openshift-ingress/router-default-5444994796-6plhg"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451927 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-vk7r4\" (UID: \"1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vk7r4"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451942 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/096d3786-85e8-4fe5-82b3-57cd1be251a1-config\") pod \"machine-api-operator-5694c8668f-whtgq\" (UID: \"096d3786-85e8-4fe5-82b3-57cd1be251a1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-whtgq"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451958 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eedd2260-f339-4e2f-83e8-13a56cee2ce6-trusted-ca\") pod \"console-operator-58897d9998-kgnxj\" (UID: \"eedd2260-f339-4e2f-83e8-13a56cee2ce6\") " pod="openshift-console-operator/console-operator-58897d9998-kgnxj"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451974 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f446713d-03e3-461f-989f-eb6bdef32b30-etcd-client\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.451987 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f5a2d7c-1d38-4e82-89e0-039d0f515ac6-config\") pod \"etcd-operator-b45778765-t2s2h\" (UID: \"9f5a2d7c-1d38-4e82-89e0-039d0f515ac6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.452004 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjc96\" (UniqueName: \"kubernetes.io/projected/6f12557e-02f5-4445-988f-b19f16672e3b-kube-api-access-pjc96\") pod \"authentication-operator-69f744f599-v9lxv\" (UID: \"6f12557e-02f5-4445-988f-b19f16672e3b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.452019 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfd8810f-79f1-4634-9e4d-245348fba016-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-xxjrs\" (UID: \"cfd8810f-79f1-4634-9e4d-245348fba016\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xxjrs"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.452033 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.452048 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8-trusted-ca\") pod \"ingress-operator-5b745b69d9-vk7r4\" (UID: \"1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vk7r4"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.452063 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q"
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.452078 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f446713d-03e3-461f-989f-eb6bdef32b30-image-import-ca\") pod \"apiserver-76f77b778f-z5vf2\" (UID: 
\"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.452104 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8-metrics-tls\") pod \"ingress-operator-5b745b69d9-vk7r4\" (UID: \"1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vk7r4" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.452119 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgtzj\" (UniqueName: \"kubernetes.io/projected/22f99dde-8f14-4e43-af7d-fe6e5ec2a908-kube-api-access-rgtzj\") pod \"cluster-image-registry-operator-dc59b4c8b-t4j9v\" (UID: \"22f99dde-8f14-4e43-af7d-fe6e5ec2a908\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t4j9v" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.452133 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1607f924-1e24-4848-b811-21ac3a7f8999-config\") pod \"route-controller-manager-6576b87f9c-gwvk4\" (UID: \"1607f924-1e24-4848-b811-21ac3a7f8999\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.452146 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/22f99dde-8f14-4e43-af7d-fe6e5ec2a908-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-t4j9v\" (UID: \"22f99dde-8f14-4e43-af7d-fe6e5ec2a908\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t4j9v" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.452161 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eedd2260-f339-4e2f-83e8-13a56cee2ce6-serving-cert\") pod \"console-operator-58897d9998-kgnxj\" (UID: \"eedd2260-f339-4e2f-83e8-13a56cee2ce6\") " pod="openshift-console-operator/console-operator-58897d9998-kgnxj" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.452174 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f446713d-03e3-461f-989f-eb6bdef32b30-etcd-serving-ca\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.452188 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f446713d-03e3-461f-989f-eb6bdef32b30-audit-dir\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.452207 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/ade74420-c7a1-4b89-b6c8-7970d7b6c17c-machine-approver-tls\") pod \"machine-approver-56656f9798-9vdwm\" (UID: \"ade74420-c7a1-4b89-b6c8-7970d7b6c17c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9vdwm" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.452222 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f12557e-02f5-4445-988f-b19f16672e3b-config\") pod \"authentication-operator-69f744f599-v9lxv\" (UID: \"6f12557e-02f5-4445-988f-b19f16672e3b\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.452723 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f12557e-02f5-4445-988f-b19f16672e3b-config\") pod \"authentication-operator-69f744f599-v9lxv\" (UID: \"6f12557e-02f5-4445-988f-b19f16672e3b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.453273 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/096d3786-85e8-4fe5-82b3-57cd1be251a1-images\") pod \"machine-api-operator-5694c8668f-whtgq\" (UID: \"096d3786-85e8-4fe5-82b3-57cd1be251a1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-whtgq" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.453414 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qt7gm"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.453464 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.453476 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-98qz7"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.455116 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.456252 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-z5vf2"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.457130 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/096d3786-85e8-4fe5-82b3-57cd1be251a1-config\") pod \"machine-api-operator-5694c8668f-whtgq\" (UID: \"096d3786-85e8-4fe5-82b3-57cd1be251a1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-whtgq" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.457298 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ade74420-c7a1-4b89-b6c8-7970d7b6c17c-config\") pod \"machine-approver-56656f9798-9vdwm\" (UID: \"ade74420-c7a1-4b89-b6c8-7970d7b6c17c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9vdwm" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.457514 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6n228"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.458143 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ade74420-c7a1-4b89-b6c8-7970d7b6c17c-auth-proxy-config\") pod \"machine-approver-56656f9798-9vdwm\" (UID: \"ade74420-c7a1-4b89-b6c8-7970d7b6c17c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9vdwm" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.459060 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f12557e-02f5-4445-988f-b19f16672e3b-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-v9lxv\" (UID: \"6f12557e-02f5-4445-988f-b19f16672e3b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.459050 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.459243 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5jp6r"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.459879 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/096d3786-85e8-4fe5-82b3-57cd1be251a1-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-whtgq\" (UID: \"096d3786-85e8-4fe5-82b3-57cd1be251a1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-whtgq" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.460105 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a2ea84ca-5ca6-432a-aa7d-c6350e0e52e8-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8rx7x\" (UID: \"a2ea84ca-5ca6-432a-aa7d-c6350e0e52e8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rx7x" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.460248 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-m2ntx"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.461187 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-z5r8j"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.462166 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p8r99"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.463133 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xwdnn"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.463565 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f12557e-02f5-4445-988f-b19f16672e3b-service-ca-bundle\") pod 
\"authentication-operator-69f744f599-v9lxv\" (UID: \"6f12557e-02f5-4445-988f-b19f16672e3b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.464539 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-nqrmd"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.465547 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-xglpf"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.466553 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-cvbms"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.467667 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.468654 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f12557e-02f5-4445-988f-b19f16672e3b-serving-cert\") pod \"authentication-operator-69f744f599-v9lxv\" (UID: \"6f12557e-02f5-4445-988f-b19f16672e3b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.468659 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/ade74420-c7a1-4b89-b6c8-7970d7b6c17c-machine-approver-tls\") pod \"machine-approver-56656f9798-9vdwm\" (UID: \"ade74420-c7a1-4b89-b6c8-7970d7b6c17c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9vdwm" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.469173 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556838-h7pkr"] Mar 13 13:59:47 crc 
kubenswrapper[4898]: I0313 13:59:47.470343 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-djn5q"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.471339 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.471848 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-rd22p"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.472973 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x4nxn"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.474412 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mrr5j"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.475575 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-hqcs6"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.476796 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hqcs6" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.480657 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-mr499"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.481065 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-mr499" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.482229 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hqcs6"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.498279 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.511754 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.512729 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-2ps4n"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.513285 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2ps4n" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.520737 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2ps4n"] Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.532020 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.552085 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.552597 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f446713d-03e3-461f-989f-eb6bdef32b30-audit-dir\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.552637 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-42j8n\" (UniqueName: \"kubernetes.io/projected/a402522c-e891-477d-a2cc-5aa7c6944e06-kube-api-access-42j8n\") pod \"openshift-apiserver-operator-796bbdcf4f-5jp6r\" (UID: \"a402522c-e891-477d-a2cc-5aa7c6944e06\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5jp6r" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.552655 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f446713d-03e3-461f-989f-eb6bdef32b30-serving-cert\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.552680 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.552696 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9f5a2d7c-1d38-4e82-89e0-039d0f515ac6-etcd-service-ca\") pod \"etcd-operator-b45778765-t2s2h\" (UID: \"9f5a2d7c-1d38-4e82-89e0-039d0f515ac6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.552701 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f446713d-03e3-461f-989f-eb6bdef32b30-audit-dir\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 
13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.552711 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fp6m\" (UniqueName: \"kubernetes.io/projected/1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8-kube-api-access-8fp6m\") pod \"ingress-operator-5b745b69d9-vk7r4\" (UID: \"1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vk7r4" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.552772 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.552836 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f446713d-03e3-461f-989f-eb6bdef32b30-node-pullsecrets\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.552866 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqfg5\" (UniqueName: \"kubernetes.io/projected/eedd2260-f339-4e2f-83e8-13a56cee2ce6-kube-api-access-gqfg5\") pod \"console-operator-58897d9998-kgnxj\" (UID: \"eedd2260-f339-4e2f-83e8-13a56cee2ce6\") " pod="openshift-console-operator/console-operator-58897d9998-kgnxj" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.552891 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.552929 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f446713d-03e3-461f-989f-eb6bdef32b30-node-pullsecrets\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.552955 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksjvx\" (UniqueName: \"kubernetes.io/projected/9f5a2d7c-1d38-4e82-89e0-039d0f515ac6-kube-api-access-ksjvx\") pod \"etcd-operator-b45778765-t2s2h\" (UID: \"9f5a2d7c-1d38-4e82-89e0-039d0f515ac6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.552979 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a402522c-e891-477d-a2cc-5aa7c6944e06-config\") pod \"openshift-apiserver-operator-796bbdcf4f-5jp6r\" (UID: \"a402522c-e891-477d-a2cc-5aa7c6944e06\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5jp6r" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553003 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 
13:59:47.553026 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9f5a2d7c-1d38-4e82-89e0-039d0f515ac6-etcd-ca\") pod \"etcd-operator-b45778765-t2s2h\" (UID: \"9f5a2d7c-1d38-4e82-89e0-039d0f515ac6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553049 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553069 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f446713d-03e3-461f-989f-eb6bdef32b30-audit\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553112 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a402522c-e891-477d-a2cc-5aa7c6944e06-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-5jp6r\" (UID: \"a402522c-e891-477d-a2cc-5aa7c6944e06\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5jp6r" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553134 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1607f924-1e24-4848-b811-21ac3a7f8999-client-ca\") pod \"route-controller-manager-6576b87f9c-gwvk4\" (UID: \"1607f924-1e24-4848-b811-21ac3a7f8999\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553159 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/18e5c8bf-9fe0-465e-af8f-9e7ec7400be8-default-certificate\") pod \"router-default-5444994796-6plhg\" (UID: \"18e5c8bf-9fe0-465e-af8f-9e7ec7400be8\") " pod="openshift-ingress/router-default-5444994796-6plhg" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553197 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b26a4d77-f170-467e-ad96-4741cc5a8f23-audit-dir\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553222 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8w4m\" (UniqueName: \"kubernetes.io/projected/cfd8810f-79f1-4634-9e4d-245348fba016-kube-api-access-h8w4m\") pod \"openshift-controller-manager-operator-756b6f6bc6-xxjrs\" (UID: \"cfd8810f-79f1-4634-9e4d-245348fba016\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xxjrs" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553263 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553298 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/cfd8810f-79f1-4634-9e4d-245348fba016-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-xxjrs\" (UID: \"cfd8810f-79f1-4634-9e4d-245348fba016\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xxjrs" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553321 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f446713d-03e3-461f-989f-eb6bdef32b30-encryption-config\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553366 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-audit-policies\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553389 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f446713d-03e3-461f-989f-eb6bdef32b30-trusted-ca-bundle\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553410 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/18e5c8bf-9fe0-465e-af8f-9e7ec7400be8-service-ca-bundle\") pod \"router-default-5444994796-6plhg\" (UID: \"18e5c8bf-9fe0-465e-af8f-9e7ec7400be8\") " pod="openshift-ingress/router-default-5444994796-6plhg" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553435 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lzr6p\" (UniqueName: \"kubernetes.io/projected/18e5c8bf-9fe0-465e-af8f-9e7ec7400be8-kube-api-access-lzr6p\") pod \"router-default-5444994796-6plhg\" (UID: \"18e5c8bf-9fe0-465e-af8f-9e7ec7400be8\") " pod="openshift-ingress/router-default-5444994796-6plhg" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553460 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/22f99dde-8f14-4e43-af7d-fe6e5ec2a908-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-t4j9v\" (UID: \"22f99dde-8f14-4e43-af7d-fe6e5ec2a908\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t4j9v" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553486 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzgzk\" (UniqueName: \"kubernetes.io/projected/b26a4d77-f170-467e-ad96-4741cc5a8f23-kube-api-access-vzgzk\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553507 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f446713d-03e3-461f-989f-eb6bdef32b30-config\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553528 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkqfk\" (UniqueName: \"kubernetes.io/projected/1607f924-1e24-4848-b811-21ac3a7f8999-kube-api-access-bkqfk\") pod \"route-controller-manager-6576b87f9c-gwvk4\" (UID: \"1607f924-1e24-4848-b811-21ac3a7f8999\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553551 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/22f99dde-8f14-4e43-af7d-fe6e5ec2a908-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-t4j9v\" (UID: \"22f99dde-8f14-4e43-af7d-fe6e5ec2a908\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t4j9v" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553579 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9f5a2d7c-1d38-4e82-89e0-039d0f515ac6-etcd-client\") pod \"etcd-operator-b45778765-t2s2h\" (UID: \"9f5a2d7c-1d38-4e82-89e0-039d0f515ac6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553581 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a402522c-e891-477d-a2cc-5aa7c6944e06-config\") pod \"openshift-apiserver-operator-796bbdcf4f-5jp6r\" (UID: \"a402522c-e891-477d-a2cc-5aa7c6944e06\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5jp6r" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553599 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1607f924-1e24-4848-b811-21ac3a7f8999-serving-cert\") pod \"route-controller-manager-6576b87f9c-gwvk4\" (UID: \"1607f924-1e24-4848-b811-21ac3a7f8999\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553630 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/9f5a2d7c-1d38-4e82-89e0-039d0f515ac6-etcd-service-ca\") pod \"etcd-operator-b45778765-t2s2h\" (UID: \"9f5a2d7c-1d38-4e82-89e0-039d0f515ac6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553632 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f5a2d7c-1d38-4e82-89e0-039d0f515ac6-serving-cert\") pod \"etcd-operator-b45778765-t2s2h\" (UID: \"9f5a2d7c-1d38-4e82-89e0-039d0f515ac6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553681 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553687 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9f5a2d7c-1d38-4e82-89e0-039d0f515ac6-etcd-ca\") pod \"etcd-operator-b45778765-t2s2h\" (UID: \"9f5a2d7c-1d38-4e82-89e0-039d0f515ac6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553699 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/071d8651-2a2d-4eed-9023-cfe636be09a0-metrics-tls\") pod \"dns-operator-744455d44c-m2ntx\" (UID: \"071d8651-2a2d-4eed-9023-cfe636be09a0\") " pod="openshift-dns-operator/dns-operator-744455d44c-m2ntx" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553730 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"stats-auth\" (UniqueName: \"kubernetes.io/secret/18e5c8bf-9fe0-465e-af8f-9e7ec7400be8-stats-auth\") pod \"router-default-5444994796-6plhg\" (UID: \"18e5c8bf-9fe0-465e-af8f-9e7ec7400be8\") " pod="openshift-ingress/router-default-5444994796-6plhg" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553771 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eedd2260-f339-4e2f-83e8-13a56cee2ce6-config\") pod \"console-operator-58897d9998-kgnxj\" (UID: \"eedd2260-f339-4e2f-83e8-13a56cee2ce6\") " pod="openshift-console-operator/console-operator-58897d9998-kgnxj" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553799 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553817 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgrh2\" (UniqueName: \"kubernetes.io/projected/071d8651-2a2d-4eed-9023-cfe636be09a0-kube-api-access-kgrh2\") pod \"dns-operator-744455d44c-m2ntx\" (UID: \"071d8651-2a2d-4eed-9023-cfe636be09a0\") " pod="openshift-dns-operator/dns-operator-744455d44c-m2ntx" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553837 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc 
kubenswrapper[4898]: I0313 13:59:47.553853 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6bvm\" (UniqueName: \"kubernetes.io/projected/f446713d-03e3-461f-989f-eb6bdef32b30-kube-api-access-r6bvm\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553867 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/18e5c8bf-9fe0-465e-af8f-9e7ec7400be8-metrics-certs\") pod \"router-default-5444994796-6plhg\" (UID: \"18e5c8bf-9fe0-465e-af8f-9e7ec7400be8\") " pod="openshift-ingress/router-default-5444994796-6plhg" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553883 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-vk7r4\" (UID: \"1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vk7r4" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553922 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eedd2260-f339-4e2f-83e8-13a56cee2ce6-trusted-ca\") pod \"console-operator-58897d9998-kgnxj\" (UID: \"eedd2260-f339-4e2f-83e8-13a56cee2ce6\") " pod="openshift-console-operator/console-operator-58897d9998-kgnxj" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553936 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f446713d-03e3-461f-989f-eb6bdef32b30-etcd-client\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" 
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553950 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f5a2d7c-1d38-4e82-89e0-039d0f515ac6-config\") pod \"etcd-operator-b45778765-t2s2h\" (UID: \"9f5a2d7c-1d38-4e82-89e0-039d0f515ac6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553974 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfd8810f-79f1-4634-9e4d-245348fba016-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-xxjrs\" (UID: \"cfd8810f-79f1-4634-9e4d-245348fba016\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xxjrs" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.553989 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.554004 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8-trusted-ca\") pod \"ingress-operator-5b745b69d9-vk7r4\" (UID: \"1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vk7r4" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.554025 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-session\") pod 
\"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.554041 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f446713d-03e3-461f-989f-eb6bdef32b30-image-import-ca\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.554064 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8-metrics-tls\") pod \"ingress-operator-5b745b69d9-vk7r4\" (UID: \"1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vk7r4" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.554086 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgtzj\" (UniqueName: \"kubernetes.io/projected/22f99dde-8f14-4e43-af7d-fe6e5ec2a908-kube-api-access-rgtzj\") pod \"cluster-image-registry-operator-dc59b4c8b-t4j9v\" (UID: \"22f99dde-8f14-4e43-af7d-fe6e5ec2a908\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t4j9v" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.554111 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1607f924-1e24-4848-b811-21ac3a7f8999-config\") pod \"route-controller-manager-6576b87f9c-gwvk4\" (UID: \"1607f924-1e24-4848-b811-21ac3a7f8999\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.554134 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/22f99dde-8f14-4e43-af7d-fe6e5ec2a908-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-t4j9v\" (UID: \"22f99dde-8f14-4e43-af7d-fe6e5ec2a908\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t4j9v" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.554156 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eedd2260-f339-4e2f-83e8-13a56cee2ce6-serving-cert\") pod \"console-operator-58897d9998-kgnxj\" (UID: \"eedd2260-f339-4e2f-83e8-13a56cee2ce6\") " pod="openshift-console-operator/console-operator-58897d9998-kgnxj" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.554174 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f446713d-03e3-461f-989f-eb6bdef32b30-etcd-serving-ca\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.554277 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f446713d-03e3-461f-989f-eb6bdef32b30-audit\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.554324 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b26a4d77-f170-467e-ad96-4741cc5a8f23-audit-dir\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.554979 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/1607f924-1e24-4848-b811-21ac3a7f8999-client-ca\") pod \"route-controller-manager-6576b87f9c-gwvk4\" (UID: \"1607f924-1e24-4848-b811-21ac3a7f8999\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.555229 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f446713d-03e3-461f-989f-eb6bdef32b30-serving-cert\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.555510 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f446713d-03e3-461f-989f-eb6bdef32b30-etcd-serving-ca\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.557814 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eedd2260-f339-4e2f-83e8-13a56cee2ce6-trusted-ca\") pod \"console-operator-58897d9998-kgnxj\" (UID: \"eedd2260-f339-4e2f-83e8-13a56cee2ce6\") " pod="openshift-console-operator/console-operator-58897d9998-kgnxj" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.558553 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f446713d-03e3-461f-989f-eb6bdef32b30-encryption-config\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.558672 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/18e5c8bf-9fe0-465e-af8f-9e7ec7400be8-service-ca-bundle\") pod \"router-default-5444994796-6plhg\" (UID: \"18e5c8bf-9fe0-465e-af8f-9e7ec7400be8\") " pod="openshift-ingress/router-default-5444994796-6plhg" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.558782 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f5a2d7c-1d38-4e82-89e0-039d0f515ac6-serving-cert\") pod \"etcd-operator-b45778765-t2s2h\" (UID: \"9f5a2d7c-1d38-4e82-89e0-039d0f515ac6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.559524 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cfd8810f-79f1-4634-9e4d-245348fba016-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-xxjrs\" (UID: \"cfd8810f-79f1-4634-9e4d-245348fba016\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xxjrs" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.559804 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f446713d-03e3-461f-989f-eb6bdef32b30-etcd-client\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.559890 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/22f99dde-8f14-4e43-af7d-fe6e5ec2a908-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-t4j9v\" (UID: \"22f99dde-8f14-4e43-af7d-fe6e5ec2a908\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t4j9v" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.560086 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/18e5c8bf-9fe0-465e-af8f-9e7ec7400be8-stats-auth\") pod \"router-default-5444994796-6plhg\" (UID: \"18e5c8bf-9fe0-465e-af8f-9e7ec7400be8\") " pod="openshift-ingress/router-default-5444994796-6plhg" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.560759 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfd8810f-79f1-4634-9e4d-245348fba016-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-xxjrs\" (UID: \"cfd8810f-79f1-4634-9e4d-245348fba016\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xxjrs" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.560971 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9f5a2d7c-1d38-4e82-89e0-039d0f515ac6-etcd-client\") pod \"etcd-operator-b45778765-t2s2h\" (UID: \"9f5a2d7c-1d38-4e82-89e0-039d0f515ac6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.561236 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eedd2260-f339-4e2f-83e8-13a56cee2ce6-config\") pod \"console-operator-58897d9998-kgnxj\" (UID: \"eedd2260-f339-4e2f-83e8-13a56cee2ce6\") " pod="openshift-console-operator/console-operator-58897d9998-kgnxj" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.561605 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f446713d-03e3-461f-989f-eb6bdef32b30-config\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.562269 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f446713d-03e3-461f-989f-eb6bdef32b30-image-import-ca\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.562371 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/18e5c8bf-9fe0-465e-af8f-9e7ec7400be8-default-certificate\") pod \"router-default-5444994796-6plhg\" (UID: \"18e5c8bf-9fe0-465e-af8f-9e7ec7400be8\") " pod="openshift-ingress/router-default-5444994796-6plhg" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.562664 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f446713d-03e3-461f-989f-eb6bdef32b30-trusted-ca-bundle\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.563128 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a402522c-e891-477d-a2cc-5aa7c6944e06-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-5jp6r\" (UID: \"a402522c-e891-477d-a2cc-5aa7c6944e06\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5jp6r" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.563262 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/22f99dde-8f14-4e43-af7d-fe6e5ec2a908-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-t4j9v\" (UID: \"22f99dde-8f14-4e43-af7d-fe6e5ec2a908\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t4j9v" Mar 13 13:59:47 crc 
kubenswrapper[4898]: I0313 13:59:47.563541 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eedd2260-f339-4e2f-83e8-13a56cee2ce6-serving-cert\") pod \"console-operator-58897d9998-kgnxj\" (UID: \"eedd2260-f339-4e2f-83e8-13a56cee2ce6\") " pod="openshift-console-operator/console-operator-58897d9998-kgnxj" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.563747 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1607f924-1e24-4848-b811-21ac3a7f8999-serving-cert\") pod \"route-controller-manager-6576b87f9c-gwvk4\" (UID: \"1607f924-1e24-4848-b811-21ac3a7f8999\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.565003 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1607f924-1e24-4848-b811-21ac3a7f8999-config\") pod \"route-controller-manager-6576b87f9c-gwvk4\" (UID: \"1607f924-1e24-4848-b811-21ac3a7f8999\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.566417 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/18e5c8bf-9fe0-465e-af8f-9e7ec7400be8-metrics-certs\") pod \"router-default-5444994796-6plhg\" (UID: \"18e5c8bf-9fe0-465e-af8f-9e7ec7400be8\") " pod="openshift-ingress/router-default-5444994796-6plhg" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.568694 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8-trusted-ca\") pod \"ingress-operator-5b745b69d9-vk7r4\" (UID: \"1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vk7r4" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.570135 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8-metrics-tls\") pod \"ingress-operator-5b745b69d9-vk7r4\" (UID: \"1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vk7r4" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.571534 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.611254 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.616486 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/071d8651-2a2d-4eed-9023-cfe636be09a0-metrics-tls\") pod \"dns-operator-744455d44c-m2ntx\" (UID: \"071d8651-2a2d-4eed-9023-cfe636be09a0\") " pod="openshift-dns-operator/dns-operator-744455d44c-m2ntx" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.632029 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.651759 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.661065 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f5a2d7c-1d38-4e82-89e0-039d0f515ac6-config\") pod \"etcd-operator-b45778765-t2s2h\" (UID: \"9f5a2d7c-1d38-4e82-89e0-039d0f515ac6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 
13:59:47.671978 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.673669 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.691339 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.695771 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.717863 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.725646 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.735768 4898 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.746547 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.751471 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.755701 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:47 crc kubenswrapper[4898]: E0313 13:59:47.756239 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 14:01:49.75609764 +0000 UTC m=+344.757685919 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.762706 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.779937 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.787013 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.791588 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.811384 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.825135 4898 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.831543 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.851541 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.857494 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.857526 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.857545 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.857568 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.858677 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.861592 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-audit-policies\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.861784 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.862306 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.862530 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.872067 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.877070 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.892687 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.897449 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.912423 4898 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-session" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.923822 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.931609 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.952767 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.958592 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:47 crc kubenswrapper[4898]: I0313 13:59:47.992117 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.011874 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.032507 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.052170 4898 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.063890 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.071558 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.075334 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.080655 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.091937 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.111110 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.132201 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.152733 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.173386 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 13 
13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.192501 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.212445 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.232801 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.251399 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.274681 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.304419 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 13 13:59:48 crc kubenswrapper[4898]: W0313 13:59:48.316068 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-bc8fedc4dfcb18561dbb855f3f92d636901e36d2e1152a743ae6370dca5742ad WatchSource:0}: Error finding container bc8fedc4dfcb18561dbb855f3f92d636901e36d2e1152a743ae6370dca5742ad: Status 404 returned error can't find the container with id bc8fedc4dfcb18561dbb855f3f92d636901e36d2e1152a743ae6370dca5742ad Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.317192 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 13 13:59:48 crc kubenswrapper[4898]: 
I0313 13:59:48.331797 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.351966 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 13 13:59:48 crc kubenswrapper[4898]: W0313 13:59:48.359738 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-d32d55c6f7feceb2714fa2f5fdd13c5fa15b68553291e50c6eae9aef459d0067 WatchSource:0}: Error finding container d32d55c6f7feceb2714fa2f5fdd13c5fa15b68553291e50c6eae9aef459d0067: Status 404 returned error can't find the container with id d32d55c6f7feceb2714fa2f5fdd13c5fa15b68553291e50c6eae9aef459d0067 Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.372277 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.392749 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.411649 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.429828 4898 request.go:700] Waited for 1.015170279s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/configmaps?fieldSelector=metadata.name%3Dmachine-config-operator-images&limit=500&resourceVersion=0 Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.431176 4898 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.452625 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.472644 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.491800 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.516864 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.531415 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.551164 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.571168 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.591699 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.612098 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.631379 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 13 13:59:48 crc 
kubenswrapper[4898]: I0313 13:59:48.651431 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.672095 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.692020 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.712048 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.732090 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.752832 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.772555 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.793664 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.813355 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.833201 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.852572 4898 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.873144 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.892721 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.907303 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"057a32285031b97e8136a089f968663acc6f61e7493bd8c07413977c3178b92b"} Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.907379 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"bc8fedc4dfcb18561dbb855f3f92d636901e36d2e1152a743ae6370dca5742ad"} Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.909536 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c0ca998f58d0078098893b001e96d917adab7933ab6e364aad678ddf2942fdf0"} Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.909560 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"d32d55c6f7feceb2714fa2f5fdd13c5fa15b68553291e50c6eae9aef459d0067"} Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.909859 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 13:59:48 
crc kubenswrapper[4898]: I0313 13:59:48.912080 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.913681 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"6ffff5ce359da09accce3736b45d0b52d9dd016da11fb2a399d00a204cb51e15"} Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.913779 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"a925ab650703812668df21ce8bfb6cb4e7119903285412289b894645d90a70e5"} Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.931348 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.952369 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.971395 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 13 13:59:48 crc kubenswrapper[4898]: I0313 13:59:48.992354 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.011204 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.032524 4898 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.050916 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.071589 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.091787 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.112307 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.132309 4898 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.134158 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.134211 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.151880 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 13 13:59:49 crc 
kubenswrapper[4898]: I0313 13:59:49.186711 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4cvb\" (UniqueName: \"kubernetes.io/projected/ade74420-c7a1-4b89-b6c8-7970d7b6c17c-kube-api-access-j4cvb\") pod \"machine-approver-56656f9798-9vdwm\" (UID: \"ade74420-c7a1-4b89-b6c8-7970d7b6c17c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9vdwm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.218179 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75725\" (UniqueName: \"kubernetes.io/projected/096d3786-85e8-4fe5-82b3-57cd1be251a1-kube-api-access-75725\") pod \"machine-api-operator-5694c8668f-whtgq\" (UID: \"096d3786-85e8-4fe5-82b3-57cd1be251a1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-whtgq" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.239308 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjc96\" (UniqueName: \"kubernetes.io/projected/6f12557e-02f5-4445-988f-b19f16672e3b-kube-api-access-pjc96\") pod \"authentication-operator-69f744f599-v9lxv\" (UID: \"6f12557e-02f5-4445-988f-b19f16672e3b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.245206 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4zh6\" (UniqueName: \"kubernetes.io/projected/a2ea84ca-5ca6-432a-aa7d-c6350e0e52e8-kube-api-access-x4zh6\") pod \"cluster-samples-operator-665b6dd947-8rx7x\" (UID: \"a2ea84ca-5ca6-432a-aa7d-c6350e0e52e8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rx7x" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.253407 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.271406 4898 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.291562 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.310629 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.331568 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.351597 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.372604 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.391786 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.412011 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.426864 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9vdwm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.434707 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 13 13:59:49 crc kubenswrapper[4898]: W0313 13:59:49.441697 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podade74420_c7a1_4b89_b6c8_7970d7b6c17c.slice/crio-c3bb23c34123cbab89cddad63cb25b898b7ddce6417f31ab796a400379737021 WatchSource:0}: Error finding container c3bb23c34123cbab89cddad63cb25b898b7ddce6417f31ab796a400379737021: Status 404 returned error can't find the container with id c3bb23c34123cbab89cddad63cb25b898b7ddce6417f31ab796a400379737021 Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.450487 4898 request.go:700] Waited for 1.897685154s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver-operator/serviceaccounts/openshift-apiserver-operator/token Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.451638 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.476679 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42j8n\" (UniqueName: \"kubernetes.io/projected/a402522c-e891-477d-a2cc-5aa7c6944e06-kube-api-access-42j8n\") pod \"openshift-apiserver-operator-796bbdcf4f-5jp6r\" (UID: \"a402522c-e891-477d-a2cc-5aa7c6944e06\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5jp6r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.484643 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-whtgq" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.488499 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fp6m\" (UniqueName: \"kubernetes.io/projected/1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8-kube-api-access-8fp6m\") pod \"ingress-operator-5b745b69d9-vk7r4\" (UID: \"1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vk7r4" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.508526 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rx7x" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.511890 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqfg5\" (UniqueName: \"kubernetes.io/projected/eedd2260-f339-4e2f-83e8-13a56cee2ce6-kube-api-access-gqfg5\") pod \"console-operator-58897d9998-kgnxj\" (UID: \"eedd2260-f339-4e2f-83e8-13a56cee2ce6\") " pod="openshift-console-operator/console-operator-58897d9998-kgnxj" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.523379 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5jp6r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.527283 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksjvx\" (UniqueName: \"kubernetes.io/projected/9f5a2d7c-1d38-4e82-89e0-039d0f515ac6-kube-api-access-ksjvx\") pod \"etcd-operator-b45778765-t2s2h\" (UID: \"9f5a2d7c-1d38-4e82-89e0-039d0f515ac6\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.548384 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgtzj\" (UniqueName: \"kubernetes.io/projected/22f99dde-8f14-4e43-af7d-fe6e5ec2a908-kube-api-access-rgtzj\") pod \"cluster-image-registry-operator-dc59b4c8b-t4j9v\" (UID: \"22f99dde-8f14-4e43-af7d-fe6e5ec2a908\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t4j9v" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.563263 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.568774 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzr6p\" (UniqueName: \"kubernetes.io/projected/18e5c8bf-9fe0-465e-af8f-9e7ec7400be8-kube-api-access-lzr6p\") pod \"router-default-5444994796-6plhg\" (UID: \"18e5c8bf-9fe0-465e-af8f-9e7ec7400be8\") " pod="openshift-ingress/router-default-5444994796-6plhg" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.586137 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6bvm\" (UniqueName: \"kubernetes.io/projected/f446713d-03e3-461f-989f-eb6bdef32b30-kube-api-access-r6bvm\") pod \"apiserver-76f77b778f-z5vf2\" (UID: \"f446713d-03e3-461f-989f-eb6bdef32b30\") " pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:49 crc 
kubenswrapper[4898]: I0313 13:59:49.607835 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-vk7r4\" (UID: \"1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vk7r4" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.609737 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-6plhg" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.618803 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vk7r4" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.626357 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-kgnxj" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.628523 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkqfk\" (UniqueName: \"kubernetes.io/projected/1607f924-1e24-4848-b811-21ac3a7f8999-kube-api-access-bkqfk\") pod \"route-controller-manager-6576b87f9c-gwvk4\" (UID: \"1607f924-1e24-4848-b811-21ac3a7f8999\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.641336 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-v9lxv"] Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.641470 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4" Mar 13 13:59:49 crc kubenswrapper[4898]: W0313 13:59:49.643866 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18e5c8bf_9fe0_465e_af8f_9e7ec7400be8.slice/crio-07d462c95de220b26b61c71fe04f7db640c2a0648bbcb012450f83b22cbe5d4a WatchSource:0}: Error finding container 07d462c95de220b26b61c71fe04f7db640c2a0648bbcb012450f83b22cbe5d4a: Status 404 returned error can't find the container with id 07d462c95de220b26b61c71fe04f7db640c2a0648bbcb012450f83b22cbe5d4a Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.646856 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzgzk\" (UniqueName: \"kubernetes.io/projected/b26a4d77-f170-467e-ad96-4741cc5a8f23-kube-api-access-vzgzk\") pod \"oauth-openshift-558db77b4-djn5q\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.650537 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.665883 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgrh2\" (UniqueName: \"kubernetes.io/projected/071d8651-2a2d-4eed-9023-cfe636be09a0-kube-api-access-kgrh2\") pod \"dns-operator-744455d44c-m2ntx\" (UID: \"071d8651-2a2d-4eed-9023-cfe636be09a0\") " pod="openshift-dns-operator/dns-operator-744455d44c-m2ntx" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.689172 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.694224 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-m2ntx" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.698688 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/22f99dde-8f14-4e43-af7d-fe6e5ec2a908-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-t4j9v\" (UID: \"22f99dde-8f14-4e43-af7d-fe6e5ec2a908\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t4j9v" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.709445 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.714559 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8w4m\" (UniqueName: \"kubernetes.io/projected/cfd8810f-79f1-4634-9e4d-245348fba016-kube-api-access-h8w4m\") pod \"openshift-controller-manager-operator-756b6f6bc6-xxjrs\" (UID: \"cfd8810f-79f1-4634-9e4d-245348fba016\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xxjrs" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.715979 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-whtgq"] Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792014 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792318 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" 
(UniqueName: \"kubernetes.io/configmap/1a01ab05-7178-48c7-892b-b91cf60432f8-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792333 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a01ab05-7178-48c7-892b-b91cf60432f8-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792349 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0ea2e803-34d0-429b-b943-ece0b9e38b63-console-oauth-config\") pod \"console-f9d7485db-7l2pm\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792365 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1a01ab05-7178-48c7-892b-b91cf60432f8-encryption-config\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792381 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4372b422-23c7-46bc-aec4-aef665acbda1-client-ca\") pod \"controller-manager-879f6c89f-pvbpt\" (UID: \"4372b422-23c7-46bc-aec4-aef665acbda1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792415 4898 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4372b422-23c7-46bc-aec4-aef665acbda1-serving-cert\") pod \"controller-manager-879f6c89f-pvbpt\" (UID: \"4372b422-23c7-46bc-aec4-aef665acbda1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792428 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4372b422-23c7-46bc-aec4-aef665acbda1-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pvbpt\" (UID: \"4372b422-23c7-46bc-aec4-aef665acbda1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792444 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-registry-certificates\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792461 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-service-ca\") pod \"console-f9d7485db-7l2pm\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792480 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2w5p\" (UniqueName: \"kubernetes.io/projected/9c7e70de-de85-421c-aaeb-476450d8e0ee-kube-api-access-h2w5p\") pod \"openshift-config-operator-7777fb866f-7zpw2\" (UID: 
\"9c7e70de-de85-421c-aaeb-476450d8e0ee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792495 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1a01ab05-7178-48c7-892b-b91cf60432f8-etcd-client\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792512 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-trusted-ca-bundle\") pod \"console-f9d7485db-7l2pm\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792529 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1a01ab05-7178-48c7-892b-b91cf60432f8-audit-policies\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792545 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792561 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f726v\" 
(UniqueName: \"kubernetes.io/projected/1a01ab05-7178-48c7-892b-b91cf60432f8-kube-api-access-f726v\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792578 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv689\" (UniqueName: \"kubernetes.io/projected/4372b422-23c7-46bc-aec4-aef665acbda1-kube-api-access-vv689\") pod \"controller-manager-879f6c89f-pvbpt\" (UID: \"4372b422-23c7-46bc-aec4-aef665acbda1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792604 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1a01ab05-7178-48c7-892b-b91cf60432f8-audit-dir\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792631 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvtz8\" (UniqueName: \"kubernetes.io/projected/f4f26c0f-992a-4eb4-86d2-58e42a5b2b68-kube-api-access-tvtz8\") pod \"downloads-7954f5f757-cx59b\" (UID: \"f4f26c0f-992a-4eb4-86d2-58e42a5b2b68\") " pod="openshift-console/downloads-7954f5f757-cx59b" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792647 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4372b422-23c7-46bc-aec4-aef665acbda1-config\") pod \"controller-manager-879f6c89f-pvbpt\" (UID: \"4372b422-23c7-46bc-aec4-aef665acbda1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" Mar 13 13:59:49 crc kubenswrapper[4898]: 
I0313 13:59:49.792669 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-trusted-ca\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792690 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-console-config\") pod \"console-f9d7485db-7l2pm\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792717 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-oauth-serving-cert\") pod \"console-f9d7485db-7l2pm\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792734 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq4w8\" (UniqueName: \"kubernetes.io/projected/0ea2e803-34d0-429b-b943-ece0b9e38b63-kube-api-access-gq4w8\") pod \"console-f9d7485db-7l2pm\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792769 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a01ab05-7178-48c7-892b-b91cf60432f8-serving-cert\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792790 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792807 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt5s8\" (UniqueName: \"kubernetes.io/projected/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-kube-api-access-xt5s8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792822 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9c7e70de-de85-421c-aaeb-476450d8e0ee-available-featuregates\") pod \"openshift-config-operator-7777fb866f-7zpw2\" (UID: \"9c7e70de-de85-421c-aaeb-476450d8e0ee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792850 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c7e70de-de85-421c-aaeb-476450d8e0ee-serving-cert\") pod \"openshift-config-operator-7777fb866f-7zpw2\" (UID: \"9c7e70de-de85-421c-aaeb-476450d8e0ee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792867 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-bound-sa-token\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792884 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ea2e803-34d0-429b-b943-ece0b9e38b63-console-serving-cert\") pod \"console-f9d7485db-7l2pm\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.792913 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-registry-tls\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:49 crc kubenswrapper[4898]: E0313 13:59:49.794775 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:50.294750358 +0000 UTC m=+225.296338617 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.833147 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t4j9v" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.893562 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.893854 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ea2e803-34d0-429b-b943-ece0b9e38b63-console-serving-cert\") pod \"console-f9d7485db-7l2pm\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.893886 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d1225410-7280-4409-8934-c6766eae5088-signing-key\") pod \"service-ca-9c57cc56f-z5r8j\" (UID: \"d1225410-7280-4409-8934-c6766eae5088\") " pod="openshift-service-ca/service-ca-9c57cc56f-z5r8j" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.893916 4898 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-registry-tls\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.893950 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.893965 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a01ab05-7178-48c7-892b-b91cf60432f8-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894009 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztzgg\" (UniqueName: \"kubernetes.io/projected/d1225410-7280-4409-8934-c6766eae5088-kube-api-access-ztzgg\") pod \"service-ca-9c57cc56f-z5r8j\" (UID: \"d1225410-7280-4409-8934-c6766eae5088\") " pod="openshift-service-ca/service-ca-9c57cc56f-z5r8j" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894027 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6444bf97-84ef-49df-afcd-4e939a5de2ad-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qt7gm\" (UID: \"6444bf97-84ef-49df-afcd-4e939a5de2ad\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qt7gm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894048 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4372b422-23c7-46bc-aec4-aef665acbda1-client-ca\") pod \"controller-manager-879f6c89f-pvbpt\" (UID: \"4372b422-23c7-46bc-aec4-aef665acbda1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894062 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c8b0b1cf-022c-4181-a957-2f7e172a3294-srv-cert\") pod \"olm-operator-6b444d44fb-k6lrz\" (UID: \"c8b0b1cf-022c-4181-a957-2f7e172a3294\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894092 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/22cb0051-a6f4-4790-b51c-3da149327edd-cert\") pod \"ingress-canary-2ps4n\" (UID: \"22cb0051-a6f4-4790-b51c-3da149327edd\") " pod="openshift-ingress-canary/ingress-canary-2ps4n" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894107 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8675f0f0-7d3b-41d9-959e-e73f78f32c5c-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-x4nxn\" (UID: \"8675f0f0-7d3b-41d9-959e-e73f78f32c5c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x4nxn" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894135 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4372b422-23c7-46bc-aec4-aef665acbda1-serving-cert\") pod \"controller-manager-879f6c89f-pvbpt\" (UID: \"4372b422-23c7-46bc-aec4-aef665acbda1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894150 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/55a96934-e740-402f-b4af-488a7eba53ae-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-xglpf\" (UID: \"55a96934-e740-402f-b4af-488a7eba53ae\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-xglpf" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894167 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hckk7\" (UniqueName: \"kubernetes.io/projected/55a96934-e740-402f-b4af-488a7eba53ae-kube-api-access-hckk7\") pod \"multus-admission-controller-857f4d67dd-xglpf\" (UID: \"55a96934-e740-402f-b4af-488a7eba53ae\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-xglpf" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894182 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f52c1025-32e7-4eba-8af4-5c5cce1918da-secret-volume\") pod \"collect-profiles-29556825-92fd8\" (UID: \"f52c1025-32e7-4eba-8af4-5c5cce1918da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894197 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e6be656-c448-4b38-b5a8-2401ab767c54-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-xwdnn\" (UID: \"2e6be656-c448-4b38-b5a8-2401ab767c54\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xwdnn" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894211 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/41000ce4-1a84-44de-b283-1fe0350b1c17-plugins-dir\") pod \"csi-hostpathplugin-rd22p\" (UID: \"41000ce4-1a84-44de-b283-1fe0350b1c17\") " pod="hostpath-provisioner/csi-hostpathplugin-rd22p" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894226 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66ebc90f-88a0-476c-98d6-c595517196b3-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xjx7v\" (UID: \"66ebc90f-88a0-476c-98d6-c595517196b3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjx7v" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894242 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-registry-certificates\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894261 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/41000ce4-1a84-44de-b283-1fe0350b1c17-csi-data-dir\") pod \"csi-hostpathplugin-rd22p\" (UID: \"41000ce4-1a84-44de-b283-1fe0350b1c17\") " pod="hostpath-provisioner/csi-hostpathplugin-rd22p" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894295 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-service-ca\") pod \"console-f9d7485db-7l2pm\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894311 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5gcl\" (UniqueName: \"kubernetes.io/projected/9af6faad-479e-481b-9f66-d074c1c20ce8-kube-api-access-b5gcl\") pod \"machine-config-operator-74547568cd-98qz7\" (UID: \"9af6faad-479e-481b-9f66-d074c1c20ce8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-98qz7" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894326 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66ebc90f-88a0-476c-98d6-c595517196b3-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xjx7v\" (UID: \"66ebc90f-88a0-476c-98d6-c595517196b3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjx7v" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894351 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbhjt\" (UniqueName: \"kubernetes.io/projected/2e6be656-c448-4b38-b5a8-2401ab767c54-kube-api-access-qbhjt\") pod \"kube-storage-version-migrator-operator-b67b599dd-xwdnn\" (UID: \"2e6be656-c448-4b38-b5a8-2401ab767c54\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xwdnn" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894383 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-trusted-ca-bundle\") pod \"console-f9d7485db-7l2pm\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " 
pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894408 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfaa00dc-cff6-47b5-878f-886fab80071b-config\") pod \"service-ca-operator-777779d784-cvbms\" (UID: \"dfaa00dc-cff6-47b5-878f-886fab80071b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cvbms" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894426 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894440 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfaa00dc-cff6-47b5-878f-886fab80071b-serving-cert\") pod \"service-ca-operator-777779d784-cvbms\" (UID: \"dfaa00dc-cff6-47b5-878f-886fab80071b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cvbms" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894518 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8chc\" (UniqueName: \"kubernetes.io/projected/a0416bba-76e3-4312-94e0-ac5b77c6ace0-kube-api-access-j8chc\") pod \"dns-default-hqcs6\" (UID: \"a0416bba-76e3-4312-94e0-ac5b77c6ace0\") " pod="openshift-dns/dns-default-hqcs6" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894534 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: 
\"kubernetes.io/host-path/41000ce4-1a84-44de-b283-1fe0350b1c17-mountpoint-dir\") pod \"csi-hostpathplugin-rd22p\" (UID: \"41000ce4-1a84-44de-b283-1fe0350b1c17\") " pod="hostpath-provisioner/csi-hostpathplugin-rd22p" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894567 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/41000ce4-1a84-44de-b283-1fe0350b1c17-registration-dir\") pod \"csi-hostpathplugin-rd22p\" (UID: \"41000ce4-1a84-44de-b283-1fe0350b1c17\") " pod="hostpath-provisioner/csi-hostpathplugin-rd22p" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894581 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk2f8\" (UniqueName: \"kubernetes.io/projected/dfaa00dc-cff6-47b5-878f-886fab80071b-kube-api-access-fk2f8\") pod \"service-ca-operator-777779d784-cvbms\" (UID: \"dfaa00dc-cff6-47b5-878f-886fab80071b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cvbms" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894597 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7d27543e-df10-41f7-be85-dfe319aaec8a-profile-collector-cert\") pod \"catalog-operator-68c6474976-qtdtw\" (UID: \"7d27543e-df10-41f7-be85-dfe319aaec8a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894610 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/41000ce4-1a84-44de-b283-1fe0350b1c17-socket-dir\") pod \"csi-hostpathplugin-rd22p\" (UID: \"41000ce4-1a84-44de-b283-1fe0350b1c17\") " pod="hostpath-provisioner/csi-hostpathplugin-rd22p" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 
13:59:49.894627 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d1df7055-9dee-4cde-a787-bc18a276b777-node-bootstrap-token\") pod \"machine-config-server-mr499\" (UID: \"d1df7055-9dee-4cde-a787-bc18a276b777\") " pod="openshift-machine-config-operator/machine-config-server-mr499" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894642 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7667c5a1-aecb-4ccd-b8fd-e20c2c049472-apiservice-cert\") pod \"packageserver-d55dfcdfc-x85vd\" (UID: \"7667c5a1-aecb-4ccd-b8fd-e20c2c049472\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894658 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4372b422-23c7-46bc-aec4-aef665acbda1-config\") pod \"controller-manager-879f6c89f-pvbpt\" (UID: \"4372b422-23c7-46bc-aec4-aef665acbda1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894701 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-console-config\") pod \"console-f9d7485db-7l2pm\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894716 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7667c5a1-aecb-4ccd-b8fd-e20c2c049472-webhook-cert\") pod \"packageserver-d55dfcdfc-x85vd\" (UID: \"7667c5a1-aecb-4ccd-b8fd-e20c2c049472\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894793 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gq4w8\" (UniqueName: \"kubernetes.io/projected/0ea2e803-34d0-429b-b943-ece0b9e38b63-kube-api-access-gq4w8\") pod \"console-f9d7485db-7l2pm\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894857 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c8d8e11d-3717-47fd-a5c6-b8f52f19147b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-nqrmd\" (UID: \"c8d8e11d-3717-47fd-a5c6-b8f52f19147b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nqrmd" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894924 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt5s8\" (UniqueName: \"kubernetes.io/projected/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-kube-api-access-xt5s8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894942 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zr5q\" (UniqueName: \"kubernetes.io/projected/f794406f-fc28-4e2f-953d-ab45e36cc754-kube-api-access-9zr5q\") pod \"migrator-59844c95c7-c6rz7\" (UID: \"f794406f-fc28-4e2f-953d-ab45e36cc754\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c6rz7" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894955 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" 
(UniqueName: \"kubernetes.io/secret/7d27543e-df10-41f7-be85-dfe319aaec8a-srv-cert\") pod \"catalog-operator-68c6474976-qtdtw\" (UID: \"7d27543e-df10-41f7-be85-dfe319aaec8a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.894997 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c7e70de-de85-421c-aaeb-476450d8e0ee-serving-cert\") pod \"openshift-config-operator-7777fb866f-7zpw2\" (UID: \"9c7e70de-de85-421c-aaeb-476450d8e0ee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.895038 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-bound-sa-token\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.895053 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0a78868f-1786-430d-8df8-18bb1c2019b3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p8r99\" (UID: \"0a78868f-1786-430d-8df8-18bb1c2019b3\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.895070 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6456920b-69b3-4ce9-9eaa-ad8e0fde2aa4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-lwzpn\" (UID: \"6456920b-69b3-4ce9-9eaa-ad8e0fde2aa4\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lwzpn" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.895086 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/66ebc90f-88a0-476c-98d6-c595517196b3-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xjx7v\" (UID: \"66ebc90f-88a0-476c-98d6-c595517196b3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjx7v" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.895101 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4sls\" (UniqueName: \"kubernetes.io/projected/7667c5a1-aecb-4ccd-b8fd-e20c2c049472-kube-api-access-f4sls\") pod \"packageserver-d55dfcdfc-x85vd\" (UID: \"7667c5a1-aecb-4ccd-b8fd-e20c2c049472\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.895136 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1a01ab05-7178-48c7-892b-b91cf60432f8-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.895152 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrxwz\" (UniqueName: \"kubernetes.io/projected/7d27543e-df10-41f7-be85-dfe319aaec8a-kube-api-access-nrxwz\") pod \"catalog-operator-68c6474976-qtdtw\" (UID: \"7d27543e-df10-41f7-be85-dfe319aaec8a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.895168 4898 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0ea2e803-34d0-429b-b943-ece0b9e38b63-console-oauth-config\") pod \"console-f9d7485db-7l2pm\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.895184 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9dnr\" (UniqueName: \"kubernetes.io/projected/d1df7055-9dee-4cde-a787-bc18a276b777-kube-api-access-k9dnr\") pod \"machine-config-server-mr499\" (UID: \"d1df7055-9dee-4cde-a787-bc18a276b777\") " pod="openshift-machine-config-operator/machine-config-server-mr499" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.895200 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9af6faad-479e-481b-9f66-d074c1c20ce8-images\") pod \"machine-config-operator-74547568cd-98qz7\" (UID: \"9af6faad-479e-481b-9f66-d074c1c20ce8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-98qz7" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.895234 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbnpf\" (UniqueName: \"kubernetes.io/projected/0a78868f-1786-430d-8df8-18bb1c2019b3-kube-api-access-rbnpf\") pod \"marketplace-operator-79b997595-p8r99\" (UID: \"0a78868f-1786-430d-8df8-18bb1c2019b3\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.895248 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6456920b-69b3-4ce9-9eaa-ad8e0fde2aa4-config\") pod \"kube-apiserver-operator-766d6c64bb-lwzpn\" (UID: \"6456920b-69b3-4ce9-9eaa-ad8e0fde2aa4\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lwzpn" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.895262 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqqxt\" (UniqueName: \"kubernetes.io/projected/c8b0b1cf-022c-4181-a957-2f7e172a3294-kube-api-access-cqqxt\") pod \"olm-operator-6b444d44fb-k6lrz\" (UID: \"c8b0b1cf-022c-4181-a957-2f7e172a3294\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.895278 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1a01ab05-7178-48c7-892b-b91cf60432f8-encryption-config\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.895312 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4372b422-23c7-46bc-aec4-aef665acbda1-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pvbpt\" (UID: \"4372b422-23c7-46bc-aec4-aef665acbda1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.895328 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad924960-c3fd-4412-9b39-0723a598d86d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-mrr5j\" (UID: \"ad924960-c3fd-4412-9b39-0723a598d86d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mrr5j" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.895396 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-t9zcr\" (UniqueName: \"kubernetes.io/projected/8675f0f0-7d3b-41d9-959e-e73f78f32c5c-kube-api-access-t9zcr\") pod \"package-server-manager-789f6589d5-x4nxn\" (UID: \"8675f0f0-7d3b-41d9-959e-e73f78f32c5c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x4nxn" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.895419 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9af6faad-479e-481b-9f66-d074c1c20ce8-proxy-tls\") pod \"machine-config-operator-74547568cd-98qz7\" (UID: \"9af6faad-479e-481b-9f66-d074c1c20ce8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-98qz7" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.895440 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv52f\" (UniqueName: \"kubernetes.io/projected/6444bf97-84ef-49df-afcd-4e939a5de2ad-kube-api-access-nv52f\") pod \"control-plane-machine-set-operator-78cbb6b69f-qt7gm\" (UID: \"6444bf97-84ef-49df-afcd-4e939a5de2ad\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qt7gm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.895489 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2w5p\" (UniqueName: \"kubernetes.io/projected/9c7e70de-de85-421c-aaeb-476450d8e0ee-kube-api-access-h2w5p\") pod \"openshift-config-operator-7777fb866f-7zpw2\" (UID: \"9c7e70de-de85-421c-aaeb-476450d8e0ee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.895506 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f52c1025-32e7-4eba-8af4-5c5cce1918da-config-volume\") pod \"collect-profiles-29556825-92fd8\" 
(UID: \"f52c1025-32e7-4eba-8af4-5c5cce1918da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.897404 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4372b422-23c7-46bc-aec4-aef665acbda1-config\") pod \"controller-manager-879f6c89f-pvbpt\" (UID: \"4372b422-23c7-46bc-aec4-aef665acbda1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.897483 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1a01ab05-7178-48c7-892b-b91cf60432f8-etcd-client\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.897532 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-285rt\" (UniqueName: \"kubernetes.io/projected/22cb0051-a6f4-4790-b51c-3da149327edd-kube-api-access-285rt\") pod \"ingress-canary-2ps4n\" (UID: \"22cb0051-a6f4-4790-b51c-3da149327edd\") " pod="openshift-ingress-canary/ingress-canary-2ps4n" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.897556 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a78868f-1786-430d-8df8-18bb1c2019b3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p8r99\" (UID: \"0a78868f-1786-430d-8df8-18bb1c2019b3\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.897591 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq68p\" (UniqueName: 
\"kubernetes.io/projected/c8d8e11d-3717-47fd-a5c6-b8f52f19147b-kube-api-access-dq68p\") pod \"machine-config-controller-84d6567774-nqrmd\" (UID: \"c8d8e11d-3717-47fd-a5c6-b8f52f19147b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nqrmd" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.897627 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1a01ab05-7178-48c7-892b-b91cf60432f8-audit-policies\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.897651 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e6be656-c448-4b38-b5a8-2401ab767c54-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-xwdnn\" (UID: \"2e6be656-c448-4b38-b5a8-2401ab767c54\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xwdnn" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.897673 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c8b0b1cf-022c-4181-a957-2f7e172a3294-profile-collector-cert\") pod \"olm-operator-6b444d44fb-k6lrz\" (UID: \"c8b0b1cf-022c-4181-a957-2f7e172a3294\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.897689 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9969w\" (UniqueName: \"kubernetes.io/projected/41000ce4-1a84-44de-b283-1fe0350b1c17-kube-api-access-9969w\") pod \"csi-hostpathplugin-rd22p\" (UID: \"41000ce4-1a84-44de-b283-1fe0350b1c17\") " 
pod="hostpath-provisioner/csi-hostpathplugin-rd22p" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.897733 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f726v\" (UniqueName: \"kubernetes.io/projected/1a01ab05-7178-48c7-892b-b91cf60432f8-kube-api-access-f726v\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.897753 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a0416bba-76e3-4312-94e0-ac5b77c6ace0-metrics-tls\") pod \"dns-default-hqcs6\" (UID: \"a0416bba-76e3-4312-94e0-ac5b77c6ace0\") " pod="openshift-dns/dns-default-hqcs6" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.897796 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vv689\" (UniqueName: \"kubernetes.io/projected/4372b422-23c7-46bc-aec4-aef665acbda1-kube-api-access-vv689\") pod \"controller-manager-879f6c89f-pvbpt\" (UID: \"4372b422-23c7-46bc-aec4-aef665acbda1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.897812 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1a01ab05-7178-48c7-892b-b91cf60432f8-audit-dir\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.897861 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d1225410-7280-4409-8934-c6766eae5088-signing-cabundle\") pod \"service-ca-9c57cc56f-z5r8j\" (UID: 
\"d1225410-7280-4409-8934-c6766eae5088\") " pod="openshift-service-ca/service-ca-9c57cc56f-z5r8j" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.897912 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvtz8\" (UniqueName: \"kubernetes.io/projected/f4f26c0f-992a-4eb4-86d2-58e42a5b2b68-kube-api-access-tvtz8\") pod \"downloads-7954f5f757-cx59b\" (UID: \"f4f26c0f-992a-4eb4-86d2-58e42a5b2b68\") " pod="openshift-console/downloads-7954f5f757-cx59b" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.897931 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a0416bba-76e3-4312-94e0-ac5b77c6ace0-config-volume\") pod \"dns-default-hqcs6\" (UID: \"a0416bba-76e3-4312-94e0-ac5b77c6ace0\") " pod="openshift-dns/dns-default-hqcs6" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.897985 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-trusted-ca\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.898003 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad924960-c3fd-4412-9b39-0723a598d86d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-mrr5j\" (UID: \"ad924960-c3fd-4412-9b39-0723a598d86d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mrr5j" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.898060 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ad924960-c3fd-4412-9b39-0723a598d86d-config\") pod \"kube-controller-manager-operator-78b949d7b-mrr5j\" (UID: \"ad924960-c3fd-4412-9b39-0723a598d86d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mrr5j" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.898087 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/7667c5a1-aecb-4ccd-b8fd-e20c2c049472-tmpfs\") pod \"packageserver-d55dfcdfc-x85vd\" (UID: \"7667c5a1-aecb-4ccd-b8fd-e20c2c049472\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.898203 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-oauth-serving-cert\") pod \"console-f9d7485db-7l2pm\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:49 crc kubenswrapper[4898]: E0313 13:59:49.898253 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:50.398235637 +0000 UTC m=+225.399823866 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.898308 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6456920b-69b3-4ce9-9eaa-ad8e0fde2aa4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-lwzpn\" (UID: \"6456920b-69b3-4ce9-9eaa-ad8e0fde2aa4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lwzpn" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.898442 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqn5q\" (UniqueName: \"kubernetes.io/projected/f52c1025-32e7-4eba-8af4-5c5cce1918da-kube-api-access-bqn5q\") pod \"collect-profiles-29556825-92fd8\" (UID: \"f52c1025-32e7-4eba-8af4-5c5cce1918da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.898474 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7p5d\" (UniqueName: \"kubernetes.io/projected/aa1ed4c8-e4bd-4352-bee3-404f16244ea3-kube-api-access-c7p5d\") pod \"auto-csr-approver-29556838-h7pkr\" (UID: \"aa1ed4c8-e4bd-4352-bee3-404f16244ea3\") " pod="openshift-infra/auto-csr-approver-29556838-h7pkr" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.898743 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1a01ab05-7178-48c7-892b-b91cf60432f8-serving-cert\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.898778 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9c7e70de-de85-421c-aaeb-476450d8e0ee-available-featuregates\") pod \"openshift-config-operator-7777fb866f-7zpw2\" (UID: \"9c7e70de-de85-421c-aaeb-476450d8e0ee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.898827 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c8d8e11d-3717-47fd-a5c6-b8f52f19147b-proxy-tls\") pod \"machine-config-controller-84d6567774-nqrmd\" (UID: \"c8d8e11d-3717-47fd-a5c6-b8f52f19147b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nqrmd" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.898859 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d1df7055-9dee-4cde-a787-bc18a276b777-certs\") pod \"machine-config-server-mr499\" (UID: \"d1df7055-9dee-4cde-a787-bc18a276b777\") " pod="openshift-machine-config-operator/machine-config-server-mr499" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.898916 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9af6faad-479e-481b-9f66-d074c1c20ce8-auth-proxy-config\") pod \"machine-config-operator-74547568cd-98qz7\" (UID: \"9af6faad-479e-481b-9f66-d074c1c20ce8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-98qz7" Mar 13 
13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.900161 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9c7e70de-de85-421c-aaeb-476450d8e0ee-available-featuregates\") pod \"openshift-config-operator-7777fb866f-7zpw2\" (UID: \"9c7e70de-de85-421c-aaeb-476450d8e0ee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.900274 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-oauth-serving-cert\") pod \"console-f9d7485db-7l2pm\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.900998 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-console-config\") pod \"console-f9d7485db-7l2pm\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.901070 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4372b422-23c7-46bc-aec4-aef665acbda1-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pvbpt\" (UID: \"4372b422-23c7-46bc-aec4-aef665acbda1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.902851 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-trusted-ca\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.903196 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1a01ab05-7178-48c7-892b-b91cf60432f8-audit-dir\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.903214 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1a01ab05-7178-48c7-892b-b91cf60432f8-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.903372 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1a01ab05-7178-48c7-892b-b91cf60432f8-audit-policies\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.904219 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.904293 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-trusted-ca-bundle\") pod \"console-f9d7485db-7l2pm\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " 
pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.905047 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a01ab05-7178-48c7-892b-b91cf60432f8-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.905517 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-registry-certificates\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.905646 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4372b422-23c7-46bc-aec4-aef665acbda1-client-ca\") pod \"controller-manager-879f6c89f-pvbpt\" (UID: \"4372b422-23c7-46bc-aec4-aef665acbda1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.905781 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-service-ca\") pod \"console-f9d7485db-7l2pm\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.905867 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a01ab05-7178-48c7-892b-b91cf60432f8-serving-cert\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.906064 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c7e70de-de85-421c-aaeb-476450d8e0ee-serving-cert\") pod \"openshift-config-operator-7777fb866f-7zpw2\" (UID: \"9c7e70de-de85-421c-aaeb-476450d8e0ee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.906730 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5jp6r"] Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.911648 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1a01ab05-7178-48c7-892b-b91cf60432f8-encryption-config\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.911818 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-registry-tls\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.911875 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.913047 4898 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0ea2e803-34d0-429b-b943-ece0b9e38b63-console-oauth-config\") pod \"console-f9d7485db-7l2pm\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.914298 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ea2e803-34d0-429b-b943-ece0b9e38b63-console-serving-cert\") pod \"console-f9d7485db-7l2pm\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.916352 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4372b422-23c7-46bc-aec4-aef665acbda1-serving-cert\") pod \"controller-manager-879f6c89f-pvbpt\" (UID: \"4372b422-23c7-46bc-aec4-aef665acbda1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.917661 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1a01ab05-7178-48c7-892b-b91cf60432f8-etcd-client\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.922519 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-6plhg" event={"ID":"18e5c8bf-9fe0-465e-af8f-9e7ec7400be8","Type":"ContainerStarted","Data":"c62d0b0db8a023eca0369719b9dd81ab2eadb339e637fa7223f0ac36593bb07b"} Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.922566 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-6plhg" 
event={"ID":"18e5c8bf-9fe0-465e-af8f-9e7ec7400be8","Type":"ContainerStarted","Data":"07d462c95de220b26b61c71fe04f7db640c2a0648bbcb012450f83b22cbe5d4a"} Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.926558 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9vdwm" event={"ID":"ade74420-c7a1-4b89-b6c8-7970d7b6c17c","Type":"ContainerStarted","Data":"a90eb9072360ab6ffcc9dd0976c83ee1c38d5248c10036f6319955fcd85b0714"} Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.926588 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9vdwm" event={"ID":"ade74420-c7a1-4b89-b6c8-7970d7b6c17c","Type":"ContainerStarted","Data":"c3bb23c34123cbab89cddad63cb25b898b7ddce6417f31ab796a400379737021"} Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.927534 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2w5p\" (UniqueName: \"kubernetes.io/projected/9c7e70de-de85-421c-aaeb-476450d8e0ee-kube-api-access-h2w5p\") pod \"openshift-config-operator-7777fb866f-7zpw2\" (UID: \"9c7e70de-de85-421c-aaeb-476450d8e0ee\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.929345 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-whtgq" event={"ID":"096d3786-85e8-4fe5-82b3-57cd1be251a1","Type":"ContainerStarted","Data":"ddb16b36ce9e24c86f9837e204782820857278d7a7741a6518de23e51d82b48d"} Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.931462 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" event={"ID":"6f12557e-02f5-4445-988f-b19f16672e3b","Type":"ContainerStarted","Data":"10a096fb4e024f13b0375adff8f9af56e97cc2b0078dae71d54e42f6db24f3c3"} Mar 13 13:59:49 crc 
kubenswrapper[4898]: I0313 13:59:49.933879 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xxjrs" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.957326 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-bound-sa-token\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.979227 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rx7x"] Mar 13 13:59:49 crc kubenswrapper[4898]: I0313 13:59:49.987705 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f726v\" (UniqueName: \"kubernetes.io/projected/1a01ab05-7178-48c7-892b-b91cf60432f8-kube-api-access-f726v\") pod \"apiserver-7bbb656c7d-g559r\" (UID: \"1a01ab05-7178-48c7-892b-b91cf60432f8\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.000557 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztzgg\" (UniqueName: \"kubernetes.io/projected/d1225410-7280-4409-8934-c6766eae5088-kube-api-access-ztzgg\") pod \"service-ca-9c57cc56f-z5r8j\" (UID: \"d1225410-7280-4409-8934-c6766eae5088\") " pod="openshift-service-ca/service-ca-9c57cc56f-z5r8j" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.000601 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6444bf97-84ef-49df-afcd-4e939a5de2ad-control-plane-machine-set-operator-tls\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-qt7gm\" (UID: \"6444bf97-84ef-49df-afcd-4e939a5de2ad\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qt7gm" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.000627 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c8b0b1cf-022c-4181-a957-2f7e172a3294-srv-cert\") pod \"olm-operator-6b444d44fb-k6lrz\" (UID: \"c8b0b1cf-022c-4181-a957-2f7e172a3294\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.000649 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/22cb0051-a6f4-4790-b51c-3da149327edd-cert\") pod \"ingress-canary-2ps4n\" (UID: \"22cb0051-a6f4-4790-b51c-3da149327edd\") " pod="openshift-ingress-canary/ingress-canary-2ps4n" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.000670 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8675f0f0-7d3b-41d9-959e-e73f78f32c5c-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-x4nxn\" (UID: \"8675f0f0-7d3b-41d9-959e-e73f78f32c5c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x4nxn" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.000695 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/55a96934-e740-402f-b4af-488a7eba53ae-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-xglpf\" (UID: \"55a96934-e740-402f-b4af-488a7eba53ae\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-xglpf" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.000720 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-hckk7\" (UniqueName: \"kubernetes.io/projected/55a96934-e740-402f-b4af-488a7eba53ae-kube-api-access-hckk7\") pod \"multus-admission-controller-857f4d67dd-xglpf\" (UID: \"55a96934-e740-402f-b4af-488a7eba53ae\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-xglpf" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.000740 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f52c1025-32e7-4eba-8af4-5c5cce1918da-secret-volume\") pod \"collect-profiles-29556825-92fd8\" (UID: \"f52c1025-32e7-4eba-8af4-5c5cce1918da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.000759 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66ebc90f-88a0-476c-98d6-c595517196b3-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xjx7v\" (UID: \"66ebc90f-88a0-476c-98d6-c595517196b3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjx7v" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.000780 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e6be656-c448-4b38-b5a8-2401ab767c54-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-xwdnn\" (UID: \"2e6be656-c448-4b38-b5a8-2401ab767c54\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xwdnn" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.000802 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/41000ce4-1a84-44de-b283-1fe0350b1c17-plugins-dir\") pod \"csi-hostpathplugin-rd22p\" (UID: \"41000ce4-1a84-44de-b283-1fe0350b1c17\") " 
pod="hostpath-provisioner/csi-hostpathplugin-rd22p" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.000824 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/41000ce4-1a84-44de-b283-1fe0350b1c17-csi-data-dir\") pod \"csi-hostpathplugin-rd22p\" (UID: \"41000ce4-1a84-44de-b283-1fe0350b1c17\") " pod="hostpath-provisioner/csi-hostpathplugin-rd22p" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.000847 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5gcl\" (UniqueName: \"kubernetes.io/projected/9af6faad-479e-481b-9f66-d074c1c20ce8-kube-api-access-b5gcl\") pod \"machine-config-operator-74547568cd-98qz7\" (UID: \"9af6faad-479e-481b-9f66-d074c1c20ce8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-98qz7" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.000866 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66ebc90f-88a0-476c-98d6-c595517196b3-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xjx7v\" (UID: \"66ebc90f-88a0-476c-98d6-c595517196b3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjx7v" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.000888 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbhjt\" (UniqueName: \"kubernetes.io/projected/2e6be656-c448-4b38-b5a8-2401ab767c54-kube-api-access-qbhjt\") pod \"kube-storage-version-migrator-operator-b67b599dd-xwdnn\" (UID: \"2e6be656-c448-4b38-b5a8-2401ab767c54\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xwdnn" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.000974 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/dfaa00dc-cff6-47b5-878f-886fab80071b-config\") pod \"service-ca-operator-777779d784-cvbms\" (UID: \"dfaa00dc-cff6-47b5-878f-886fab80071b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cvbms" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.000996 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfaa00dc-cff6-47b5-878f-886fab80071b-serving-cert\") pod \"service-ca-operator-777779d784-cvbms\" (UID: \"dfaa00dc-cff6-47b5-878f-886fab80071b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cvbms" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.001321 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/41000ce4-1a84-44de-b283-1fe0350b1c17-plugins-dir\") pod \"csi-hostpathplugin-rd22p\" (UID: \"41000ce4-1a84-44de-b283-1fe0350b1c17\") " pod="hostpath-provisioner/csi-hostpathplugin-rd22p" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.001758 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8chc\" (UniqueName: \"kubernetes.io/projected/a0416bba-76e3-4312-94e0-ac5b77c6ace0-kube-api-access-j8chc\") pod \"dns-default-hqcs6\" (UID: \"a0416bba-76e3-4312-94e0-ac5b77c6ace0\") " pod="openshift-dns/dns-default-hqcs6" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.001795 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/41000ce4-1a84-44de-b283-1fe0350b1c17-mountpoint-dir\") pod \"csi-hostpathplugin-rd22p\" (UID: \"41000ce4-1a84-44de-b283-1fe0350b1c17\") " pod="hostpath-provisioner/csi-hostpathplugin-rd22p" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.001824 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/41000ce4-1a84-44de-b283-1fe0350b1c17-registration-dir\") pod \"csi-hostpathplugin-rd22p\" (UID: \"41000ce4-1a84-44de-b283-1fe0350b1c17\") " pod="hostpath-provisioner/csi-hostpathplugin-rd22p" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.001848 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fk2f8\" (UniqueName: \"kubernetes.io/projected/dfaa00dc-cff6-47b5-878f-886fab80071b-kube-api-access-fk2f8\") pod \"service-ca-operator-777779d784-cvbms\" (UID: \"dfaa00dc-cff6-47b5-878f-886fab80071b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cvbms" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.001871 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7d27543e-df10-41f7-be85-dfe319aaec8a-profile-collector-cert\") pod \"catalog-operator-68c6474976-qtdtw\" (UID: \"7d27543e-df10-41f7-be85-dfe319aaec8a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.001892 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/41000ce4-1a84-44de-b283-1fe0350b1c17-socket-dir\") pod \"csi-hostpathplugin-rd22p\" (UID: \"41000ce4-1a84-44de-b283-1fe0350b1c17\") " pod="hostpath-provisioner/csi-hostpathplugin-rd22p" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.001959 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d1df7055-9dee-4cde-a787-bc18a276b777-node-bootstrap-token\") pod \"machine-config-server-mr499\" (UID: \"d1df7055-9dee-4cde-a787-bc18a276b777\") " pod="openshift-machine-config-operator/machine-config-server-mr499" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.001981 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7667c5a1-aecb-4ccd-b8fd-e20c2c049472-apiservice-cert\") pod \"packageserver-d55dfcdfc-x85vd\" (UID: \"7667c5a1-aecb-4ccd-b8fd-e20c2c049472\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002006 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7667c5a1-aecb-4ccd-b8fd-e20c2c049472-webhook-cert\") pod \"packageserver-d55dfcdfc-x85vd\" (UID: \"7667c5a1-aecb-4ccd-b8fd-e20c2c049472\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002052 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c8d8e11d-3717-47fd-a5c6-b8f52f19147b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-nqrmd\" (UID: \"c8d8e11d-3717-47fd-a5c6-b8f52f19147b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nqrmd" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002083 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002112 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zr5q\" (UniqueName: \"kubernetes.io/projected/f794406f-fc28-4e2f-953d-ab45e36cc754-kube-api-access-9zr5q\") pod \"migrator-59844c95c7-c6rz7\" (UID: \"f794406f-fc28-4e2f-953d-ab45e36cc754\") " 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c6rz7" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002137 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7d27543e-df10-41f7-be85-dfe319aaec8a-srv-cert\") pod \"catalog-operator-68c6474976-qtdtw\" (UID: \"7d27543e-df10-41f7-be85-dfe319aaec8a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002200 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0a78868f-1786-430d-8df8-18bb1c2019b3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p8r99\" (UID: \"0a78868f-1786-430d-8df8-18bb1c2019b3\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002230 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6456920b-69b3-4ce9-9eaa-ad8e0fde2aa4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-lwzpn\" (UID: \"6456920b-69b3-4ce9-9eaa-ad8e0fde2aa4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lwzpn" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002248 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/66ebc90f-88a0-476c-98d6-c595517196b3-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xjx7v\" (UID: \"66ebc90f-88a0-476c-98d6-c595517196b3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjx7v" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002266 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4sls\" 
(UniqueName: \"kubernetes.io/projected/7667c5a1-aecb-4ccd-b8fd-e20c2c049472-kube-api-access-f4sls\") pod \"packageserver-d55dfcdfc-x85vd\" (UID: \"7667c5a1-aecb-4ccd-b8fd-e20c2c049472\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002283 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrxwz\" (UniqueName: \"kubernetes.io/projected/7d27543e-df10-41f7-be85-dfe319aaec8a-kube-api-access-nrxwz\") pod \"catalog-operator-68c6474976-qtdtw\" (UID: \"7d27543e-df10-41f7-be85-dfe319aaec8a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002300 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9dnr\" (UniqueName: \"kubernetes.io/projected/d1df7055-9dee-4cde-a787-bc18a276b777-kube-api-access-k9dnr\") pod \"machine-config-server-mr499\" (UID: \"d1df7055-9dee-4cde-a787-bc18a276b777\") " pod="openshift-machine-config-operator/machine-config-server-mr499" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002318 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9af6faad-479e-481b-9f66-d074c1c20ce8-images\") pod \"machine-config-operator-74547568cd-98qz7\" (UID: \"9af6faad-479e-481b-9f66-d074c1c20ce8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-98qz7" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002337 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbnpf\" (UniqueName: \"kubernetes.io/projected/0a78868f-1786-430d-8df8-18bb1c2019b3-kube-api-access-rbnpf\") pod \"marketplace-operator-79b997595-p8r99\" (UID: \"0a78868f-1786-430d-8df8-18bb1c2019b3\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" Mar 13 13:59:50 crc 
kubenswrapper[4898]: I0313 13:59:50.002353 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6456920b-69b3-4ce9-9eaa-ad8e0fde2aa4-config\") pod \"kube-apiserver-operator-766d6c64bb-lwzpn\" (UID: \"6456920b-69b3-4ce9-9eaa-ad8e0fde2aa4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lwzpn" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002371 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqqxt\" (UniqueName: \"kubernetes.io/projected/c8b0b1cf-022c-4181-a957-2f7e172a3294-kube-api-access-cqqxt\") pod \"olm-operator-6b444d44fb-k6lrz\" (UID: \"c8b0b1cf-022c-4181-a957-2f7e172a3294\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002398 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad924960-c3fd-4412-9b39-0723a598d86d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-mrr5j\" (UID: \"ad924960-c3fd-4412-9b39-0723a598d86d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mrr5j" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002423 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9zcr\" (UniqueName: \"kubernetes.io/projected/8675f0f0-7d3b-41d9-959e-e73f78f32c5c-kube-api-access-t9zcr\") pod \"package-server-manager-789f6589d5-x4nxn\" (UID: \"8675f0f0-7d3b-41d9-959e-e73f78f32c5c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x4nxn" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002594 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9af6faad-479e-481b-9f66-d074c1c20ce8-proxy-tls\") pod 
\"machine-config-operator-74547568cd-98qz7\" (UID: \"9af6faad-479e-481b-9f66-d074c1c20ce8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-98qz7" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002612 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nv52f\" (UniqueName: \"kubernetes.io/projected/6444bf97-84ef-49df-afcd-4e939a5de2ad-kube-api-access-nv52f\") pod \"control-plane-machine-set-operator-78cbb6b69f-qt7gm\" (UID: \"6444bf97-84ef-49df-afcd-4e939a5de2ad\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qt7gm" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002634 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f52c1025-32e7-4eba-8af4-5c5cce1918da-config-volume\") pod \"collect-profiles-29556825-92fd8\" (UID: \"f52c1025-32e7-4eba-8af4-5c5cce1918da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002658 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-285rt\" (UniqueName: \"kubernetes.io/projected/22cb0051-a6f4-4790-b51c-3da149327edd-kube-api-access-285rt\") pod \"ingress-canary-2ps4n\" (UID: \"22cb0051-a6f4-4790-b51c-3da149327edd\") " pod="openshift-ingress-canary/ingress-canary-2ps4n" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002678 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a78868f-1786-430d-8df8-18bb1c2019b3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p8r99\" (UID: \"0a78868f-1786-430d-8df8-18bb1c2019b3\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002703 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dq68p\" (UniqueName: \"kubernetes.io/projected/c8d8e11d-3717-47fd-a5c6-b8f52f19147b-kube-api-access-dq68p\") pod \"machine-config-controller-84d6567774-nqrmd\" (UID: \"c8d8e11d-3717-47fd-a5c6-b8f52f19147b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nqrmd" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002746 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e6be656-c448-4b38-b5a8-2401ab767c54-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-xwdnn\" (UID: \"2e6be656-c448-4b38-b5a8-2401ab767c54\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xwdnn" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002770 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c8b0b1cf-022c-4181-a957-2f7e172a3294-profile-collector-cert\") pod \"olm-operator-6b444d44fb-k6lrz\" (UID: \"c8b0b1cf-022c-4181-a957-2f7e172a3294\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002792 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9969w\" (UniqueName: \"kubernetes.io/projected/41000ce4-1a84-44de-b283-1fe0350b1c17-kube-api-access-9969w\") pod \"csi-hostpathplugin-rd22p\" (UID: \"41000ce4-1a84-44de-b283-1fe0350b1c17\") " pod="hostpath-provisioner/csi-hostpathplugin-rd22p" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002820 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a0416bba-76e3-4312-94e0-ac5b77c6ace0-metrics-tls\") pod \"dns-default-hqcs6\" (UID: \"a0416bba-76e3-4312-94e0-ac5b77c6ace0\") " 
pod="openshift-dns/dns-default-hqcs6" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002865 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d1225410-7280-4409-8934-c6766eae5088-signing-cabundle\") pod \"service-ca-9c57cc56f-z5r8j\" (UID: \"d1225410-7280-4409-8934-c6766eae5088\") " pod="openshift-service-ca/service-ca-9c57cc56f-z5r8j" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002929 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a0416bba-76e3-4312-94e0-ac5b77c6ace0-config-volume\") pod \"dns-default-hqcs6\" (UID: \"a0416bba-76e3-4312-94e0-ac5b77c6ace0\") " pod="openshift-dns/dns-default-hqcs6" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002957 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad924960-c3fd-4412-9b39-0723a598d86d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-mrr5j\" (UID: \"ad924960-c3fd-4412-9b39-0723a598d86d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mrr5j" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.002981 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad924960-c3fd-4412-9b39-0723a598d86d-config\") pod \"kube-controller-manager-operator-78b949d7b-mrr5j\" (UID: \"ad924960-c3fd-4412-9b39-0723a598d86d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mrr5j" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.003002 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/7667c5a1-aecb-4ccd-b8fd-e20c2c049472-tmpfs\") pod \"packageserver-d55dfcdfc-x85vd\" (UID: 
\"7667c5a1-aecb-4ccd-b8fd-e20c2c049472\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.003038 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6456920b-69b3-4ce9-9eaa-ad8e0fde2aa4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-lwzpn\" (UID: \"6456920b-69b3-4ce9-9eaa-ad8e0fde2aa4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lwzpn" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.003065 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqn5q\" (UniqueName: \"kubernetes.io/projected/f52c1025-32e7-4eba-8af4-5c5cce1918da-kube-api-access-bqn5q\") pod \"collect-profiles-29556825-92fd8\" (UID: \"f52c1025-32e7-4eba-8af4-5c5cce1918da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.003117 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7p5d\" (UniqueName: \"kubernetes.io/projected/aa1ed4c8-e4bd-4352-bee3-404f16244ea3-kube-api-access-c7p5d\") pod \"auto-csr-approver-29556838-h7pkr\" (UID: \"aa1ed4c8-e4bd-4352-bee3-404f16244ea3\") " pod="openshift-infra/auto-csr-approver-29556838-h7pkr" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.003158 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c8d8e11d-3717-47fd-a5c6-b8f52f19147b-proxy-tls\") pod \"machine-config-controller-84d6567774-nqrmd\" (UID: \"c8d8e11d-3717-47fd-a5c6-b8f52f19147b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nqrmd" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.003184 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/d1df7055-9dee-4cde-a787-bc18a276b777-certs\") pod \"machine-config-server-mr499\" (UID: \"d1df7055-9dee-4cde-a787-bc18a276b777\") " pod="openshift-machine-config-operator/machine-config-server-mr499" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.003206 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9af6faad-479e-481b-9f66-d074c1c20ce8-auth-proxy-config\") pod \"machine-config-operator-74547568cd-98qz7\" (UID: \"9af6faad-479e-481b-9f66-d074c1c20ce8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-98qz7" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.003229 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d1225410-7280-4409-8934-c6766eae5088-signing-key\") pod \"service-ca-9c57cc56f-z5r8j\" (UID: \"d1225410-7280-4409-8934-c6766eae5088\") " pod="openshift-service-ca/service-ca-9c57cc56f-z5r8j" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.005070 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/41000ce4-1a84-44de-b283-1fe0350b1c17-socket-dir\") pod \"csi-hostpathplugin-rd22p\" (UID: \"41000ce4-1a84-44de-b283-1fe0350b1c17\") " pod="hostpath-provisioner/csi-hostpathplugin-rd22p" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.005146 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfaa00dc-cff6-47b5-878f-886fab80071b-config\") pod \"service-ca-operator-777779d784-cvbms\" (UID: \"dfaa00dc-cff6-47b5-878f-886fab80071b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cvbms" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.005230 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: 
\"kubernetes.io/host-path/41000ce4-1a84-44de-b283-1fe0350b1c17-csi-data-dir\") pod \"csi-hostpathplugin-rd22p\" (UID: \"41000ce4-1a84-44de-b283-1fe0350b1c17\") " pod="hostpath-provisioner/csi-hostpathplugin-rd22p" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.005762 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66ebc90f-88a0-476c-98d6-c595517196b3-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xjx7v\" (UID: \"66ebc90f-88a0-476c-98d6-c595517196b3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjx7v" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.008232 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e6be656-c448-4b38-b5a8-2401ab767c54-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-xwdnn\" (UID: \"2e6be656-c448-4b38-b5a8-2401ab767c54\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xwdnn" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.008837 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f52c1025-32e7-4eba-8af4-5c5cce1918da-config-volume\") pod \"collect-profiles-29556825-92fd8\" (UID: \"f52c1025-32e7-4eba-8af4-5c5cce1918da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.010401 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a0416bba-76e3-4312-94e0-ac5b77c6ace0-config-volume\") pod \"dns-default-hqcs6\" (UID: \"a0416bba-76e3-4312-94e0-ac5b77c6ace0\") " pod="openshift-dns/dns-default-hqcs6" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.010438 4898 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/41000ce4-1a84-44de-b283-1fe0350b1c17-mountpoint-dir\") pod \"csi-hostpathplugin-rd22p\" (UID: \"41000ce4-1a84-44de-b283-1fe0350b1c17\") " pod="hostpath-provisioner/csi-hostpathplugin-rd22p" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.010489 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/41000ce4-1a84-44de-b283-1fe0350b1c17-registration-dir\") pod \"csi-hostpathplugin-rd22p\" (UID: \"41000ce4-1a84-44de-b283-1fe0350b1c17\") " pod="hostpath-provisioner/csi-hostpathplugin-rd22p" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.011028 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9af6faad-479e-481b-9f66-d074c1c20ce8-images\") pod \"machine-config-operator-74547568cd-98qz7\" (UID: \"9af6faad-479e-481b-9f66-d074c1c20ce8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-98qz7" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.015061 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7667c5a1-aecb-4ccd-b8fd-e20c2c049472-webhook-cert\") pod \"packageserver-d55dfcdfc-x85vd\" (UID: \"7667c5a1-aecb-4ccd-b8fd-e20c2c049472\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.012221 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6456920b-69b3-4ce9-9eaa-ad8e0fde2aa4-config\") pod \"kube-apiserver-operator-766d6c64bb-lwzpn\" (UID: \"6456920b-69b3-4ce9-9eaa-ad8e0fde2aa4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lwzpn" Mar 13 13:59:50 crc kubenswrapper[4898]: E0313 13:59:50.012474 4898 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:50.512458202 +0000 UTC m=+225.514046441 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.012510 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66ebc90f-88a0-476c-98d6-c595517196b3-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xjx7v\" (UID: \"66ebc90f-88a0-476c-98d6-c595517196b3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjx7v" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.013118 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c8d8e11d-3717-47fd-a5c6-b8f52f19147b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-nqrmd\" (UID: \"c8d8e11d-3717-47fd-a5c6-b8f52f19147b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nqrmd" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.013242 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9af6faad-479e-481b-9f66-d074c1c20ce8-auth-proxy-config\") pod \"machine-config-operator-74547568cd-98qz7\" (UID: \"9af6faad-479e-481b-9f66-d074c1c20ce8\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-98qz7" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.013266 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/7667c5a1-aecb-4ccd-b8fd-e20c2c049472-tmpfs\") pod \"packageserver-d55dfcdfc-x85vd\" (UID: \"7667c5a1-aecb-4ccd-b8fd-e20c2c049472\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.014400 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad924960-c3fd-4412-9b39-0723a598d86d-config\") pod \"kube-controller-manager-operator-78b949d7b-mrr5j\" (UID: \"ad924960-c3fd-4412-9b39-0723a598d86d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mrr5j" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.011195 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a78868f-1786-430d-8df8-18bb1c2019b3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p8r99\" (UID: \"0a78868f-1786-430d-8df8-18bb1c2019b3\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.015678 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d1225410-7280-4409-8934-c6766eae5088-signing-cabundle\") pod \"service-ca-9c57cc56f-z5r8j\" (UID: \"d1225410-7280-4409-8934-c6766eae5088\") " pod="openshift-service-ca/service-ca-9c57cc56f-z5r8j" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.017702 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d1df7055-9dee-4cde-a787-bc18a276b777-node-bootstrap-token\") pod 
\"machine-config-server-mr499\" (UID: \"d1df7055-9dee-4cde-a787-bc18a276b777\") " pod="openshift-machine-config-operator/machine-config-server-mr499" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.018042 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a0416bba-76e3-4312-94e0-ac5b77c6ace0-metrics-tls\") pod \"dns-default-hqcs6\" (UID: \"a0416bba-76e3-4312-94e0-ac5b77c6ace0\") " pod="openshift-dns/dns-default-hqcs6" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.018420 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7d27543e-df10-41f7-be85-dfe319aaec8a-srv-cert\") pod \"catalog-operator-68c6474976-qtdtw\" (UID: \"7d27543e-df10-41f7-be85-dfe319aaec8a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.018813 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9af6faad-479e-481b-9f66-d074c1c20ce8-proxy-tls\") pod \"machine-config-operator-74547568cd-98qz7\" (UID: \"9af6faad-479e-481b-9f66-d074c1c20ce8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-98qz7" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.019384 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d1df7055-9dee-4cde-a787-bc18a276b777-certs\") pod \"machine-config-server-mr499\" (UID: \"d1df7055-9dee-4cde-a787-bc18a276b777\") " pod="openshift-machine-config-operator/machine-config-server-mr499" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.019637 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7d27543e-df10-41f7-be85-dfe319aaec8a-profile-collector-cert\") pod 
\"catalog-operator-68c6474976-qtdtw\" (UID: \"7d27543e-df10-41f7-be85-dfe319aaec8a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.020362 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/55a96934-e740-402f-b4af-488a7eba53ae-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-xglpf\" (UID: \"55a96934-e740-402f-b4af-488a7eba53ae\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-xglpf" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.020462 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e6be656-c448-4b38-b5a8-2401ab767c54-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-xwdnn\" (UID: \"2e6be656-c448-4b38-b5a8-2401ab767c54\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xwdnn" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.020555 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f52c1025-32e7-4eba-8af4-5c5cce1918da-secret-volume\") pod \"collect-profiles-29556825-92fd8\" (UID: \"f52c1025-32e7-4eba-8af4-5c5cce1918da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.020645 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c8b0b1cf-022c-4181-a957-2f7e172a3294-srv-cert\") pod \"olm-operator-6b444d44fb-k6lrz\" (UID: \"c8b0b1cf-022c-4181-a957-2f7e172a3294\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.020841 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"signing-key\" (UniqueName: \"kubernetes.io/secret/d1225410-7280-4409-8934-c6766eae5088-signing-key\") pod \"service-ca-9c57cc56f-z5r8j\" (UID: \"d1225410-7280-4409-8934-c6766eae5088\") " pod="openshift-service-ca/service-ca-9c57cc56f-z5r8j" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.021519 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6456920b-69b3-4ce9-9eaa-ad8e0fde2aa4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-lwzpn\" (UID: \"6456920b-69b3-4ce9-9eaa-ad8e0fde2aa4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lwzpn" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.022635 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvtz8\" (UniqueName: \"kubernetes.io/projected/f4f26c0f-992a-4eb4-86d2-58e42a5b2b68-kube-api-access-tvtz8\") pod \"downloads-7954f5f757-cx59b\" (UID: \"f4f26c0f-992a-4eb4-86d2-58e42a5b2b68\") " pod="openshift-console/downloads-7954f5f757-cx59b" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.023739 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c8d8e11d-3717-47fd-a5c6-b8f52f19147b-proxy-tls\") pod \"machine-config-controller-84d6567774-nqrmd\" (UID: \"c8d8e11d-3717-47fd-a5c6-b8f52f19147b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nqrmd" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.023835 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0a78868f-1786-430d-8df8-18bb1c2019b3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p8r99\" (UID: \"0a78868f-1786-430d-8df8-18bb1c2019b3\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.024082 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/22cb0051-a6f4-4790-b51c-3da149327edd-cert\") pod \"ingress-canary-2ps4n\" (UID: \"22cb0051-a6f4-4790-b51c-3da149327edd\") " pod="openshift-ingress-canary/ingress-canary-2ps4n" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.024150 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8675f0f0-7d3b-41d9-959e-e73f78f32c5c-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-x4nxn\" (UID: \"8675f0f0-7d3b-41d9-959e-e73f78f32c5c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x4nxn" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.024824 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfaa00dc-cff6-47b5-878f-886fab80071b-serving-cert\") pod \"service-ca-operator-777779d784-cvbms\" (UID: \"dfaa00dc-cff6-47b5-878f-886fab80071b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cvbms" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.025034 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c8b0b1cf-022c-4181-a957-2f7e172a3294-profile-collector-cert\") pod \"olm-operator-6b444d44fb-k6lrz\" (UID: \"c8b0b1cf-022c-4181-a957-2f7e172a3294\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.035673 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6444bf97-84ef-49df-afcd-4e939a5de2ad-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qt7gm\" (UID: \"6444bf97-84ef-49df-afcd-4e939a5de2ad\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qt7gm" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.035990 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad924960-c3fd-4412-9b39-0723a598d86d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-mrr5j\" (UID: \"ad924960-c3fd-4412-9b39-0723a598d86d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mrr5j" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.036383 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7667c5a1-aecb-4ccd-b8fd-e20c2c049472-apiservice-cert\") pod \"packageserver-d55dfcdfc-x85vd\" (UID: \"7667c5a1-aecb-4ccd-b8fd-e20c2c049472\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.036529 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv689\" (UniqueName: \"kubernetes.io/projected/4372b422-23c7-46bc-aec4-aef665acbda1-kube-api-access-vv689\") pod \"controller-manager-879f6c89f-pvbpt\" (UID: \"4372b422-23c7-46bc-aec4-aef665acbda1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.046733 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq4w8\" (UniqueName: \"kubernetes.io/projected/0ea2e803-34d0-429b-b943-ece0b9e38b63-kube-api-access-gq4w8\") pod \"console-f9d7485db-7l2pm\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.085526 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt5s8\" (UniqueName: 
\"kubernetes.io/projected/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-kube-api-access-xt5s8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.098865 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-cx59b" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.106816 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:50 crc kubenswrapper[4898]: E0313 13:59:50.106984 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:50.606955626 +0000 UTC m=+225.608543865 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.107152 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:50 crc kubenswrapper[4898]: E0313 13:59:50.107477 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:50.607470759 +0000 UTC m=+225.609058998 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.124022 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztzgg\" (UniqueName: \"kubernetes.io/projected/d1225410-7280-4409-8934-c6766eae5088-kube-api-access-ztzgg\") pod \"service-ca-9c57cc56f-z5r8j\" (UID: \"d1225410-7280-4409-8934-c6766eae5088\") " pod="openshift-service-ca/service-ca-9c57cc56f-z5r8j" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.135226 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv52f\" (UniqueName: \"kubernetes.io/projected/6444bf97-84ef-49df-afcd-4e939a5de2ad-kube-api-access-nv52f\") pod \"control-plane-machine-set-operator-78cbb6b69f-qt7gm\" (UID: \"6444bf97-84ef-49df-afcd-4e939a5de2ad\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qt7gm" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.138839 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-z5r8j" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.141540 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.143672 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-kgnxj"] Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.145499 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5gcl\" (UniqueName: \"kubernetes.io/projected/9af6faad-479e-481b-9f66-d074c1c20ce8-kube-api-access-b5gcl\") pod \"machine-config-operator-74547568cd-98qz7\" (UID: \"9af6faad-479e-481b-9f66-d074c1c20ce8\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-98qz7" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.151322 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 13:59:50 crc kubenswrapper[4898]: W0313 13:59:50.157972 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeedd2260_f339_4e2f_83e8_13a56cee2ce6.slice/crio-700c6042be9e525e8749e066e5d1abd797663a173007f1872c37d99868a2e4ae WatchSource:0}: Error finding container 700c6042be9e525e8749e066e5d1abd797663a173007f1872c37d99868a2e4ae: Status 404 returned error can't find the container with id 700c6042be9e525e8749e066e5d1abd797663a173007f1872c37d99868a2e4ae Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.168594 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbhjt\" (UniqueName: \"kubernetes.io/projected/2e6be656-c448-4b38-b5a8-2401ab767c54-kube-api-access-qbhjt\") pod \"kube-storage-version-migrator-operator-b67b599dd-xwdnn\" (UID: \"2e6be656-c448-4b38-b5a8-2401ab767c54\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xwdnn" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.188883 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hckk7\" (UniqueName: \"kubernetes.io/projected/55a96934-e740-402f-b4af-488a7eba53ae-kube-api-access-hckk7\") pod \"multus-admission-controller-857f4d67dd-xglpf\" (UID: \"55a96934-e740-402f-b4af-488a7eba53ae\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-xglpf" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.189135 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.199525 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.215165 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:50 crc kubenswrapper[4898]: E0313 13:59:50.216312 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:50.716294605 +0000 UTC m=+225.717882834 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.233308 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8chc\" (UniqueName: \"kubernetes.io/projected/a0416bba-76e3-4312-94e0-ac5b77c6ace0-kube-api-access-j8chc\") pod \"dns-default-hqcs6\" (UID: \"a0416bba-76e3-4312-94e0-ac5b77c6ace0\") " pod="openshift-dns/dns-default-hqcs6" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.237308 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-t2s2h"] Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.239861 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqn5q\" (UniqueName: \"kubernetes.io/projected/f52c1025-32e7-4eba-8af4-5c5cce1918da-kube-api-access-bqn5q\") pod \"collect-profiles-29556825-92fd8\" (UID: \"f52c1025-32e7-4eba-8af4-5c5cce1918da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.249987 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4sls\" (UniqueName: \"kubernetes.io/projected/7667c5a1-aecb-4ccd-b8fd-e20c2c049472-kube-api-access-f4sls\") pod \"packageserver-d55dfcdfc-x85vd\" (UID: \"7667c5a1-aecb-4ccd-b8fd-e20c2c049472\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.256050 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-vk7r4"] Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.258494 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4"] Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.272539 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrxwz\" (UniqueName: \"kubernetes.io/projected/7d27543e-df10-41f7-be85-dfe319aaec8a-kube-api-access-nrxwz\") pod \"catalog-operator-68c6474976-qtdtw\" (UID: \"7d27543e-df10-41f7-be85-dfe319aaec8a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.288614 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-285rt\" (UniqueName: \"kubernetes.io/projected/22cb0051-a6f4-4790-b51c-3da149327edd-kube-api-access-285rt\") pod \"ingress-canary-2ps4n\" (UID: \"22cb0051-a6f4-4790-b51c-3da149327edd\") " pod="openshift-ingress-canary/ingress-canary-2ps4n" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.292314 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t4j9v"] Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.303939 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xxjrs"] Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.315313 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7p5d\" (UniqueName: \"kubernetes.io/projected/aa1ed4c8-e4bd-4352-bee3-404f16244ea3-kube-api-access-c7p5d\") pod \"auto-csr-approver-29556838-h7pkr\" (UID: \"aa1ed4c8-e4bd-4352-bee3-404f16244ea3\") " pod="openshift-infra/auto-csr-approver-29556838-h7pkr" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.317117 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:50 crc kubenswrapper[4898]: E0313 13:59:50.317413 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:50.817401847 +0000 UTC m=+225.818990086 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.333174 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq68p\" (UniqueName: \"kubernetes.io/projected/c8d8e11d-3717-47fd-a5c6-b8f52f19147b-kube-api-access-dq68p\") pod \"machine-config-controller-84d6567774-nqrmd\" (UID: \"c8d8e11d-3717-47fd-a5c6-b8f52f19147b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nqrmd" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.342922 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xwdnn" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.347608 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9dnr\" (UniqueName: \"kubernetes.io/projected/d1df7055-9dee-4cde-a787-bc18a276b777-kube-api-access-k9dnr\") pod \"machine-config-server-mr499\" (UID: \"d1df7055-9dee-4cde-a787-bc18a276b777\") " pod="openshift-machine-config-operator/machine-config-server-mr499" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.359113 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qt7gm" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.367597 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nqrmd" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.374949 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk2f8\" (UniqueName: \"kubernetes.io/projected/dfaa00dc-cff6-47b5-878f-886fab80071b-kube-api-access-fk2f8\") pod \"service-ca-operator-777779d784-cvbms\" (UID: \"dfaa00dc-cff6-47b5-878f-886fab80071b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-cvbms" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.376645 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-98qz7" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.392315 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-xglpf" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.398172 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbnpf\" (UniqueName: \"kubernetes.io/projected/0a78868f-1786-430d-8df8-18bb1c2019b3-kube-api-access-rbnpf\") pod \"marketplace-operator-79b997595-p8r99\" (UID: \"0a78868f-1786-430d-8df8-18bb1c2019b3\") " pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.398566 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-m2ntx"] Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.399510 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-cx59b"] Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.402322 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.408298 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-z5vf2"] Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.412840 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqqxt\" (UniqueName: \"kubernetes.io/projected/c8b0b1cf-022c-4181-a957-2f7e172a3294-kube-api-access-cqqxt\") pod \"olm-operator-6b444d44fb-k6lrz\" (UID: \"c8b0b1cf-022c-4181-a957-2f7e172a3294\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.417624 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-djn5q"] Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.417993 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:50 crc kubenswrapper[4898]: E0313 13:59:50.418383 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:50.918365466 +0000 UTC m=+225.919953705 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.428101 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad924960-c3fd-4412-9b39-0723a598d86d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-mrr5j\" (UID: \"ad924960-c3fd-4412-9b39-0723a598d86d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mrr5j" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.432014 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.432438 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.446731 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cvbms" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.453724 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9zcr\" (UniqueName: \"kubernetes.io/projected/8675f0f0-7d3b-41d9-959e-e73f78f32c5c-kube-api-access-t9zcr\") pod \"package-server-manager-789f6589d5-x4nxn\" (UID: \"8675f0f0-7d3b-41d9-959e-e73f78f32c5c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x4nxn" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.454705 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mrr5j" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.466060 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zr5q\" (UniqueName: \"kubernetes.io/projected/f794406f-fc28-4e2f-953d-ab45e36cc754-kube-api-access-9zr5q\") pod \"migrator-59844c95c7-c6rz7\" (UID: \"f794406f-fc28-4e2f-953d-ab45e36cc754\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c6rz7" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.477544 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556838-h7pkr" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.486466 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.487487 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9969w\" (UniqueName: \"kubernetes.io/projected/41000ce4-1a84-44de-b283-1fe0350b1c17-kube-api-access-9969w\") pod \"csi-hostpathplugin-rd22p\" (UID: \"41000ce4-1a84-44de-b283-1fe0350b1c17\") " pod="hostpath-provisioner/csi-hostpathplugin-rd22p" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.491359 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-rd22p" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.507670 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hqcs6" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.515279 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6456920b-69b3-4ce9-9eaa-ad8e0fde2aa4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-lwzpn\" (UID: \"6456920b-69b3-4ce9-9eaa-ad8e0fde2aa4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lwzpn" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.515521 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-mr499" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.520253 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:50 crc kubenswrapper[4898]: E0313 13:59:50.520595 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:51.020580695 +0000 UTC m=+226.022168934 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.521721 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2ps4n" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.529744 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/66ebc90f-88a0-476c-98d6-c595517196b3-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-xjx7v\" (UID: \"66ebc90f-88a0-476c-98d6-c595517196b3\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjx7v" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.530294 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2"] Mar 13 13:59:50 crc kubenswrapper[4898]: W0313 13:59:50.570275 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c7e70de_de85_421c_aaeb_476450d8e0ee.slice/crio-c742ce810138484734e421760d4b718ccb00b7e51da4d364477ac71eaf59de83 WatchSource:0}: Error finding container c742ce810138484734e421760d4b718ccb00b7e51da4d364477ac71eaf59de83: Status 404 returned error can't find the container with id c742ce810138484734e421760d4b718ccb00b7e51da4d364477ac71eaf59de83 Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.593483 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pvbpt"] Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.611498 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-6plhg" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.619060 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 13:59:50 crc kubenswrapper[4898]: 
[-]has-synced failed: reason withheld Mar 13 13:59:50 crc kubenswrapper[4898]: [+]process-running ok Mar 13 13:59:50 crc kubenswrapper[4898]: healthz check failed Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.619130 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.621581 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:50 crc kubenswrapper[4898]: E0313 13:59:50.622174 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:51.122153388 +0000 UTC m=+226.123741627 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.627036 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjx7v" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.636291 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-7l2pm"] Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.642087 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lwzpn" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.666157 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c6rz7" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.677306 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-z5r8j"] Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.683910 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.711541 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x4nxn" Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.717445 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r"] Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.717489 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qt7gm"] Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.723304 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:50 crc kubenswrapper[4898]: E0313 13:59:50.727579 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:51.227545372 +0000 UTC m=+226.229133611 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:50 crc kubenswrapper[4898]: W0313 13:59:50.742890 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4372b422_23c7_46bc_aec4_aef665acbda1.slice/crio-ee22c587d8deac6e60488ead6927720218bbd5dcec6d2870cf91f10d1559c75e WatchSource:0}: Error finding container ee22c587d8deac6e60488ead6927720218bbd5dcec6d2870cf91f10d1559c75e: Status 404 returned error can't find the container with id ee22c587d8deac6e60488ead6927720218bbd5dcec6d2870cf91f10d1559c75e Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.822199 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-nqrmd"] Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.829422 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:50 crc kubenswrapper[4898]: E0313 13:59:50.829549 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:51.329531246 +0000 UTC m=+226.331119485 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.829715 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:50 crc kubenswrapper[4898]: E0313 13:59:50.830024 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:51.330013007 +0000 UTC m=+226.331601246 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.872563 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xwdnn"] Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.930274 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz"] Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.930314 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:50 crc kubenswrapper[4898]: E0313 13:59:50.930438 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:51.430415742 +0000 UTC m=+226.432003981 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.930607 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:50 crc kubenswrapper[4898]: E0313 13:59:50.931040 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:51.431024887 +0000 UTC m=+226.432613126 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.959920 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qt7gm" event={"ID":"6444bf97-84ef-49df-afcd-4e939a5de2ad","Type":"ContainerStarted","Data":"85585555f05f37157815eee5486f1b53c522c569866b3a4a908126f95eccb25f"} Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.970716 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-98qz7"] Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.972656 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xxjrs" event={"ID":"cfd8810f-79f1-4634-9e4d-245348fba016","Type":"ContainerStarted","Data":"38c5e00ca050df6338a3ac23b3ae7f41a44bcf7f34981305589959f66bacb4e1"} Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.972710 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xxjrs" event={"ID":"cfd8810f-79f1-4634-9e4d-245348fba016","Type":"ContainerStarted","Data":"9f2f67c90828e74895837ab966c82bca169abd9d449ed4e19462c03b31540ac5"} Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.974617 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rx7x" 
event={"ID":"a2ea84ca-5ca6-432a-aa7d-c6350e0e52e8","Type":"ContainerStarted","Data":"40db5ead3f35fd7e9f31497cd14e9da8aacbec45b14d38854af9198357e129f3"} Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.974645 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rx7x" event={"ID":"a2ea84ca-5ca6-432a-aa7d-c6350e0e52e8","Type":"ContainerStarted","Data":"54669a4eb32ce5f819f2ab3e41eeb9e584dc1c1637b7ae1b674adabfa2f3697e"} Mar 13 13:59:50 crc kubenswrapper[4898]: I0313 13:59:50.995183 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd"] Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.001065 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5jp6r" event={"ID":"a402522c-e891-477d-a2cc-5aa7c6944e06","Type":"ContainerStarted","Data":"b71d37a9b192a568417ada3e35a3e4ffd2df69d4e7eb13179c9c41cb0c662f84"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.001102 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5jp6r" event={"ID":"a402522c-e891-477d-a2cc-5aa7c6944e06","Type":"ContainerStarted","Data":"8c08024d3c0de35ea9f364b67796b99f75c6db15f5a6f92c9b72d29958091ed3"} Mar 13 13:59:51 crc kubenswrapper[4898]: W0313 13:59:51.001872 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8d8e11d_3717_47fd_a5c6_b8f52f19147b.slice/crio-4e827400971be56892c4197993ef9f19ebab3fc7960d56bb3ff4253587e132b4 WatchSource:0}: Error finding container 4e827400971be56892c4197993ef9f19ebab3fc7960d56bb3ff4253587e132b4: Status 404 returned error can't find the container with id 4e827400971be56892c4197993ef9f19ebab3fc7960d56bb3ff4253587e132b4 Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 
13:59:51.004911 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h" event={"ID":"9f5a2d7c-1d38-4e82-89e0-039d0f515ac6","Type":"ContainerStarted","Data":"f1c032477911f00a8677de891818a57f20a171c3f57edeaa089ea6f30ea56258"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.004945 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h" event={"ID":"9f5a2d7c-1d38-4e82-89e0-039d0f515ac6","Type":"ContainerStarted","Data":"8d6f6a844f1f050f625f8cbe23c894e4c729aaad2381b89fd751edb96ef80439"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.019668 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4" event={"ID":"1607f924-1e24-4848-b811-21ac3a7f8999","Type":"ContainerStarted","Data":"4b7bee413921e7c7eec939e46b8977335a1baead35b676d6082b18bb10857c3e"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.019705 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4" event={"ID":"1607f924-1e24-4848-b811-21ac3a7f8999","Type":"ContainerStarted","Data":"ea436643cbf70a5d9613211ff3822168c4bccbc29674c27f320c9a9a2e6fcdbf"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.020148 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4" Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.025010 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" event={"ID":"1a01ab05-7178-48c7-892b-b91cf60432f8","Type":"ContainerStarted","Data":"73126ce78a1c88f42bd877eae252c360d01ab5e1603e33c3d7b203df67250a9e"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.031509 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:51 crc kubenswrapper[4898]: E0313 13:59:51.031926 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:51.531906174 +0000 UTC m=+226.533494403 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.034746 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-kgnxj" event={"ID":"eedd2260-f339-4e2f-83e8-13a56cee2ce6","Type":"ContainerStarted","Data":"3b8a8b92fcb6d2069c22b60f0771b1cd2665489593ac838d1c6a3929d94d05cf"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.034782 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-kgnxj" event={"ID":"eedd2260-f339-4e2f-83e8-13a56cee2ce6","Type":"ContainerStarted","Data":"700c6042be9e525e8749e066e5d1abd797663a173007f1872c37d99868a2e4ae"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.035115 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-kgnxj" Mar 13 13:59:51 crc 
kubenswrapper[4898]: I0313 13:59:51.038471 4898 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-gwvk4 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.038516 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4" podUID="1607f924-1e24-4848-b811-21ac3a7f8999" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.038936 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t4j9v" event={"ID":"22f99dde-8f14-4e43-af7d-fe6e5ec2a908","Type":"ContainerStarted","Data":"25cc4ffc6f33a805f22b0eea26475a9975c83c06bae94a8b4f47b778d0824cd3"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.038969 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t4j9v" event={"ID":"22f99dde-8f14-4e43-af7d-fe6e5ec2a908","Type":"ContainerStarted","Data":"c952792a9aea146bfa3cac8b34ea6e3871545424014dd47e14dc032521c2bd37"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.042659 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" event={"ID":"b26a4d77-f170-467e-ad96-4741cc5a8f23","Type":"ContainerStarted","Data":"2d5714977afe363a0af3e9631742fe59289a173d68823730a2b889e9e03736c1"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.048971 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" 
event={"ID":"4372b422-23c7-46bc-aec4-aef665acbda1","Type":"ContainerStarted","Data":"ee22c587d8deac6e60488ead6927720218bbd5dcec6d2870cf91f10d1559c75e"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.050165 4898 patch_prober.go:28] interesting pod/console-operator-58897d9998-kgnxj container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.050227 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-kgnxj" podUID="eedd2260-f339-4e2f-83e8-13a56cee2ce6" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" Mar 13 13:59:51 crc kubenswrapper[4898]: W0313 13:59:51.062379 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8b0b1cf_022c_4181_a957_2f7e172a3294.slice/crio-34583260d4b9f0b9859c8abe64e8c439dab719ff163e6f9eeb1d0488f02e5215 WatchSource:0}: Error finding container 34583260d4b9f0b9859c8abe64e8c439dab719ff163e6f9eeb1d0488f02e5215: Status 404 returned error can't find the container with id 34583260d4b9f0b9859c8abe64e8c439dab719ff163e6f9eeb1d0488f02e5215 Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.074314 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-m2ntx" event={"ID":"071d8651-2a2d-4eed-9023-cfe636be09a0","Type":"ContainerStarted","Data":"e37479bda13beaf24311527eafecb516274243b704d6fb5081a97097ad473802"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.078451 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw"] Mar 13 13:59:51 crc 
kubenswrapper[4898]: I0313 13:59:51.103054 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-whtgq" event={"ID":"096d3786-85e8-4fe5-82b3-57cd1be251a1","Type":"ContainerStarted","Data":"4b483eb3df207667ce0ef08ecc9d5f04b00b1a650240b528c865cd0dd84c46c9"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.103092 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-whtgq" event={"ID":"096d3786-85e8-4fe5-82b3-57cd1be251a1","Type":"ContainerStarted","Data":"126fe2722619d1ccfd5aab732a8d0103c0af471c906f8945e4a069432feb1124"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.126287 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-z5r8j" event={"ID":"d1225410-7280-4409-8934-c6766eae5088","Type":"ContainerStarted","Data":"33593c468815b9c8e9f15ac21cf80464c097a31540fa561b5bb4e34d82d97c79"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.135101 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:51 crc kubenswrapper[4898]: E0313 13:59:51.136843 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:51.636828937 +0000 UTC m=+226.638417176 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.138657 4898 ???:1] "http: TLS handshake error from 192.168.126.11:44062: no serving certificate available for the kubelet" Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.149127 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-cx59b" event={"ID":"f4f26c0f-992a-4eb4-86d2-58e42a5b2b68","Type":"ContainerStarted","Data":"4ab7d6b85e3099e796ecabfa9fc2660711ba6296b5f8685afad4dadd2df7e123"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.154669 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9vdwm" event={"ID":"ade74420-c7a1-4b89-b6c8-7970d7b6c17c","Type":"ContainerStarted","Data":"ffd655709f699fd50969931ddb740298c0099a8571dae80aacbc1b11abed3487"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.163861 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vk7r4" event={"ID":"1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8","Type":"ContainerStarted","Data":"c5207dc2392a3957a9d88bb0f82c2a6de94b77565a714483cee8bf8ce748578b"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.163921 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vk7r4" event={"ID":"1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8","Type":"ContainerStarted","Data":"fef57ee8d21a36c6a5d2018e3226c94863ce917c74665f33796c13791d23df10"} Mar 13 13:59:51 crc 
kubenswrapper[4898]: I0313 13:59:51.163936 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vk7r4" event={"ID":"1bf3837f-dac8-49c7-9ac2-f5d6d0a087b8","Type":"ContainerStarted","Data":"6bbe0d469a3994ecb4a46add078ab4d0aa025076f703ba20434e200fc6c06c81"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.184124 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-7l2pm" event={"ID":"0ea2e803-34d0-429b-b943-ece0b9e38b63","Type":"ContainerStarted","Data":"f80f9d0a69e3b6c8de8df5e105815c2ea6a5c4fed2a8e106511494e31c10c8bf"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.186268 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" event={"ID":"9c7e70de-de85-421c-aaeb-476450d8e0ee","Type":"ContainerStarted","Data":"c742ce810138484734e421760d4b718ccb00b7e51da4d364477ac71eaf59de83"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.190373 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" event={"ID":"f446713d-03e3-461f-989f-eb6bdef32b30","Type":"ContainerStarted","Data":"3243f08e88e10cd81ef2c04e13602c3855a5f11779515760120bd35a4d40801a"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.197235 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" event={"ID":"6f12557e-02f5-4445-988f-b19f16672e3b","Type":"ContainerStarted","Data":"04b36de235cccfc39b10d95f29d1b8eb5397d41b37472245250afa44100523ba"} Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.227691 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mrr5j"] Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.234965 4898 ???:1] "http: TLS handshake error from 192.168.126.11:44074: 
no serving certificate available for the kubelet" Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.238615 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:51 crc kubenswrapper[4898]: E0313 13:59:51.239554 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:51.739535588 +0000 UTC m=+226.741123827 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.335313 4898 ???:1] "http: TLS handshake error from 192.168.126.11:44082: no serving certificate available for the kubelet" Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.349718 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:51 crc kubenswrapper[4898]: E0313 13:59:51.350875 4898 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:51.850843922 +0000 UTC m=+226.852432221 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.430534 4898 ???:1] "http: TLS handshake error from 192.168.126.11:44086: no serving certificate available for the kubelet" Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.454371 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:51 crc kubenswrapper[4898]: E0313 13:59:51.454818 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:51.954802622 +0000 UTC m=+226.956390851 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.534238 4898 ???:1] "http: TLS handshake error from 192.168.126.11:44092: no serving certificate available for the kubelet" Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.556775 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:51 crc kubenswrapper[4898]: E0313 13:59:51.557170 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:52.057158504 +0000 UTC m=+227.058746733 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.630495 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 13:59:51 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld Mar 13 13:59:51 crc kubenswrapper[4898]: [+]process-running ok Mar 13 13:59:51 crc kubenswrapper[4898]: healthz check failed Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.630860 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.686503 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:51 crc kubenswrapper[4898]: E0313 13:59:51.686882 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-13 13:59:52.186865239 +0000 UTC m=+227.188453478 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.689649 4898 ???:1] "http: TLS handshake error from 192.168.126.11:44096: no serving certificate available for the kubelet" Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.749974 4898 ???:1] "http: TLS handshake error from 192.168.126.11:44106: no serving certificate available for the kubelet" Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.789586 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:51 crc kubenswrapper[4898]: E0313 13:59:51.790138 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:52.290110662 +0000 UTC m=+227.291698901 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.894846 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:51 crc kubenswrapper[4898]: E0313 13:59:51.895199 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:52.395182399 +0000 UTC m=+227.396770638 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.916009 4898 ???:1] "http: TLS handshake error from 192.168.126.11:44112: no serving certificate available for the kubelet" Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.920379 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-xglpf"] Mar 13 13:59:51 crc kubenswrapper[4898]: I0313 13:59:51.944265 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-cvbms"] Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:51.997689 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:52 crc kubenswrapper[4898]: E0313 13:59:51.997972 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:52.497961501 +0000 UTC m=+227.499549740 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.002602 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-rd22p"] Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.045998 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8"] Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.086726 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556838-h7pkr"] Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.100739 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:52 crc kubenswrapper[4898]: E0313 13:59:52.100994 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:52.600968809 +0000 UTC m=+227.602557048 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.149913 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.169242 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjx7v"] Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.179782 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hqcs6"] Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.195402 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-kgnxj" podStartSLOduration=161.195379261 podStartE2EDuration="2m41.195379261s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:52.173072169 +0000 UTC m=+227.174660408" watchObservedRunningTime="2026-03-13 13:59:52.195379261 +0000 UTC m=+227.196967500" Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.208385 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:52 crc kubenswrapper[4898]: E0313 13:59:52.208795 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:52.708782111 +0000 UTC m=+227.710370350 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.209312 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4" podStartSLOduration=161.209297543 podStartE2EDuration="2m41.209297543s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:52.208105045 +0000 UTC m=+227.209693294" watchObservedRunningTime="2026-03-13 13:59:52.209297543 +0000 UTC m=+227.210885782" Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.219642 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" event={"ID":"c8b0b1cf-022c-4181-a957-2f7e172a3294","Type":"ContainerStarted","Data":"34583260d4b9f0b9859c8abe64e8c439dab719ff163e6f9eeb1d0488f02e5215"} Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.239414 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" event={"ID":"7667c5a1-aecb-4ccd-b8fd-e20c2c049472","Type":"ContainerStarted","Data":"6ce16b962cdf6095fe075410dd0e3e4b0df623a44b72871189f0b1d5d8146085"} Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.239474 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" event={"ID":"7667c5a1-aecb-4ccd-b8fd-e20c2c049472","Type":"ContainerStarted","Data":"33fd4f5f9a06222767db7dc3489718fbb676a3336f67eacb714356ad05127307"} Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.241765 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.247408 4898 generic.go:334] "Generic (PLEG): container finished" podID="9c7e70de-de85-421c-aaeb-476450d8e0ee" containerID="edc0dd60bc8dc83f763583bafafa796b1b5dd9cb2886beb15e2be0e325d957cf" exitCode=0 Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.247567 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" event={"ID":"9c7e70de-de85-421c-aaeb-476450d8e0ee","Type":"ContainerDied","Data":"edc0dd60bc8dc83f763583bafafa796b1b5dd9cb2886beb15e2be0e325d957cf"} Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.252275 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vk7r4" podStartSLOduration=161.252254608 podStartE2EDuration="2m41.252254608s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:52.251573902 +0000 UTC m=+227.253162171" watchObservedRunningTime="2026-03-13 13:59:52.252254608 +0000 UTC m=+227.253842847" Mar 13 13:59:52 crc 
kubenswrapper[4898]: I0313 13:59:52.271993 4898 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-x85vd container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:5443/healthz\": dial tcp 10.217.0.29:5443: connect: connection refused" start-of-body= Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.272027 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" podUID="7667c5a1-aecb-4ccd-b8fd-e20c2c049472" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.29:5443/healthz\": dial tcp 10.217.0.29:5443: connect: connection refused" Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.302256 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-xxjrs" podStartSLOduration=161.302240681 podStartE2EDuration="2m41.302240681s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:52.288132934 +0000 UTC m=+227.289721173" watchObservedRunningTime="2026-03-13 13:59:52.302240681 +0000 UTC m=+227.303828920" Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.302664 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-c6rz7"] Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.302699 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cvbms" event={"ID":"dfaa00dc-cff6-47b5-878f-886fab80071b","Type":"ContainerStarted","Data":"0aa2002536ab789e03e60f2e703aea476046c489cae1f4fd2e903b848b7d698a"} Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.309541 4898 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:52 crc kubenswrapper[4898]: E0313 13:59:52.310184 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:52.81016519 +0000 UTC m=+227.811753429 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.324610 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nqrmd" event={"ID":"c8d8e11d-3717-47fd-a5c6-b8f52f19147b","Type":"ContainerStarted","Data":"b7cd43b9b282819ed33d91bde36d4d25f251bb880f318471fa92ff31a8352b62"} Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.324858 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nqrmd" event={"ID":"c8d8e11d-3717-47fd-a5c6-b8f52f19147b","Type":"ContainerStarted","Data":"4e827400971be56892c4197993ef9f19ebab3fc7960d56bb3ff4253587e132b4"} Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.325862 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8" event={"ID":"f52c1025-32e7-4eba-8af4-5c5cce1918da","Type":"ContainerStarted","Data":"423e6617bc8cc210d91629b1d5580f9b4f8c3137b80892ad74315408fb41680c"} Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.341645 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xwdnn" event={"ID":"2e6be656-c448-4b38-b5a8-2401ab767c54","Type":"ContainerStarted","Data":"9a11bce5f2e92e273dac618f990097c73c4a96c29ab227dcb58f4558e7ec86a8"} Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.342926 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-98qz7" event={"ID":"9af6faad-479e-481b-9f66-d074c1c20ce8","Type":"ContainerStarted","Data":"76978f1451b566f71fc2abbb7c343f7a61a83b6035e763754114ff95e11eb8d2"} Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.343248 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5jp6r" podStartSLOduration=162.343234429 podStartE2EDuration="2m42.343234429s" podCreationTimestamp="2026-03-13 13:57:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:52.340928104 +0000 UTC m=+227.342516353" watchObservedRunningTime="2026-03-13 13:59:52.343234429 +0000 UTC m=+227.344822668" Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.346434 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rx7x" event={"ID":"a2ea84ca-5ca6-432a-aa7d-c6350e0e52e8","Type":"ContainerStarted","Data":"23e98bcd837ca0cb8249dcc8ed51cabc9953f3983b564532cc25114bc763ae32"} Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.364238 4898 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" event={"ID":"4372b422-23c7-46bc-aec4-aef665acbda1","Type":"ContainerStarted","Data":"4b1fbbf5e660b92a0c53b78de1566661ef67eefec09d4d3752584910bdc778c9"} Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.365337 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.367644 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x4nxn"] Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.385475 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t4j9v" podStartSLOduration=161.385460256 podStartE2EDuration="2m41.385460256s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:52.384202316 +0000 UTC m=+227.385790555" watchObservedRunningTime="2026-03-13 13:59:52.385460256 +0000 UTC m=+227.387048495" Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.397047 4898 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-pvbpt container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.397106 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" podUID="4372b422-23c7-46bc-aec4-aef665acbda1" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 
10.217.0.16:8443: connect: connection refused" Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.414555 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556838-h7pkr" event={"ID":"aa1ed4c8-e4bd-4352-bee3-404f16244ea3","Type":"ContainerStarted","Data":"a7c95316f7425af660b27292d15441df4c43eb94ff232136664abdd1d5a272eb"} Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.419578 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:52 crc kubenswrapper[4898]: E0313 13:59:52.420434 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:52.92042056 +0000 UTC m=+227.922008799 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.422418 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-xglpf" event={"ID":"55a96934-e740-402f-b4af-488a7eba53ae","Type":"ContainerStarted","Data":"3b3138220a23e9803eb1ca1a6b020a7b0dd6879719bc2fbe404eff2fe4efc939"} Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.449354 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-mr499" event={"ID":"d1df7055-9dee-4cde-a787-bc18a276b777","Type":"ContainerStarted","Data":"7bd79891274d50826bbb04d76734841d3066293f61a67dc86d7cbbdd1dc480d3"} Mar 13 13:59:52 crc kubenswrapper[4898]: W0313 13:59:52.466056 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8675f0f0_7d3b_41d9_959e_e73f78f32c5c.slice/crio-f535ac1d451a11e6e21776c440e07c6e0112ff22847a6ab7095f38893121a79e WatchSource:0}: Error finding container f535ac1d451a11e6e21776c440e07c6e0112ff22847a6ab7095f38893121a79e: Status 404 returned error can't find the container with id f535ac1d451a11e6e21776c440e07c6e0112ff22847a6ab7095f38893121a79e Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.472003 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hqcs6" event={"ID":"a0416bba-76e3-4312-94e0-ac5b77c6ace0","Type":"ContainerStarted","Data":"8561c0bc1283dbf759bd31dd3f74d1e87fff5cd64a0ea3c288390b3b3b05c12c"} Mar 13 13:59:52 crc 
kubenswrapper[4898]: I0313 13:59:52.479581 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mrr5j" event={"ID":"ad924960-c3fd-4412-9b39-0723a598d86d","Type":"ContainerStarted","Data":"876f10ccbfb56710ccdb89c75bfa8467316fa88c4af75dc2b1cc0d366ea21ebb"} Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.480157 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-6plhg" podStartSLOduration=161.480147285 podStartE2EDuration="2m41.480147285s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:52.438305917 +0000 UTC m=+227.439894156" watchObservedRunningTime="2026-03-13 13:59:52.480147285 +0000 UTC m=+227.481735524" Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.482622 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-whtgq" podStartSLOduration=161.482613634 podStartE2EDuration="2m41.482613634s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:52.479254564 +0000 UTC m=+227.480842813" watchObservedRunningTime="2026-03-13 13:59:52.482613634 +0000 UTC m=+227.484201873" Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.484016 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2ps4n"] Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.502491 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rd22p" 
event={"ID":"41000ce4-1a84-44de-b283-1fe0350b1c17","Type":"ContainerStarted","Data":"5d7df65ed10cc02d6a22524199f9b24947355bb7c601bfddcebc3b5b19e1aa8f"} Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.503375 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p8r99"] Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.513318 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-t2s2h" podStartSLOduration=161.513301376 podStartE2EDuration="2m41.513301376s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:52.512850425 +0000 UTC m=+227.514438674" watchObservedRunningTime="2026-03-13 13:59:52.513301376 +0000 UTC m=+227.514889605" Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.513578 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lwzpn"] Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.518432 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" event={"ID":"7d27543e-df10-41f7-be85-dfe319aaec8a","Type":"ContainerStarted","Data":"13bb133bf7096639e00c031dd9fdfa46c4c4b70040b20821c04b022084397e7f"} Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.520553 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-cx59b" event={"ID":"f4f26c0f-992a-4eb4-86d2-58e42a5b2b68","Type":"ContainerStarted","Data":"68f2d5a5e0268af93ce98f3742986642ab5cf81659e9919ba09dd20dc1ddf9a9"} Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.521433 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-cx59b" Mar 13 13:59:52 crc 
kubenswrapper[4898]: I0313 13:59:52.522609 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:52 crc kubenswrapper[4898]: E0313 13:59:52.524535 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:53.024517804 +0000 UTC m=+228.026106043 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.551706 4898 patch_prober.go:28] interesting pod/downloads-7954f5f757-cx59b container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.552015 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-cx59b" podUID="f4f26c0f-992a-4eb4-86d2-58e42a5b2b68" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.553082 4898 
generic.go:334] "Generic (PLEG): container finished" podID="f446713d-03e3-461f-989f-eb6bdef32b30" containerID="240733896e8454525dba9569b24980e27aade8613f126f3a63438c9e9d8e7534" exitCode=0 Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.553337 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" event={"ID":"f446713d-03e3-461f-989f-eb6bdef32b30","Type":"ContainerDied","Data":"240733896e8454525dba9569b24980e27aade8613f126f3a63438c9e9d8e7534"} Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.554406 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-9vdwm" podStartSLOduration=162.554386546 podStartE2EDuration="2m42.554386546s" podCreationTimestamp="2026-03-13 13:57:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:52.552581113 +0000 UTC m=+227.554169352" watchObservedRunningTime="2026-03-13 13:59:52.554386546 +0000 UTC m=+227.555974785" Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.566744 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjx7v" event={"ID":"66ebc90f-88a0-476c-98d6-c595517196b3","Type":"ContainerStarted","Data":"68420b522d54c9a1d3e18efff552a67f7a9121eca607b63a478f1c66c056daff"} Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.578711 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" podStartSLOduration=162.578693506 podStartE2EDuration="2m42.578693506s" podCreationTimestamp="2026-03-13 13:57:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:52.577110378 +0000 UTC m=+227.578698627" 
watchObservedRunningTime="2026-03-13 13:59:52.578693506 +0000 UTC m=+227.580281745" Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.591298 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4" Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.628330 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 13:59:52 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld Mar 13 13:59:52 crc kubenswrapper[4898]: [+]process-running ok Mar 13 13:59:52 crc kubenswrapper[4898]: healthz check failed Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.628394 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.628653 4898 ???:1] "http: TLS handshake error from 192.168.126.11:44122: no serving certificate available for the kubelet" Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.634866 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:52 crc kubenswrapper[4898]: E0313 13:59:52.639323 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-03-13 13:59:53.139308352 +0000 UTC m=+228.140896591 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:52 crc kubenswrapper[4898]: W0313 13:59:52.676862 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6456920b_69b3_4ce9_9eaa_ad8e0fde2aa4.slice/crio-2160c8e1fb1829a683300b17998f45a9fce220d710c9e5e5237b085cb28cf8ea WatchSource:0}: Error finding container 2160c8e1fb1829a683300b17998f45a9fce220d710c9e5e5237b085cb28cf8ea: Status 404 returned error can't find the container with id 2160c8e1fb1829a683300b17998f45a9fce220d710c9e5e5237b085cb28cf8ea Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.687941 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" podStartSLOduration=161.687920322 podStartE2EDuration="2m41.687920322s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:52.629657362 +0000 UTC m=+227.631245611" watchObservedRunningTime="2026-03-13 13:59:52.687920322 +0000 UTC m=+227.689508561" Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.713978 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8rx7x" podStartSLOduration=161.713960713 
podStartE2EDuration="2m41.713960713s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:52.689659824 +0000 UTC m=+227.691248073" watchObservedRunningTime="2026-03-13 13:59:52.713960713 +0000 UTC m=+227.715548952" Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.735974 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:52 crc kubenswrapper[4898]: E0313 13:59:52.737127 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:53.237087865 +0000 UTC m=+228.238676104 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.749506 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" podStartSLOduration=161.749486951 podStartE2EDuration="2m41.749486951s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:52.745183578 +0000 UTC m=+227.746771817" watchObservedRunningTime="2026-03-13 13:59:52.749486951 +0000 UTC m=+227.751075190" Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.770639 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-cx59b" podStartSLOduration=161.770620065 podStartE2EDuration="2m41.770620065s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:52.770365739 +0000 UTC m=+227.771953988" watchObservedRunningTime="2026-03-13 13:59:52.770620065 +0000 UTC m=+227.772208304" Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.801066 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-mr499" podStartSLOduration=5.801048971 podStartE2EDuration="5.801048971s" podCreationTimestamp="2026-03-13 13:59:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:52.79891171 +0000 UTC m=+227.800499969" watchObservedRunningTime="2026-03-13 13:59:52.801048971 +0000 UTC m=+227.802637210" Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.838256 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:52 crc kubenswrapper[4898]: E0313 13:59:52.840601 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:53.340585564 +0000 UTC m=+228.342173803 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.869002 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-kgnxj" Mar 13 13:59:52 crc kubenswrapper[4898]: I0313 13:59:52.947221 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:52 crc kubenswrapper[4898]: E0313 13:59:52.947608 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:53.447593017 +0000 UTC m=+228.449181256 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.048593 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:53 crc kubenswrapper[4898]: E0313 13:59:53.049227 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:53.549196781 +0000 UTC m=+228.550785020 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.150841 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:53 crc kubenswrapper[4898]: E0313 13:59:53.151508 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:53.651476962 +0000 UTC m=+228.653065211 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.151619 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:53 crc kubenswrapper[4898]: E0313 13:59:53.152495 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:53.652487056 +0000 UTC m=+228.654075295 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.253061 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:53 crc kubenswrapper[4898]: E0313 13:59:53.253367 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:53.753343242 +0000 UTC m=+228.754931481 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.262768 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pvbpt"] Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.303312 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4"] Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.354691 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:53 crc kubenswrapper[4898]: E0313 13:59:53.355162 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:53.855144671 +0000 UTC m=+228.856732980 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.455845 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:53 crc kubenswrapper[4898]: E0313 13:59:53.456221 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:53.956184601 +0000 UTC m=+228.957772840 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.557648 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:53 crc kubenswrapper[4898]: E0313 13:59:53.558205 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:54.05800274 +0000 UTC m=+229.059590969 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.614766 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 13:59:53 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld Mar 13 13:59:53 crc kubenswrapper[4898]: [+]process-running ok Mar 13 13:59:53 crc kubenswrapper[4898]: healthz check failed Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.615051 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.642503 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xwdnn" event={"ID":"2e6be656-c448-4b38-b5a8-2401ab767c54","Type":"ContainerStarted","Data":"a1c32abd97dd846ca3e993c5a91ad71bd349858f81e83b81092a61a9354172a2"} Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.645385 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8" 
event={"ID":"f52c1025-32e7-4eba-8af4-5c5cce1918da","Type":"ContainerStarted","Data":"f1f6a6de92d72265f3f98ec19ac9f23972e5558307ce72451eab813ae76ff6a4"} Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.654072 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2ps4n" event={"ID":"22cb0051-a6f4-4790-b51c-3da149327edd","Type":"ContainerStarted","Data":"59364e2f79d4987c7022bfb3575d49bc1b133d0063ed88f1f04f0b961f1501d9"} Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.654124 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2ps4n" event={"ID":"22cb0051-a6f4-4790-b51c-3da149327edd","Type":"ContainerStarted","Data":"10eb5bf6ae08d26f1b282a1735794b2c96f9f7f0e680a841fecd1ac22fc191c0"} Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.660975 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:53 crc kubenswrapper[4898]: E0313 13:59:53.661369 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:54.161351256 +0000 UTC m=+229.162939495 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.665122 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xwdnn" podStartSLOduration=162.665106566 podStartE2EDuration="2m42.665106566s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:53.663863616 +0000 UTC m=+228.665451855" watchObservedRunningTime="2026-03-13 13:59:53.665106566 +0000 UTC m=+228.666694805" Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.728642 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" event={"ID":"b26a4d77-f170-467e-ad96-4741cc5a8f23","Type":"ContainerStarted","Data":"e2d4415c83b3a1cbbccfa0bc4f10a48d58d464da90f0fa04517b4115ecd57886"} Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.729493 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.730395 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-2ps4n" podStartSLOduration=6.730380113 podStartE2EDuration="6.730380113s" podCreationTimestamp="2026-03-13 13:59:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:53.729683647 +0000 UTC m=+228.731271886" watchObservedRunningTime="2026-03-13 13:59:53.730380113 +0000 UTC m=+228.731968352" Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.736908 4898 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-djn5q container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.38:6443/healthz\": dial tcp 10.217.0.38:6443: connect: connection refused" start-of-body= Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.736969 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" podUID="b26a4d77-f170-467e-ad96-4741cc5a8f23" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.38:6443/healthz\": dial tcp 10.217.0.38:6443: connect: connection refused" Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.758889 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8" podStartSLOduration=162.758867783 podStartE2EDuration="2m42.758867783s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:53.756713861 +0000 UTC m=+228.758302110" watchObservedRunningTime="2026-03-13 13:59:53.758867783 +0000 UTC m=+228.760456022" Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.761090 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hqcs6" event={"ID":"a0416bba-76e3-4312-94e0-ac5b77c6ace0","Type":"ContainerStarted","Data":"d938659f66c504e3a53f10404f4cb9b8a7f81da245338b7fc04aa817f3df0247"} Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.763792 4898 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:53 crc kubenswrapper[4898]: E0313 13:59:53.765053 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:54.26503464 +0000 UTC m=+229.266622879 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.774800 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" event={"ID":"c8b0b1cf-022c-4181-a957-2f7e172a3294","Type":"ContainerStarted","Data":"533f9c0d5d010c2b5018f71aab66fa3025d2e9a7d36d33fe0f8634b0b2bbb7ec"} Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.775755 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.808751 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.845209 4898 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" podStartSLOduration=163.845196142 podStartE2EDuration="2m43.845196142s" podCreationTimestamp="2026-03-13 13:57:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:53.844124497 +0000 UTC m=+228.845712736" watchObservedRunningTime="2026-03-13 13:59:53.845196142 +0000 UTC m=+228.846784381" Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.847281 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" event={"ID":"f446713d-03e3-461f-989f-eb6bdef32b30","Type":"ContainerStarted","Data":"3d78f2eae4339faf9da85d3605b5240a3df3c676cc009629b7c0efcf3ad06e0b"} Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.866271 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mrr5j" event={"ID":"ad924960-c3fd-4412-9b39-0723a598d86d","Type":"ContainerStarted","Data":"6101b0b070abdd17359194fe8ea3f377c21f03dac740dfad3caca468dc4d3c9e"} Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.868581 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:53 crc kubenswrapper[4898]: E0313 13:59:53.869772 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:54.369737798 +0000 UTC m=+229.371326037 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.913509 4898 generic.go:334] "Generic (PLEG): container finished" podID="1a01ab05-7178-48c7-892b-b91cf60432f8" containerID="ea0552ede8b56b12eb135cbe5901bf93d8c601f35fa5916a4dd9fe23a332df86" exitCode=0 Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.913614 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" event={"ID":"1a01ab05-7178-48c7-892b-b91cf60432f8","Type":"ContainerDied","Data":"ea0552ede8b56b12eb135cbe5901bf93d8c601f35fa5916a4dd9fe23a332df86"} Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.933195 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cvbms" event={"ID":"dfaa00dc-cff6-47b5-878f-886fab80071b","Type":"ContainerStarted","Data":"d17d5d04b082cf9325120049501e4885bddbd9a55bc9df7ea4ad2f54754d8fc5"} Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.964911 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-z5r8j" event={"ID":"d1225410-7280-4409-8934-c6766eae5088","Type":"ContainerStarted","Data":"e3d55e6af58dc24c9eb37bd3f1da62cc7fa0169219865d3d83df1e09ea590f59"} Mar 13 13:59:53 crc kubenswrapper[4898]: I0313 13:59:53.973739 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:53 crc kubenswrapper[4898]: E0313 13:59:53.974062 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:54.474051457 +0000 UTC m=+229.475639696 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.010945 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" event={"ID":"0a78868f-1786-430d-8df8-18bb1c2019b3","Type":"ContainerStarted","Data":"66a8b7ab3a08de395e71354315a06c65dae8eb185d93ccdee2c35ad093ab2e67"} Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.011192 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" event={"ID":"0a78868f-1786-430d-8df8-18bb1c2019b3","Type":"ContainerStarted","Data":"5ce4caec01bc9ee8df0b59f3f0251f9037b82e485a55597652071608caca296b"} Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.011942 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.023360 4898 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-p8r99 
container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body= Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.023413 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" podUID="0a78868f-1786-430d-8df8-18bb1c2019b3" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.058534 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" podStartSLOduration=163.058517582 podStartE2EDuration="2m43.058517582s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:53.880564596 +0000 UTC m=+228.882152845" watchObservedRunningTime="2026-03-13 13:59:54.058517582 +0000 UTC m=+229.060105821" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.061650 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" event={"ID":"7d27543e-df10-41f7-be85-dfe319aaec8a","Type":"ContainerStarted","Data":"f6246f6fbdbf725301f1e7d37222d66624c9ce34cacd75b24a3bb142ed798d64"} Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.062438 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.074571 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:54 crc kubenswrapper[4898]: E0313 13:59:54.075498 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:54.575482407 +0000 UTC m=+229.577070646 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.109923 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-98qz7" event={"ID":"9af6faad-479e-481b-9f66-d074c1c20ce8","Type":"ContainerStarted","Data":"bfffbc1d47cf562161ad401b19d456f5803628cdc3e9b93364d303051da7fdc5"} Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.109965 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-98qz7" event={"ID":"9af6faad-479e-481b-9f66-d074c1c20ce8","Type":"ContainerStarted","Data":"add114cee616c51a7db53d7f91b4b937bf93df887dadd93903309e7753a0dcc6"} Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.120032 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mrr5j" 
podStartSLOduration=163.120017719 podStartE2EDuration="2m43.120017719s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:54.057337274 +0000 UTC m=+229.058925523" watchObservedRunningTime="2026-03-13 13:59:54.120017719 +0000 UTC m=+229.121605948" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.121941 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.123730 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-mr499" event={"ID":"d1df7055-9dee-4cde-a787-bc18a276b777","Type":"ContainerStarted","Data":"132a21d8d3ccecb68f50362d993af69d42f4875b110ef71f981d6813b55a54b3"} Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.156733 4898 ???:1] "http: TLS handshake error from 192.168.126.11:44132: no serving certificate available for the kubelet" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.162114 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c6rz7" event={"ID":"f794406f-fc28-4e2f-953d-ab45e36cc754","Type":"ContainerStarted","Data":"4d3e29597b67d8aa33de474d704625dc02977b6f8f28478f0d39a71a3fa00e40"} Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.162156 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c6rz7" event={"ID":"f794406f-fc28-4e2f-953d-ab45e36cc754","Type":"ContainerStarted","Data":"1f5592600f897f3e06e6f8ad157f32be4832808e055795a6024b5a5e83426cec"} Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.162165 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c6rz7" 
event={"ID":"f794406f-fc28-4e2f-953d-ab45e36cc754","Type":"ContainerStarted","Data":"39769e11acaa214e87052d1bb5ad3d3b71dc8c697697151a19a7d0822241a4f3"} Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.167221 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" podStartSLOduration=163.167206385 podStartE2EDuration="2m43.167206385s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:54.145344293 +0000 UTC m=+229.146932532" watchObservedRunningTime="2026-03-13 13:59:54.167206385 +0000 UTC m=+229.168794614" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.177674 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:54 crc kubenswrapper[4898]: E0313 13:59:54.178504 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:54.678493704 +0000 UTC m=+229.680081943 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.180409 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qt7gm" event={"ID":"6444bf97-84ef-49df-afcd-4e939a5de2ad","Type":"ContainerStarted","Data":"e724d5addb1cd6dacb56928656961ec003b4cee18b292aff9da189cba7cb8b7f"} Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.199630 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-z5r8j" podStartSLOduration=163.199613418 podStartE2EDuration="2m43.199613418s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:54.168379953 +0000 UTC m=+229.169968192" watchObservedRunningTime="2026-03-13 13:59:54.199613418 +0000 UTC m=+229.201201657" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.200880 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-cvbms" podStartSLOduration=163.200873318 podStartE2EDuration="2m43.200873318s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:54.1984477 +0000 UTC m=+229.200035959" watchObservedRunningTime="2026-03-13 13:59:54.200873318 +0000 UTC m=+229.202461557" Mar 13 
13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.217785 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nqrmd" event={"ID":"c8d8e11d-3717-47fd-a5c6-b8f52f19147b","Type":"ContainerStarted","Data":"09fe000f8256c50a8bbef94a79acc7883412066f8bca0684a09af0e3a280eab6"} Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.249374 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lwzpn" event={"ID":"6456920b-69b3-4ce9-9eaa-ad8e0fde2aa4","Type":"ContainerStarted","Data":"2160c8e1fb1829a683300b17998f45a9fce220d710c9e5e5237b085cb28cf8ea"} Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.283789 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:54 crc kubenswrapper[4898]: E0313 13:59:54.284871 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:54.784856932 +0000 UTC m=+229.786445171 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.325194 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-7l2pm" event={"ID":"0ea2e803-34d0-429b-b943-ece0b9e38b63","Type":"ContainerStarted","Data":"5f0a71c3382b8e97b2f21cf59a246a72cf36bc90c37659a0655800a7772d93ae"} Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.327811 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nqrmd" podStartSLOduration=163.327797926 podStartE2EDuration="2m43.327797926s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:54.27722418 +0000 UTC m=+229.278812419" watchObservedRunningTime="2026-03-13 13:59:54.327797926 +0000 UTC m=+229.329386165" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.336236 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-m2ntx" event={"ID":"071d8651-2a2d-4eed-9023-cfe636be09a0","Type":"ContainerStarted","Data":"bd32fe2cef365eaaed01cc9a800c7901ae0b0ae4565b3977e2bf9b20a1257ddb"} Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.336281 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-m2ntx" 
event={"ID":"071d8651-2a2d-4eed-9023-cfe636be09a0","Type":"ContainerStarted","Data":"8a68531c4432e244d55d51c54210fb9cf9a046d91a04990d04c320d1d1ca0793"} Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.339952 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" event={"ID":"9c7e70de-de85-421c-aaeb-476450d8e0ee","Type":"ContainerStarted","Data":"ccd7403e1f9b432e946d026a2dfcc99e0f3285803db614b12fc60317b1f5fb3e"} Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.340416 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.341338 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjx7v" event={"ID":"66ebc90f-88a0-476c-98d6-c595517196b3","Type":"ContainerStarted","Data":"0f877df58d3f631f3764abf0d3113bee627f9006e57760ab88188bf736ee1928"} Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.342589 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x4nxn" event={"ID":"8675f0f0-7d3b-41d9-959e-e73f78f32c5c","Type":"ContainerStarted","Data":"f9fd46726abc0faa9f794ff1133bafe631cc262546158876056b7a4740033c49"} Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.342607 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x4nxn" event={"ID":"8675f0f0-7d3b-41d9-959e-e73f78f32c5c","Type":"ContainerStarted","Data":"f535ac1d451a11e6e21776c440e07c6e0112ff22847a6ab7095f38893121a79e"} Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.342942 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x4nxn" Mar 13 
13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.359878 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c6rz7" podStartSLOduration=163.359862641 podStartE2EDuration="2m43.359862641s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:54.327927179 +0000 UTC m=+229.329515428" watchObservedRunningTime="2026-03-13 13:59:54.359862641 +0000 UTC m=+229.361450880" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.369290 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-xglpf" event={"ID":"55a96934-e740-402f-b4af-488a7eba53ae","Type":"ContainerStarted","Data":"9c5eb8d0b347ab2a909e923065a6b311c93560fd69fbc82c9cff1576bbb6917e"} Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.376305 4898 patch_prober.go:28] interesting pod/downloads-7954f5f757-cx59b container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.376350 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-cx59b" podUID="f4f26c0f-992a-4eb4-86d2-58e42a5b2b68" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.384072 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qt7gm" podStartSLOduration=163.384055728 podStartE2EDuration="2m43.384055728s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:54.361002268 +0000 UTC m=+229.362590517" watchObservedRunningTime="2026-03-13 13:59:54.384055728 +0000 UTC m=+229.385643967" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.385312 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:54 crc kubenswrapper[4898]: E0313 13:59:54.386769 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:54.886758633 +0000 UTC m=+229.888346872 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.387168 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" podStartSLOduration=163.387157282 podStartE2EDuration="2m43.387157282s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:54.383575537 +0000 UTC m=+229.385163786" watchObservedRunningTime="2026-03-13 13:59:54.387157282 +0000 UTC m=+229.388745521" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.435256 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.442512 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-98qz7" podStartSLOduration=163.442492043 podStartE2EDuration="2m43.442492043s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:54.442470282 +0000 UTC m=+229.444058521" watchObservedRunningTime="2026-03-13 13:59:54.442492043 +0000 UTC m=+229.444080272" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.482841 4898 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-console/console-f9d7485db-7l2pm" podStartSLOduration=163.482827455 podStartE2EDuration="2m43.482827455s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:54.482266812 +0000 UTC m=+229.483855061" watchObservedRunningTime="2026-03-13 13:59:54.482827455 +0000 UTC m=+229.484415694" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.490430 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:54 crc kubenswrapper[4898]: E0313 13:59:54.490541 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:54.990525009 +0000 UTC m=+229.992113248 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.490768 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.492132 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" Mar 13 13:59:54 crc kubenswrapper[4898]: E0313 13:59:54.496795 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:54.996777088 +0000 UTC m=+229.998365407 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.529190 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lwzpn" podStartSLOduration=163.529166961 podStartE2EDuration="2m43.529166961s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:54.508791364 +0000 UTC m=+229.510379593" watchObservedRunningTime="2026-03-13 13:59:54.529166961 +0000 UTC m=+229.530755200" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.553117 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" podStartSLOduration=163.553099731 podStartE2EDuration="2m43.553099731s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:54.550572621 +0000 UTC m=+229.552160890" watchObservedRunningTime="2026-03-13 13:59:54.553099731 +0000 UTC m=+229.554687970" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.588980 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-m2ntx" podStartSLOduration=163.588963887 podStartE2EDuration="2m43.588963887s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:54.58740961 +0000 UTC m=+229.588997859" watchObservedRunningTime="2026-03-13 13:59:54.588963887 +0000 UTC m=+229.590552126" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.603522 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:54 crc kubenswrapper[4898]: E0313 13:59:54.603853 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:55.103837992 +0000 UTC m=+230.105426231 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.620629 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 13:59:54 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld Mar 13 13:59:54 crc kubenswrapper[4898]: [+]process-running ok Mar 13 13:59:54 crc kubenswrapper[4898]: healthz check failed Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.620698 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.666166 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-xjx7v" podStartSLOduration=163.666151849 podStartE2EDuration="2m43.666151849s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:54.628774207 +0000 UTC m=+229.630362446" watchObservedRunningTime="2026-03-13 13:59:54.666151849 +0000 UTC m=+229.667740088" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.696722 4898 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x4nxn" podStartSLOduration=163.696709118 podStartE2EDuration="2m43.696709118s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:54.695265513 +0000 UTC m=+229.696853752" watchObservedRunningTime="2026-03-13 13:59:54.696709118 +0000 UTC m=+229.698297357" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.697426 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-xglpf" podStartSLOduration=163.697421755 podStartE2EDuration="2m43.697421755s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:54.666228981 +0000 UTC m=+229.667817220" watchObservedRunningTime="2026-03-13 13:59:54.697421755 +0000 UTC m=+229.699009994" Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.705014 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:54 crc kubenswrapper[4898]: E0313 13:59:54.705564 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:55.205532328 +0000 UTC m=+230.207120567 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.806634 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:54 crc kubenswrapper[4898]: E0313 13:59:54.806993 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:55.306972498 +0000 UTC m=+230.308560737 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:54 crc kubenswrapper[4898]: I0313 13:59:54.908536 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:54 crc kubenswrapper[4898]: E0313 13:59:54.908860 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:55.408848788 +0000 UTC m=+230.410437027 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.010377 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:55 crc kubenswrapper[4898]: E0313 13:59:55.010733 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:55.510718258 +0000 UTC m=+230.512306497 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.111745 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:55 crc kubenswrapper[4898]: E0313 13:59:55.112065 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:55.612051446 +0000 UTC m=+230.613639685 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.212730 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:55 crc kubenswrapper[4898]: E0313 13:59:55.212891 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:55.712869371 +0000 UTC m=+230.714457610 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.212960 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:55 crc kubenswrapper[4898]: E0313 13:59:55.213274 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:55.713266421 +0000 UTC m=+230.714854650 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.313964 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:55 crc kubenswrapper[4898]: E0313 13:59:55.314148 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:55.814123217 +0000 UTC m=+230.815711456 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.314260 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:55 crc kubenswrapper[4898]: E0313 13:59:55.314557 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:55.814544867 +0000 UTC m=+230.816133156 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.396556 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x4nxn" event={"ID":"8675f0f0-7d3b-41d9-959e-e73f78f32c5c","Type":"ContainerStarted","Data":"df17383199ef22fb611b55de12da7719f69c6330cdc883aedd63b68d3ea9f8af"} Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.403215 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rd22p" event={"ID":"41000ce4-1a84-44de-b283-1fe0350b1c17","Type":"ContainerStarted","Data":"d188a49a89641237218187334ef909cdfab8bca3de66a1f8cca460d12d796c34"} Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.405250 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-lwzpn" event={"ID":"6456920b-69b3-4ce9-9eaa-ad8e0fde2aa4","Type":"ContainerStarted","Data":"9ebdc21a89c7e542370d6c291c23a4de0ed142f28fe324f890633805c6eafc8e"} Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.415119 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:55 crc kubenswrapper[4898]: E0313 13:59:55.415294 4898 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:55.91527509 +0000 UTC m=+230.916863329 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.439786 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-xglpf" event={"ID":"55a96934-e740-402f-b4af-488a7eba53ae","Type":"ContainerStarted","Data":"642f2f07742827ed95ef31168363d6545f55661dce6a87f6018188393f80a952"} Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.454582 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" event={"ID":"f446713d-03e3-461f-989f-eb6bdef32b30","Type":"ContainerStarted","Data":"f099af6c5b50a4be197d635f5850b6183333a053a6af427c07f76da96d5e7c7b"} Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.465470 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hqcs6" event={"ID":"a0416bba-76e3-4312-94e0-ac5b77c6ace0","Type":"ContainerStarted","Data":"c522c92b343ad9fc99d411c2d428c314bb922018994e10fce74dbb04fb415613"} Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.466035 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-hqcs6" Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.480165 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" event={"ID":"1a01ab05-7178-48c7-892b-b91cf60432f8","Type":"ContainerStarted","Data":"454ed8ccf9bcb2dc3908edb26a397efc55c2e238824780ec87ba7d56db753dbe"} Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.481348 4898 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-p8r99 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body= Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.481400 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" podUID="0a78868f-1786-430d-8df8-18bb1c2019b3" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.481798 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" podUID="4372b422-23c7-46bc-aec4-aef665acbda1" containerName="controller-manager" containerID="cri-o://4b1fbbf5e660b92a0c53b78de1566661ef67eefec09d4d3752584910bdc778c9" gracePeriod=30 Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.482864 4898 patch_prober.go:28] interesting pod/downloads-7954f5f757-cx59b container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.482921 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-cx59b" podUID="f4f26c0f-992a-4eb4-86d2-58e42a5b2b68" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": 
dial tcp 10.217.0.10:8080: connect: connection refused" Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.482959 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4" podUID="1607f924-1e24-4848-b811-21ac3a7f8999" containerName="route-controller-manager" containerID="cri-o://4b7bee413921e7c7eec939e46b8977335a1baead35b676d6082b18bb10857c3e" gracePeriod=30 Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.500789 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.516863 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:55 crc kubenswrapper[4898]: E0313 13:59:55.517280 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:56.017265424 +0000 UTC m=+231.018853673 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.587428 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" podStartSLOduration=164.587410287 podStartE2EDuration="2m44.587410287s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:55.572646785 +0000 UTC m=+230.574235034" watchObservedRunningTime="2026-03-13 13:59:55.587410287 +0000 UTC m=+230.588998526" Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.588954 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" podStartSLOduration=164.588947984 podStartE2EDuration="2m44.588947984s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:55.529451514 +0000 UTC m=+230.531039763" watchObservedRunningTime="2026-03-13 13:59:55.588947984 +0000 UTC m=+230.590536223" Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.608158 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-hqcs6" podStartSLOduration=8.608141012 podStartE2EDuration="8.608141012s" podCreationTimestamp="2026-03-13 13:59:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:55.606845151 +0000 UTC m=+230.608433400" watchObservedRunningTime="2026-03-13 13:59:55.608141012 +0000 UTC m=+230.609729251" Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.614968 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 13:59:55 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld Mar 13 13:59:55 crc kubenswrapper[4898]: [+]process-running ok Mar 13 13:59:55 crc kubenswrapper[4898]: healthz check failed Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.615005 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.619732 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:55 crc kubenswrapper[4898]: E0313 13:59:55.619862 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:56.119835681 +0000 UTC m=+231.121423920 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.620037 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:55 crc kubenswrapper[4898]: E0313 13:59:55.623174 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:56.12316314 +0000 UTC m=+231.124751379 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.722572 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:55 crc kubenswrapper[4898]: E0313 13:59:55.722965 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:56.222949811 +0000 UTC m=+231.224538050 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.823588 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:55 crc kubenswrapper[4898]: E0313 13:59:55.823915 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:56.323889679 +0000 UTC m=+231.325477918 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.883862 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-twh8h"] Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.884749 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-twh8h" Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.895572 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.907176 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-twh8h"] Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.927283 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.927557 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f81bcfc-3c35-48e8-a584-961351e8c0e2-catalog-content\") pod \"community-operators-twh8h\" (UID: \"8f81bcfc-3c35-48e8-a584-961351e8c0e2\") " pod="openshift-marketplace/community-operators-twh8h" Mar 
13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.927607 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f81bcfc-3c35-48e8-a584-961351e8c0e2-utilities\") pod \"community-operators-twh8h\" (UID: \"8f81bcfc-3c35-48e8-a584-961351e8c0e2\") " pod="openshift-marketplace/community-operators-twh8h" Mar 13 13:59:55 crc kubenswrapper[4898]: I0313 13:59:55.927644 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x728\" (UniqueName: \"kubernetes.io/projected/8f81bcfc-3c35-48e8-a584-961351e8c0e2-kube-api-access-5x728\") pod \"community-operators-twh8h\" (UID: \"8f81bcfc-3c35-48e8-a584-961351e8c0e2\") " pod="openshift-marketplace/community-operators-twh8h" Mar 13 13:59:55 crc kubenswrapper[4898]: E0313 13:59:55.927734 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:56.427721576 +0000 UTC m=+231.429309815 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.039778 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.039833 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f81bcfc-3c35-48e8-a584-961351e8c0e2-catalog-content\") pod \"community-operators-twh8h\" (UID: \"8f81bcfc-3c35-48e8-a584-961351e8c0e2\") " pod="openshift-marketplace/community-operators-twh8h" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.039874 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f81bcfc-3c35-48e8-a584-961351e8c0e2-utilities\") pod \"community-operators-twh8h\" (UID: \"8f81bcfc-3c35-48e8-a584-961351e8c0e2\") " pod="openshift-marketplace/community-operators-twh8h" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.039929 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x728\" (UniqueName: \"kubernetes.io/projected/8f81bcfc-3c35-48e8-a584-961351e8c0e2-kube-api-access-5x728\") pod \"community-operators-twh8h\" (UID: 
\"8f81bcfc-3c35-48e8-a584-961351e8c0e2\") " pod="openshift-marketplace/community-operators-twh8h" Mar 13 13:59:56 crc kubenswrapper[4898]: E0313 13:59:56.040195 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:56.540177329 +0000 UTC m=+231.541765568 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.040602 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f81bcfc-3c35-48e8-a584-961351e8c0e2-catalog-content\") pod \"community-operators-twh8h\" (UID: \"8f81bcfc-3c35-48e8-a584-961351e8c0e2\") " pod="openshift-marketplace/community-operators-twh8h" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.040656 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f81bcfc-3c35-48e8-a584-961351e8c0e2-utilities\") pod \"community-operators-twh8h\" (UID: \"8f81bcfc-3c35-48e8-a584-961351e8c0e2\") " pod="openshift-marketplace/community-operators-twh8h" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.066217 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dvvz2"] Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.068253 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dvvz2" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.070163 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.079763 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dvvz2"] Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.086866 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x728\" (UniqueName: \"kubernetes.io/projected/8f81bcfc-3c35-48e8-a584-961351e8c0e2-kube-api-access-5x728\") pod \"community-operators-twh8h\" (UID: \"8f81bcfc-3c35-48e8-a584-961351e8c0e2\") " pod="openshift-marketplace/community-operators-twh8h" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.140204 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.140390 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhlq2\" (UniqueName: \"kubernetes.io/projected/43acaee8-efc8-4156-b28c-b493f241ac53-kube-api-access-zhlq2\") pod \"certified-operators-dvvz2\" (UID: \"43acaee8-efc8-4156-b28c-b493f241ac53\") " pod="openshift-marketplace/certified-operators-dvvz2" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.140460 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43acaee8-efc8-4156-b28c-b493f241ac53-utilities\") pod \"certified-operators-dvvz2\" (UID: 
\"43acaee8-efc8-4156-b28c-b493f241ac53\") " pod="openshift-marketplace/certified-operators-dvvz2" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.140487 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43acaee8-efc8-4156-b28c-b493f241ac53-catalog-content\") pod \"certified-operators-dvvz2\" (UID: \"43acaee8-efc8-4156-b28c-b493f241ac53\") " pod="openshift-marketplace/certified-operators-dvvz2" Mar 13 13:59:56 crc kubenswrapper[4898]: E0313 13:59:56.140568 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:56.640554454 +0000 UTC m=+231.642142693 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.157934 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.165001 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.195633 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq"] Mar 13 13:59:56 crc kubenswrapper[4898]: E0313 13:59:56.195811 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1607f924-1e24-4848-b811-21ac3a7f8999" containerName="route-controller-manager" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.195823 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="1607f924-1e24-4848-b811-21ac3a7f8999" containerName="route-controller-manager" Mar 13 13:59:56 crc kubenswrapper[4898]: E0313 13:59:56.195835 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4372b422-23c7-46bc-aec4-aef665acbda1" containerName="controller-manager" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.195843 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="4372b422-23c7-46bc-aec4-aef665acbda1" containerName="controller-manager" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.196221 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="1607f924-1e24-4848-b811-21ac3a7f8999" containerName="route-controller-manager" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.196246 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="4372b422-23c7-46bc-aec4-aef665acbda1" containerName="controller-manager" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.196562 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.217224 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq"] Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.236157 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-twh8h" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.241053 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1607f924-1e24-4848-b811-21ac3a7f8999-client-ca\") pod \"1607f924-1e24-4848-b811-21ac3a7f8999\" (UID: \"1607f924-1e24-4848-b811-21ac3a7f8999\") " Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.241096 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkqfk\" (UniqueName: \"kubernetes.io/projected/1607f924-1e24-4848-b811-21ac3a7f8999-kube-api-access-bkqfk\") pod \"1607f924-1e24-4848-b811-21ac3a7f8999\" (UID: \"1607f924-1e24-4848-b811-21ac3a7f8999\") " Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.241144 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4372b422-23c7-46bc-aec4-aef665acbda1-config\") pod \"4372b422-23c7-46bc-aec4-aef665acbda1\" (UID: \"4372b422-23c7-46bc-aec4-aef665acbda1\") " Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.241167 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4372b422-23c7-46bc-aec4-aef665acbda1-proxy-ca-bundles\") pod \"4372b422-23c7-46bc-aec4-aef665acbda1\" (UID: \"4372b422-23c7-46bc-aec4-aef665acbda1\") " Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.241190 4898 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1607f924-1e24-4848-b811-21ac3a7f8999-serving-cert\") pod \"1607f924-1e24-4848-b811-21ac3a7f8999\" (UID: \"1607f924-1e24-4848-b811-21ac3a7f8999\") " Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.241220 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vv689\" (UniqueName: \"kubernetes.io/projected/4372b422-23c7-46bc-aec4-aef665acbda1-kube-api-access-vv689\") pod \"4372b422-23c7-46bc-aec4-aef665acbda1\" (UID: \"4372b422-23c7-46bc-aec4-aef665acbda1\") " Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.241254 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4372b422-23c7-46bc-aec4-aef665acbda1-client-ca\") pod \"4372b422-23c7-46bc-aec4-aef665acbda1\" (UID: \"4372b422-23c7-46bc-aec4-aef665acbda1\") " Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.241279 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1607f924-1e24-4848-b811-21ac3a7f8999-config\") pod \"1607f924-1e24-4848-b811-21ac3a7f8999\" (UID: \"1607f924-1e24-4848-b811-21ac3a7f8999\") " Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.241365 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4372b422-23c7-46bc-aec4-aef665acbda1-serving-cert\") pod \"4372b422-23c7-46bc-aec4-aef665acbda1\" (UID: \"4372b422-23c7-46bc-aec4-aef665acbda1\") " Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.241463 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhlq2\" (UniqueName: \"kubernetes.io/projected/43acaee8-efc8-4156-b28c-b493f241ac53-kube-api-access-zhlq2\") pod \"certified-operators-dvvz2\" (UID: 
\"43acaee8-efc8-4156-b28c-b493f241ac53\") " pod="openshift-marketplace/certified-operators-dvvz2" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.241488 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-client-ca\") pod \"route-controller-manager-779788b65f-vkvqq\" (UID: \"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd\") " pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.241526 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4gbm\" (UniqueName: \"kubernetes.io/projected/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-kube-api-access-n4gbm\") pod \"route-controller-manager-779788b65f-vkvqq\" (UID: \"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd\") " pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.241544 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-serving-cert\") pod \"route-controller-manager-779788b65f-vkvqq\" (UID: \"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd\") " pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.241576 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-config\") pod \"route-controller-manager-779788b65f-vkvqq\" (UID: \"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd\") " pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.241598 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43acaee8-efc8-4156-b28c-b493f241ac53-utilities\") pod \"certified-operators-dvvz2\" (UID: \"43acaee8-efc8-4156-b28c-b493f241ac53\") " pod="openshift-marketplace/certified-operators-dvvz2" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.241619 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.241637 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43acaee8-efc8-4156-b28c-b493f241ac53-catalog-content\") pod \"certified-operators-dvvz2\" (UID: \"43acaee8-efc8-4156-b28c-b493f241ac53\") " pod="openshift-marketplace/certified-operators-dvvz2" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.241986 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43acaee8-efc8-4156-b28c-b493f241ac53-catalog-content\") pod \"certified-operators-dvvz2\" (UID: \"43acaee8-efc8-4156-b28c-b493f241ac53\") " pod="openshift-marketplace/certified-operators-dvvz2" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.242532 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1607f924-1e24-4848-b811-21ac3a7f8999-client-ca" (OuterVolumeSpecName: "client-ca") pod "1607f924-1e24-4848-b811-21ac3a7f8999" (UID: "1607f924-1e24-4848-b811-21ac3a7f8999"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.244787 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4372b422-23c7-46bc-aec4-aef665acbda1-client-ca" (OuterVolumeSpecName: "client-ca") pod "4372b422-23c7-46bc-aec4-aef665acbda1" (UID: "4372b422-23c7-46bc-aec4-aef665acbda1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.244785 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1607f924-1e24-4848-b811-21ac3a7f8999-config" (OuterVolumeSpecName: "config") pod "1607f924-1e24-4848-b811-21ac3a7f8999" (UID: "1607f924-1e24-4848-b811-21ac3a7f8999"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.244817 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43acaee8-efc8-4156-b28c-b493f241ac53-utilities\") pod \"certified-operators-dvvz2\" (UID: \"43acaee8-efc8-4156-b28c-b493f241ac53\") " pod="openshift-marketplace/certified-operators-dvvz2" Mar 13 13:59:56 crc kubenswrapper[4898]: E0313 13:59:56.245096 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:56.745083488 +0000 UTC m=+231.746671727 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.245450 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4372b422-23c7-46bc-aec4-aef665acbda1-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4372b422-23c7-46bc-aec4-aef665acbda1" (UID: "4372b422-23c7-46bc-aec4-aef665acbda1"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.245831 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4372b422-23c7-46bc-aec4-aef665acbda1-config" (OuterVolumeSpecName: "config") pod "4372b422-23c7-46bc-aec4-aef665acbda1" (UID: "4372b422-23c7-46bc-aec4-aef665acbda1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.251683 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4372b422-23c7-46bc-aec4-aef665acbda1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4372b422-23c7-46bc-aec4-aef665acbda1" (UID: "4372b422-23c7-46bc-aec4-aef665acbda1"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.251779 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1607f924-1e24-4848-b811-21ac3a7f8999-kube-api-access-bkqfk" (OuterVolumeSpecName: "kube-api-access-bkqfk") pod "1607f924-1e24-4848-b811-21ac3a7f8999" (UID: "1607f924-1e24-4848-b811-21ac3a7f8999"). InnerVolumeSpecName "kube-api-access-bkqfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.262784 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4372b422-23c7-46bc-aec4-aef665acbda1-kube-api-access-vv689" (OuterVolumeSpecName: "kube-api-access-vv689") pod "4372b422-23c7-46bc-aec4-aef665acbda1" (UID: "4372b422-23c7-46bc-aec4-aef665acbda1"). InnerVolumeSpecName "kube-api-access-vv689". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.262847 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1607f924-1e24-4848-b811-21ac3a7f8999-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1607f924-1e24-4848-b811-21ac3a7f8999" (UID: "1607f924-1e24-4848-b811-21ac3a7f8999"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.268938 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhlq2\" (UniqueName: \"kubernetes.io/projected/43acaee8-efc8-4156-b28c-b493f241ac53-kube-api-access-zhlq2\") pod \"certified-operators-dvvz2\" (UID: \"43acaee8-efc8-4156-b28c-b493f241ac53\") " pod="openshift-marketplace/certified-operators-dvvz2" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.269944 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ppq6v"] Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.270744 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ppq6v" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.288273 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ppq6v"] Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.342650 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.342919 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a990881e-0caf-4096-a372-4cdad69006c1-utilities\") pod \"community-operators-ppq6v\" (UID: \"a990881e-0caf-4096-a372-4cdad69006c1\") " pod="openshift-marketplace/community-operators-ppq6v" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.342956 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-config\") pod \"route-controller-manager-779788b65f-vkvqq\" (UID: \"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd\") " pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.342975 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a990881e-0caf-4096-a372-4cdad69006c1-catalog-content\") pod \"community-operators-ppq6v\" (UID: \"a990881e-0caf-4096-a372-4cdad69006c1\") " pod="openshift-marketplace/community-operators-ppq6v" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.343024 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-client-ca\") pod \"route-controller-manager-779788b65f-vkvqq\" (UID: \"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd\") " pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.343067 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4gbm\" (UniqueName: \"kubernetes.io/projected/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-kube-api-access-n4gbm\") pod \"route-controller-manager-779788b65f-vkvqq\" (UID: \"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd\") " pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.343085 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7vmd\" (UniqueName: \"kubernetes.io/projected/a990881e-0caf-4096-a372-4cdad69006c1-kube-api-access-x7vmd\") pod \"community-operators-ppq6v\" (UID: \"a990881e-0caf-4096-a372-4cdad69006c1\") " pod="openshift-marketplace/community-operators-ppq6v" Mar 13 13:59:56 
crc kubenswrapper[4898]: I0313 13:59:56.343102 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-serving-cert\") pod \"route-controller-manager-779788b65f-vkvqq\" (UID: \"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd\") " pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.343137 4898 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1607f924-1e24-4848-b811-21ac3a7f8999-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.343149 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkqfk\" (UniqueName: \"kubernetes.io/projected/1607f924-1e24-4848-b811-21ac3a7f8999-kube-api-access-bkqfk\") on node \"crc\" DevicePath \"\"" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.343160 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4372b422-23c7-46bc-aec4-aef665acbda1-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.343170 4898 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4372b422-23c7-46bc-aec4-aef665acbda1-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.343179 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1607f924-1e24-4848-b811-21ac3a7f8999-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.343186 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vv689\" (UniqueName: \"kubernetes.io/projected/4372b422-23c7-46bc-aec4-aef665acbda1-kube-api-access-vv689\") on node 
\"crc\" DevicePath \"\"" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.343194 4898 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4372b422-23c7-46bc-aec4-aef665acbda1-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.343204 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1607f924-1e24-4848-b811-21ac3a7f8999-config\") on node \"crc\" DevicePath \"\"" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.343212 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4372b422-23c7-46bc-aec4-aef665acbda1-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 13:59:56 crc kubenswrapper[4898]: E0313 13:59:56.343561 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:56.843546717 +0000 UTC m=+231.845134956 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.344291 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-client-ca\") pod \"route-controller-manager-779788b65f-vkvqq\" (UID: \"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd\") " pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.344479 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-config\") pod \"route-controller-manager-779788b65f-vkvqq\" (UID: \"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd\") " pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.348233 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-serving-cert\") pod \"route-controller-manager-779788b65f-vkvqq\" (UID: \"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd\") " pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.370583 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4gbm\" (UniqueName: \"kubernetes.io/projected/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-kube-api-access-n4gbm\") pod 
\"route-controller-manager-779788b65f-vkvqq\" (UID: \"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd\") " pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.440415 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dvvz2" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.444987 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7vmd\" (UniqueName: \"kubernetes.io/projected/a990881e-0caf-4096-a372-4cdad69006c1-kube-api-access-x7vmd\") pod \"community-operators-ppq6v\" (UID: \"a990881e-0caf-4096-a372-4cdad69006c1\") " pod="openshift-marketplace/community-operators-ppq6v" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.445049 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a990881e-0caf-4096-a372-4cdad69006c1-utilities\") pod \"community-operators-ppq6v\" (UID: \"a990881e-0caf-4096-a372-4cdad69006c1\") " pod="openshift-marketplace/community-operators-ppq6v" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.445075 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a990881e-0caf-4096-a372-4cdad69006c1-catalog-content\") pod \"community-operators-ppq6v\" (UID: \"a990881e-0caf-4096-a372-4cdad69006c1\") " pod="openshift-marketplace/community-operators-ppq6v" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.445106 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:56 crc kubenswrapper[4898]: E0313 13:59:56.445463 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:56.945449658 +0000 UTC m=+231.947037897 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.446139 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a990881e-0caf-4096-a372-4cdad69006c1-utilities\") pod \"community-operators-ppq6v\" (UID: \"a990881e-0caf-4096-a372-4cdad69006c1\") " pod="openshift-marketplace/community-operators-ppq6v" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.446319 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a990881e-0caf-4096-a372-4cdad69006c1-catalog-content\") pod \"community-operators-ppq6v\" (UID: \"a990881e-0caf-4096-a372-4cdad69006c1\") " pod="openshift-marketplace/community-operators-ppq6v" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.470790 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7vmd\" (UniqueName: \"kubernetes.io/projected/a990881e-0caf-4096-a372-4cdad69006c1-kube-api-access-x7vmd\") pod \"community-operators-ppq6v\" (UID: \"a990881e-0caf-4096-a372-4cdad69006c1\") " 
pod="openshift-marketplace/community-operators-ppq6v" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.480938 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xh84s"] Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.482022 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xh84s" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.489514 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xh84s"] Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.521248 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.545322 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.546121 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.546311 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ae77efc-55ca-4eee-8817-9c21d0bafa6e-utilities\") pod \"certified-operators-xh84s\" (UID: \"4ae77efc-55ca-4eee-8817-9c21d0bafa6e\") " pod="openshift-marketplace/certified-operators-xh84s" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.546359 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-5nn9n\" (UniqueName: \"kubernetes.io/projected/4ae77efc-55ca-4eee-8817-9c21d0bafa6e-kube-api-access-5nn9n\") pod \"certified-operators-xh84s\" (UID: \"4ae77efc-55ca-4eee-8817-9c21d0bafa6e\") " pod="openshift-marketplace/certified-operators-xh84s" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.546399 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ae77efc-55ca-4eee-8817-9c21d0bafa6e-catalog-content\") pod \"certified-operators-xh84s\" (UID: \"4ae77efc-55ca-4eee-8817-9c21d0bafa6e\") " pod="openshift-marketplace/certified-operators-xh84s" Mar 13 13:59:56 crc kubenswrapper[4898]: E0313 13:59:56.546520 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:57.046506169 +0000 UTC m=+232.048094408 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.614154 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ppq6v" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.631685 4898 generic.go:334] "Generic (PLEG): container finished" podID="1607f924-1e24-4848-b811-21ac3a7f8999" containerID="4b7bee413921e7c7eec939e46b8977335a1baead35b676d6082b18bb10857c3e" exitCode=0 Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.631739 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4" event={"ID":"1607f924-1e24-4848-b811-21ac3a7f8999","Type":"ContainerDied","Data":"4b7bee413921e7c7eec939e46b8977335a1baead35b676d6082b18bb10857c3e"} Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.631802 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4" event={"ID":"1607f924-1e24-4848-b811-21ac3a7f8999","Type":"ContainerDied","Data":"ea436643cbf70a5d9613211ff3822168c4bccbc29674c27f320c9a9a2e6fcdbf"} Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.631801 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.631824 4898 scope.go:117] "RemoveContainer" containerID="4b7bee413921e7c7eec939e46b8977335a1baead35b676d6082b18bb10857c3e" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.635973 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 13:59:56 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld Mar 13 13:59:56 crc kubenswrapper[4898]: [+]process-running ok Mar 13 13:59:56 crc kubenswrapper[4898]: healthz check failed Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.636039 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.644430 4898 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.649992 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ae77efc-55ca-4eee-8817-9c21d0bafa6e-catalog-content\") pod \"certified-operators-xh84s\" (UID: \"4ae77efc-55ca-4eee-8817-9c21d0bafa6e\") " pod="openshift-marketplace/certified-operators-xh84s" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.650076 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.650139 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ae77efc-55ca-4eee-8817-9c21d0bafa6e-utilities\") pod \"certified-operators-xh84s\" (UID: \"4ae77efc-55ca-4eee-8817-9c21d0bafa6e\") " pod="openshift-marketplace/certified-operators-xh84s" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.650377 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nn9n\" (UniqueName: \"kubernetes.io/projected/4ae77efc-55ca-4eee-8817-9c21d0bafa6e-kube-api-access-5nn9n\") pod \"certified-operators-xh84s\" (UID: \"4ae77efc-55ca-4eee-8817-9c21d0bafa6e\") " pod="openshift-marketplace/certified-operators-xh84s" Mar 13 13:59:56 crc kubenswrapper[4898]: E0313 13:59:56.651434 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:57.151419082 +0000 UTC m=+232.153007321 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.651804 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ae77efc-55ca-4eee-8817-9c21d0bafa6e-utilities\") pod \"certified-operators-xh84s\" (UID: \"4ae77efc-55ca-4eee-8817-9c21d0bafa6e\") " pod="openshift-marketplace/certified-operators-xh84s" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.653166 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ae77efc-55ca-4eee-8817-9c21d0bafa6e-catalog-content\") pod \"certified-operators-xh84s\" (UID: \"4ae77efc-55ca-4eee-8817-9c21d0bafa6e\") " pod="openshift-marketplace/certified-operators-xh84s" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.672503 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rd22p" event={"ID":"41000ce4-1a84-44de-b283-1fe0350b1c17","Type":"ContainerStarted","Data":"7edbe6333ef083f7ac085e082e2b761601f88273dd2bfb4af58267d359fc3a4d"} Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.672766 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rd22p" event={"ID":"41000ce4-1a84-44de-b283-1fe0350b1c17","Type":"ContainerStarted","Data":"0e43a3819750890872b613d51ec964234dbb52d5010a001f26b9ad677d458ca0"} Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.679998 4898 generic.go:334] "Generic (PLEG): container finished" 
podID="4372b422-23c7-46bc-aec4-aef665acbda1" containerID="4b1fbbf5e660b92a0c53b78de1566661ef67eefec09d4d3752584910bdc778c9" exitCode=0 Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.680270 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.680626 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" event={"ID":"4372b422-23c7-46bc-aec4-aef665acbda1","Type":"ContainerDied","Data":"4b1fbbf5e660b92a0c53b78de1566661ef67eefec09d4d3752584910bdc778c9"} Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.680784 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pvbpt" event={"ID":"4372b422-23c7-46bc-aec4-aef665acbda1","Type":"ContainerDied","Data":"ee22c587d8deac6e60488ead6927720218bbd5dcec6d2870cf91f10d1559c75e"} Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.680631 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nn9n\" (UniqueName: \"kubernetes.io/projected/4ae77efc-55ca-4eee-8817-9c21d0bafa6e-kube-api-access-5nn9n\") pod \"certified-operators-xh84s\" (UID: \"4ae77efc-55ca-4eee-8817-9c21d0bafa6e\") " pod="openshift-marketplace/certified-operators-xh84s" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.708561 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.716818 4898 scope.go:117] "RemoveContainer" containerID="4b7bee413921e7c7eec939e46b8977335a1baead35b676d6082b18bb10857c3e" Mar 13 13:59:56 crc kubenswrapper[4898]: E0313 13:59:56.717304 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4b7bee413921e7c7eec939e46b8977335a1baead35b676d6082b18bb10857c3e\": container with ID starting with 4b7bee413921e7c7eec939e46b8977335a1baead35b676d6082b18bb10857c3e not found: ID does not exist" containerID="4b7bee413921e7c7eec939e46b8977335a1baead35b676d6082b18bb10857c3e" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.717371 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b7bee413921e7c7eec939e46b8977335a1baead35b676d6082b18bb10857c3e"} err="failed to get container status \"4b7bee413921e7c7eec939e46b8977335a1baead35b676d6082b18bb10857c3e\": rpc error: code = NotFound desc = could not find container \"4b7bee413921e7c7eec939e46b8977335a1baead35b676d6082b18bb10857c3e\": container with ID starting with 4b7bee413921e7c7eec939e46b8977335a1baead35b676d6082b18bb10857c3e not found: ID does not exist" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.717545 4898 scope.go:117] "RemoveContainer" containerID="4b1fbbf5e660b92a0c53b78de1566661ef67eefec09d4d3752584910bdc778c9" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.740821 4898 scope.go:117] "RemoveContainer" containerID="4b1fbbf5e660b92a0c53b78de1566661ef67eefec09d4d3752584910bdc778c9" Mar 13 13:59:56 crc kubenswrapper[4898]: E0313 13:59:56.742841 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b1fbbf5e660b92a0c53b78de1566661ef67eefec09d4d3752584910bdc778c9\": container with ID starting with 4b1fbbf5e660b92a0c53b78de1566661ef67eefec09d4d3752584910bdc778c9 not found: ID does not exist" containerID="4b1fbbf5e660b92a0c53b78de1566661ef67eefec09d4d3752584910bdc778c9" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.742961 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b1fbbf5e660b92a0c53b78de1566661ef67eefec09d4d3752584910bdc778c9"} err="failed to get container status 
\"4b1fbbf5e660b92a0c53b78de1566661ef67eefec09d4d3752584910bdc778c9\": rpc error: code = NotFound desc = could not find container \"4b1fbbf5e660b92a0c53b78de1566661ef67eefec09d4d3752584910bdc778c9\": container with ID starting with 4b1fbbf5e660b92a0c53b78de1566661ef67eefec09d4d3752584910bdc778c9 not found: ID does not exist" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.750867 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:56 crc kubenswrapper[4898]: E0313 13:59:56.752417 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:57.252402131 +0000 UTC m=+232.253990370 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.759630 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pvbpt"] Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.765455 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pvbpt"] Mar 13 13:59:56 crc kubenswrapper[4898]: E0313 13:59:56.772107 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1607f924_1e24_4848_b811_21ac3a7f8999.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1607f924_1e24_4848_b811_21ac3a7f8999.slice/crio-ea436643cbf70a5d9613211ff3822168c4bccbc29674c27f320c9a9a2e6fcdbf\": RecentStats: unable to find data in memory cache]" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.772878 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4"] Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.772996 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gwvk4"] Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.785019 4898 ???:1] "http: TLS handshake error from 192.168.126.11:44142: no serving certificate available for the kubelet" Mar 13 13:59:56 crc 
kubenswrapper[4898]: I0313 13:59:56.803986 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xh84s" Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.833121 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-twh8h"] Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.859099 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:56 crc kubenswrapper[4898]: E0313 13:59:56.859504 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 13:59:57.359490506 +0000 UTC m=+232.361078745 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6n228" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.956956 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq"] Mar 13 13:59:56 crc kubenswrapper[4898]: I0313 13:59:56.959765 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:56 crc kubenswrapper[4898]: E0313 13:59:56.960168 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 13:59:57.460151258 +0000 UTC m=+232.461739497 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.002979 4898 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-13T13:59:56.644446516Z","Handler":null,"Name":""} Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.006653 4898 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.006688 4898 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.019838 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ppq6v"] Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.043052 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dvvz2"] Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.061265 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.064629 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.064669 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.073517 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xh84s"] Mar 13 13:59:57 crc kubenswrapper[4898]: W0313 13:59:57.084789 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ae77efc_55ca_4eee_8817_9c21d0bafa6e.slice/crio-45fc69d27eaeb1e52f659215a5860c090893736d0fa5aca134749f73422aadc9 WatchSource:0}: Error finding container 45fc69d27eaeb1e52f659215a5860c090893736d0fa5aca134749f73422aadc9: Status 404 returned error can't find the container with id 45fc69d27eaeb1e52f659215a5860c090893736d0fa5aca134749f73422aadc9 Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.120026 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6n228\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.162878 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.171461 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.201681 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.423602 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6n228"] Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.613623 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 13:59:57 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld Mar 13 13:59:57 crc kubenswrapper[4898]: [+]process-running ok Mar 13 13:59:57 crc kubenswrapper[4898]: healthz check failed Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.613889 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.704485 4898 generic.go:334] "Generic (PLEG): container finished" podID="8f81bcfc-3c35-48e8-a584-961351e8c0e2" containerID="af2b26d62c785f829d6f729b05b9a482b0b3ab930c91a34e55f9b679910cf380" exitCode=0 Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.704547 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twh8h" event={"ID":"8f81bcfc-3c35-48e8-a584-961351e8c0e2","Type":"ContainerDied","Data":"af2b26d62c785f829d6f729b05b9a482b0b3ab930c91a34e55f9b679910cf380"} Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.704570 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twh8h" 
event={"ID":"8f81bcfc-3c35-48e8-a584-961351e8c0e2","Type":"ContainerStarted","Data":"81fb34feaf2adf00d5d07da217b484c8e9d6cdeb7a039901668613864eddf170"} Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.710368 4898 generic.go:334] "Generic (PLEG): container finished" podID="43acaee8-efc8-4156-b28c-b493f241ac53" containerID="2d6a6fc4d86890be4033989a74b2cd86971250c91cd72a349673fbbc352230cf" exitCode=0 Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.710529 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dvvz2" event={"ID":"43acaee8-efc8-4156-b28c-b493f241ac53","Type":"ContainerDied","Data":"2d6a6fc4d86890be4033989a74b2cd86971250c91cd72a349673fbbc352230cf"} Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.710560 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dvvz2" event={"ID":"43acaee8-efc8-4156-b28c-b493f241ac53","Type":"ContainerStarted","Data":"6d70382f54646dad1c6a01020a09851e8f00eda076ad91d5aba2e586ae668444"} Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.712133 4898 generic.go:334] "Generic (PLEG): container finished" podID="4ae77efc-55ca-4eee-8817-9c21d0bafa6e" containerID="359074a54fdd2abe01e5471c8009872f5ca05eb132b157ad005435e3bc55c0f9" exitCode=0 Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.712181 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xh84s" event={"ID":"4ae77efc-55ca-4eee-8817-9c21d0bafa6e","Type":"ContainerDied","Data":"359074a54fdd2abe01e5471c8009872f5ca05eb132b157ad005435e3bc55c0f9"} Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.712201 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xh84s" event={"ID":"4ae77efc-55ca-4eee-8817-9c21d0bafa6e","Type":"ContainerStarted","Data":"45fc69d27eaeb1e52f659215a5860c090893736d0fa5aca134749f73422aadc9"} Mar 13 13:59:57 crc kubenswrapper[4898]: 
I0313 13:59:57.715502 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rd22p" event={"ID":"41000ce4-1a84-44de-b283-1fe0350b1c17","Type":"ContainerStarted","Data":"3613fad8930f3f95230fbc9488748abf70ebc3c0e1bf2d4df0f2884fc85bed77"} Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.717498 4898 generic.go:334] "Generic (PLEG): container finished" podID="a990881e-0caf-4096-a372-4cdad69006c1" containerID="3f6d76254e697191c7e800bb760967dc1adfad9f4667e33eaf85c00c3d7a9263" exitCode=0 Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.717549 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ppq6v" event={"ID":"a990881e-0caf-4096-a372-4cdad69006c1","Type":"ContainerDied","Data":"3f6d76254e697191c7e800bb760967dc1adfad9f4667e33eaf85c00c3d7a9263"} Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.717569 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ppq6v" event={"ID":"a990881e-0caf-4096-a372-4cdad69006c1","Type":"ContainerStarted","Data":"3acac09dab7fc6e01d8b6bf7a368fc3881544da372e2f3a95826c1fc007510c2"} Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.721207 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq" event={"ID":"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd","Type":"ContainerStarted","Data":"a636a0496339c0fb58170ff34b55715db6af8e4ac5ce6fd1db5551669c78c588"} Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.721236 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq" event={"ID":"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd","Type":"ContainerStarted","Data":"a9e681aa51d97234d007d99a849ac6425a4e895b1487edcb5c9a6fe14935144f"} Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.722453 4898 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq" Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.728768 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6n228" event={"ID":"b08c305d-b9fc-4c5c-85c1-8281b9608bcf","Type":"ContainerStarted","Data":"f91957737e0395e724d38a589c69e42d1186dc2a931994cc08b0d0b2fc46b2b2"} Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.728810 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6n228" event={"ID":"b08c305d-b9fc-4c5c-85c1-8281b9608bcf","Type":"ContainerStarted","Data":"cb30f09f65c6668eae49d8e2a5f1518ff1c19e2eb8fcc21bf1f743165319e716"} Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.746461 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq" podStartSLOduration=4.746443417 podStartE2EDuration="4.746443417s" podCreationTimestamp="2026-03-13 13:59:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:57.743698882 +0000 UTC m=+232.745287141" watchObservedRunningTime="2026-03-13 13:59:57.746443417 +0000 UTC m=+232.748031656" Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.748996 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1607f924-1e24-4848-b811-21ac3a7f8999" path="/var/lib/kubelet/pods/1607f924-1e24-4848-b811-21ac3a7f8999/volumes" Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.749609 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4372b422-23c7-46bc-aec4-aef665acbda1" path="/var/lib/kubelet/pods/4372b422-23c7-46bc-aec4-aef665acbda1/volumes" Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.750226 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.789724 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-rd22p" podStartSLOduration=10.78970996 podStartE2EDuration="10.78970996s" podCreationTimestamp="2026-03-13 13:59:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:57.788358407 +0000 UTC m=+232.789946646" watchObservedRunningTime="2026-03-13 13:59:57.78970996 +0000 UTC m=+232.791298199" Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.810459 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq" Mar 13 13:59:57 crc kubenswrapper[4898]: I0313 13:59:57.848404 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-6n228" podStartSLOduration=166.848383299 podStartE2EDuration="2m46.848383299s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:57.825283018 +0000 UTC m=+232.826871267" watchObservedRunningTime="2026-03-13 13:59:57.848383299 +0000 UTC m=+232.849971538" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.068313 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h97c9"] Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.069207 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h97c9" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.070878 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.083265 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h97c9"] Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.183877 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5s4l\" (UniqueName: \"kubernetes.io/projected/f85f72a8-3887-4867-8a9c-649992ce23f1-kube-api-access-m5s4l\") pod \"redhat-marketplace-h97c9\" (UID: \"f85f72a8-3887-4867-8a9c-649992ce23f1\") " pod="openshift-marketplace/redhat-marketplace-h97c9" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.184019 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f85f72a8-3887-4867-8a9c-649992ce23f1-utilities\") pod \"redhat-marketplace-h97c9\" (UID: \"f85f72a8-3887-4867-8a9c-649992ce23f1\") " pod="openshift-marketplace/redhat-marketplace-h97c9" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.184054 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f85f72a8-3887-4867-8a9c-649992ce23f1-catalog-content\") pod \"redhat-marketplace-h97c9\" (UID: \"f85f72a8-3887-4867-8a9c-649992ce23f1\") " pod="openshift-marketplace/redhat-marketplace-h97c9" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.285621 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5s4l\" (UniqueName: \"kubernetes.io/projected/f85f72a8-3887-4867-8a9c-649992ce23f1-kube-api-access-m5s4l\") pod \"redhat-marketplace-h97c9\" (UID: 
\"f85f72a8-3887-4867-8a9c-649992ce23f1\") " pod="openshift-marketplace/redhat-marketplace-h97c9" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.285672 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f85f72a8-3887-4867-8a9c-649992ce23f1-utilities\") pod \"redhat-marketplace-h97c9\" (UID: \"f85f72a8-3887-4867-8a9c-649992ce23f1\") " pod="openshift-marketplace/redhat-marketplace-h97c9" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.285688 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f85f72a8-3887-4867-8a9c-649992ce23f1-catalog-content\") pod \"redhat-marketplace-h97c9\" (UID: \"f85f72a8-3887-4867-8a9c-649992ce23f1\") " pod="openshift-marketplace/redhat-marketplace-h97c9" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.286457 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f85f72a8-3887-4867-8a9c-649992ce23f1-utilities\") pod \"redhat-marketplace-h97c9\" (UID: \"f85f72a8-3887-4867-8a9c-649992ce23f1\") " pod="openshift-marketplace/redhat-marketplace-h97c9" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.286520 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f85f72a8-3887-4867-8a9c-649992ce23f1-catalog-content\") pod \"redhat-marketplace-h97c9\" (UID: \"f85f72a8-3887-4867-8a9c-649992ce23f1\") " pod="openshift-marketplace/redhat-marketplace-h97c9" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.306721 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5s4l\" (UniqueName: \"kubernetes.io/projected/f85f72a8-3887-4867-8a9c-649992ce23f1-kube-api-access-m5s4l\") pod \"redhat-marketplace-h97c9\" (UID: \"f85f72a8-3887-4867-8a9c-649992ce23f1\") " 
pod="openshift-marketplace/redhat-marketplace-h97c9" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.425964 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h97c9" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.470441 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hn9sl"] Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.471988 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hn9sl" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.498765 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hn9sl"] Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.617775 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 13:59:58 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld Mar 13 13:59:58 crc kubenswrapper[4898]: [+]process-running ok Mar 13 13:59:58 crc kubenswrapper[4898]: healthz check failed Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.617833 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.622243 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8bc0c30-71e1-41d2-8991-1ce9d85d50a1-utilities\") pod \"redhat-marketplace-hn9sl\" (UID: \"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1\") " 
pod="openshift-marketplace/redhat-marketplace-hn9sl" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.622298 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8bc0c30-71e1-41d2-8991-1ce9d85d50a1-catalog-content\") pod \"redhat-marketplace-hn9sl\" (UID: \"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1\") " pod="openshift-marketplace/redhat-marketplace-hn9sl" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.622342 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22rm7\" (UniqueName: \"kubernetes.io/projected/b8bc0c30-71e1-41d2-8991-1ce9d85d50a1-kube-api-access-22rm7\") pod \"redhat-marketplace-hn9sl\" (UID: \"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1\") " pod="openshift-marketplace/redhat-marketplace-hn9sl" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.688026 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.688729 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.691405 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.691456 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.697330 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.723588 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8bc0c30-71e1-41d2-8991-1ce9d85d50a1-utilities\") pod \"redhat-marketplace-hn9sl\" (UID: \"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1\") " pod="openshift-marketplace/redhat-marketplace-hn9sl" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.723979 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8bc0c30-71e1-41d2-8991-1ce9d85d50a1-catalog-content\") pod \"redhat-marketplace-hn9sl\" (UID: \"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1\") " pod="openshift-marketplace/redhat-marketplace-hn9sl" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.724035 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22rm7\" (UniqueName: \"kubernetes.io/projected/b8bc0c30-71e1-41d2-8991-1ce9d85d50a1-kube-api-access-22rm7\") pod \"redhat-marketplace-hn9sl\" (UID: \"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1\") " pod="openshift-marketplace/redhat-marketplace-hn9sl" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.724667 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/b8bc0c30-71e1-41d2-8991-1ce9d85d50a1-catalog-content\") pod \"redhat-marketplace-hn9sl\" (UID: \"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1\") " pod="openshift-marketplace/redhat-marketplace-hn9sl" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.724679 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8bc0c30-71e1-41d2-8991-1ce9d85d50a1-utilities\") pod \"redhat-marketplace-hn9sl\" (UID: \"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1\") " pod="openshift-marketplace/redhat-marketplace-hn9sl" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.741274 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.747564 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22rm7\" (UniqueName: \"kubernetes.io/projected/b8bc0c30-71e1-41d2-8991-1ce9d85d50a1-kube-api-access-22rm7\") pod \"redhat-marketplace-hn9sl\" (UID: \"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1\") " pod="openshift-marketplace/redhat-marketplace-hn9sl" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.761269 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6dc964fb55-scb8h"] Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.762052 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.768422 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.768606 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.768864 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.769121 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.769171 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.769486 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.774120 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.776946 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6dc964fb55-scb8h"] Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.824863 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b236b43-7ef1-4447-9182-2a37ee70fb95-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0b236b43-7ef1-4447-9182-2a37ee70fb95\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 
13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.824967 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b236b43-7ef1-4447-9182-2a37ee70fb95-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0b236b43-7ef1-4447-9182-2a37ee70fb95\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.838242 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hn9sl" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.927198 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b236b43-7ef1-4447-9182-2a37ee70fb95-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0b236b43-7ef1-4447-9182-2a37ee70fb95\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.927311 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b236b43-7ef1-4447-9182-2a37ee70fb95-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0b236b43-7ef1-4447-9182-2a37ee70fb95\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.927380 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab82b4-6104-416c-a69a-b942da8e5c21-config\") pod \"controller-manager-6dc964fb55-scb8h\" (UID: \"01ab82b4-6104-416c-a69a-b942da8e5c21\") " pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.927452 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-p6t65\" (UniqueName: \"kubernetes.io/projected/01ab82b4-6104-416c-a69a-b942da8e5c21-kube-api-access-p6t65\") pod \"controller-manager-6dc964fb55-scb8h\" (UID: \"01ab82b4-6104-416c-a69a-b942da8e5c21\") " pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.927508 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/01ab82b4-6104-416c-a69a-b942da8e5c21-client-ca\") pod \"controller-manager-6dc964fb55-scb8h\" (UID: \"01ab82b4-6104-416c-a69a-b942da8e5c21\") " pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.927563 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/01ab82b4-6104-416c-a69a-b942da8e5c21-proxy-ca-bundles\") pod \"controller-manager-6dc964fb55-scb8h\" (UID: \"01ab82b4-6104-416c-a69a-b942da8e5c21\") " pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.927615 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab82b4-6104-416c-a69a-b942da8e5c21-serving-cert\") pod \"controller-manager-6dc964fb55-scb8h\" (UID: \"01ab82b4-6104-416c-a69a-b942da8e5c21\") " pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.927773 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b236b43-7ef1-4447-9182-2a37ee70fb95-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0b236b43-7ef1-4447-9182-2a37ee70fb95\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 
13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.931763 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h97c9"] Mar 13 13:59:58 crc kubenswrapper[4898]: I0313 13:59:58.953629 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b236b43-7ef1-4447-9182-2a37ee70fb95-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0b236b43-7ef1-4447-9182-2a37ee70fb95\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 13:59:58 crc kubenswrapper[4898]: W0313 13:59:58.963169 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85f72a8_3887_4867_8a9c_649992ce23f1.slice/crio-da11d51940a63fb9fd52ba5896a1fe2ba45d932b66b7a36000029b7816a483fc WatchSource:0}: Error finding container da11d51940a63fb9fd52ba5896a1fe2ba45d932b66b7a36000029b7816a483fc: Status 404 returned error can't find the container with id da11d51940a63fb9fd52ba5896a1fe2ba45d932b66b7a36000029b7816a483fc Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.019772 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.029803 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab82b4-6104-416c-a69a-b942da8e5c21-config\") pod \"controller-manager-6dc964fb55-scb8h\" (UID: \"01ab82b4-6104-416c-a69a-b942da8e5c21\") " pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.029876 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6t65\" (UniqueName: \"kubernetes.io/projected/01ab82b4-6104-416c-a69a-b942da8e5c21-kube-api-access-p6t65\") pod \"controller-manager-6dc964fb55-scb8h\" (UID: \"01ab82b4-6104-416c-a69a-b942da8e5c21\") " pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.029928 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/01ab82b4-6104-416c-a69a-b942da8e5c21-client-ca\") pod \"controller-manager-6dc964fb55-scb8h\" (UID: \"01ab82b4-6104-416c-a69a-b942da8e5c21\") " pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.029970 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/01ab82b4-6104-416c-a69a-b942da8e5c21-proxy-ca-bundles\") pod \"controller-manager-6dc964fb55-scb8h\" (UID: \"01ab82b4-6104-416c-a69a-b942da8e5c21\") " pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.030008 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/01ab82b4-6104-416c-a69a-b942da8e5c21-serving-cert\") pod \"controller-manager-6dc964fb55-scb8h\" (UID: \"01ab82b4-6104-416c-a69a-b942da8e5c21\") " pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.031032 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/01ab82b4-6104-416c-a69a-b942da8e5c21-client-ca\") pod \"controller-manager-6dc964fb55-scb8h\" (UID: \"01ab82b4-6104-416c-a69a-b942da8e5c21\") " pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.032112 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/01ab82b4-6104-416c-a69a-b942da8e5c21-proxy-ca-bundles\") pod \"controller-manager-6dc964fb55-scb8h\" (UID: \"01ab82b4-6104-416c-a69a-b942da8e5c21\") " pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.032759 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab82b4-6104-416c-a69a-b942da8e5c21-config\") pod \"controller-manager-6dc964fb55-scb8h\" (UID: \"01ab82b4-6104-416c-a69a-b942da8e5c21\") " pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.046478 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab82b4-6104-416c-a69a-b942da8e5c21-serving-cert\") pod \"controller-manager-6dc964fb55-scb8h\" (UID: \"01ab82b4-6104-416c-a69a-b942da8e5c21\") " pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.049237 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-p6t65\" (UniqueName: \"kubernetes.io/projected/01ab82b4-6104-416c-a69a-b942da8e5c21-kube-api-access-p6t65\") pod \"controller-manager-6dc964fb55-scb8h\" (UID: \"01ab82b4-6104-416c-a69a-b942da8e5c21\") " pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.067850 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-974qp"] Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.068931 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-974qp" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.072535 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.076188 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-974qp"] Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.078364 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.168918 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hn9sl"] Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.232076 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/183d86e9-cd5c-45ed-a460-bb6169e07c72-catalog-content\") pod \"redhat-operators-974qp\" (UID: \"183d86e9-cd5c-45ed-a460-bb6169e07c72\") " pod="openshift-marketplace/redhat-operators-974qp" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.232136 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnv8v\" (UniqueName: \"kubernetes.io/projected/183d86e9-cd5c-45ed-a460-bb6169e07c72-kube-api-access-vnv8v\") pod \"redhat-operators-974qp\" (UID: \"183d86e9-cd5c-45ed-a460-bb6169e07c72\") " pod="openshift-marketplace/redhat-operators-974qp" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.232201 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/183d86e9-cd5c-45ed-a460-bb6169e07c72-utilities\") pod \"redhat-operators-974qp\" (UID: \"183d86e9-cd5c-45ed-a460-bb6169e07c72\") " pod="openshift-marketplace/redhat-operators-974qp" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.334189 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/183d86e9-cd5c-45ed-a460-bb6169e07c72-utilities\") pod \"redhat-operators-974qp\" (UID: \"183d86e9-cd5c-45ed-a460-bb6169e07c72\") " pod="openshift-marketplace/redhat-operators-974qp" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.334734 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/183d86e9-cd5c-45ed-a460-bb6169e07c72-catalog-content\") pod \"redhat-operators-974qp\" (UID: \"183d86e9-cd5c-45ed-a460-bb6169e07c72\") " pod="openshift-marketplace/redhat-operators-974qp" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.334774 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnv8v\" (UniqueName: \"kubernetes.io/projected/183d86e9-cd5c-45ed-a460-bb6169e07c72-kube-api-access-vnv8v\") pod \"redhat-operators-974qp\" (UID: \"183d86e9-cd5c-45ed-a460-bb6169e07c72\") " pod="openshift-marketplace/redhat-operators-974qp" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.336283 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/183d86e9-cd5c-45ed-a460-bb6169e07c72-utilities\") pod \"redhat-operators-974qp\" (UID: \"183d86e9-cd5c-45ed-a460-bb6169e07c72\") " pod="openshift-marketplace/redhat-operators-974qp" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.337596 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/183d86e9-cd5c-45ed-a460-bb6169e07c72-catalog-content\") pod \"redhat-operators-974qp\" (UID: \"183d86e9-cd5c-45ed-a460-bb6169e07c72\") " pod="openshift-marketplace/redhat-operators-974qp" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.367281 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnv8v\" (UniqueName: \"kubernetes.io/projected/183d86e9-cd5c-45ed-a460-bb6169e07c72-kube-api-access-vnv8v\") pod \"redhat-operators-974qp\" (UID: \"183d86e9-cd5c-45ed-a460-bb6169e07c72\") " pod="openshift-marketplace/redhat-operators-974qp" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.385970 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.406425 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-974qp" Mar 13 13:59:59 crc kubenswrapper[4898]: W0313 13:59:59.409161 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0b236b43_7ef1_4447_9182_2a37ee70fb95.slice/crio-fc10576c94c994ab6a2d2cd723a2db57c1b36baa9540adf4ca4a9e2c7f87f4bd WatchSource:0}: Error finding container fc10576c94c994ab6a2d2cd723a2db57c1b36baa9540adf4ca4a9e2c7f87f4bd: Status 404 returned error can't find the container with id fc10576c94c994ab6a2d2cd723a2db57c1b36baa9540adf4ca4a9e2c7f87f4bd Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.457504 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6dc964fb55-scb8h"] Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.473885 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-btkxt"] Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.475119 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-btkxt" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.487664 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-btkxt"] Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.610370 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-6plhg" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.614286 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 13:59:59 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld Mar 13 13:59:59 crc kubenswrapper[4898]: [+]process-running ok Mar 13 13:59:59 crc kubenswrapper[4898]: healthz check failed Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.614322 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.643653 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7794a943-5fec-485e-86bf-f104ed6ae070-catalog-content\") pod \"redhat-operators-btkxt\" (UID: \"7794a943-5fec-485e-86bf-f104ed6ae070\") " pod="openshift-marketplace/redhat-operators-btkxt" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.643722 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7794a943-5fec-485e-86bf-f104ed6ae070-utilities\") pod \"redhat-operators-btkxt\" (UID: 
\"7794a943-5fec-485e-86bf-f104ed6ae070\") " pod="openshift-marketplace/redhat-operators-btkxt" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.643964 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66sn5\" (UniqueName: \"kubernetes.io/projected/7794a943-5fec-485e-86bf-f104ed6ae070-kube-api-access-66sn5\") pod \"redhat-operators-btkxt\" (UID: \"7794a943-5fec-485e-86bf-f104ed6ae070\") " pod="openshift-marketplace/redhat-operators-btkxt" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.651425 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.651497 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.670308 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.744638 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7794a943-5fec-485e-86bf-f104ed6ae070-utilities\") pod \"redhat-operators-btkxt\" (UID: \"7794a943-5fec-485e-86bf-f104ed6ae070\") " pod="openshift-marketplace/redhat-operators-btkxt" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.744927 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66sn5\" (UniqueName: \"kubernetes.io/projected/7794a943-5fec-485e-86bf-f104ed6ae070-kube-api-access-66sn5\") pod \"redhat-operators-btkxt\" (UID: \"7794a943-5fec-485e-86bf-f104ed6ae070\") " pod="openshift-marketplace/redhat-operators-btkxt" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.744982 4898 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7794a943-5fec-485e-86bf-f104ed6ae070-catalog-content\") pod \"redhat-operators-btkxt\" (UID: \"7794a943-5fec-485e-86bf-f104ed6ae070\") " pod="openshift-marketplace/redhat-operators-btkxt" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.745657 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7794a943-5fec-485e-86bf-f104ed6ae070-utilities\") pod \"redhat-operators-btkxt\" (UID: \"7794a943-5fec-485e-86bf-f104ed6ae070\") " pod="openshift-marketplace/redhat-operators-btkxt" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.746504 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7794a943-5fec-485e-86bf-f104ed6ae070-catalog-content\") pod \"redhat-operators-btkxt\" (UID: \"7794a943-5fec-485e-86bf-f104ed6ae070\") " pod="openshift-marketplace/redhat-operators-btkxt" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.765650 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66sn5\" (UniqueName: \"kubernetes.io/projected/7794a943-5fec-485e-86bf-f104ed6ae070-kube-api-access-66sn5\") pod \"redhat-operators-btkxt\" (UID: \"7794a943-5fec-485e-86bf-f104ed6ae070\") " pod="openshift-marketplace/redhat-operators-btkxt" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.771209 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0b236b43-7ef1-4447-9182-2a37ee70fb95","Type":"ContainerStarted","Data":"fc10576c94c994ab6a2d2cd723a2db57c1b36baa9540adf4ca4a9e2c7f87f4bd"} Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.779479 4898 generic.go:334] "Generic (PLEG): container finished" podID="b8bc0c30-71e1-41d2-8991-1ce9d85d50a1" containerID="2528aa8f68ede75859cc4272c7396a63edf906831259593563c01d94a61ac7d1" exitCode=0 Mar 
13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.779619 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hn9sl" event={"ID":"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1","Type":"ContainerDied","Data":"2528aa8f68ede75859cc4272c7396a63edf906831259593563c01d94a61ac7d1"} Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.779646 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hn9sl" event={"ID":"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1","Type":"ContainerStarted","Data":"0b8d238e1855df1df599d5c20b2f8c47368ca041ea02bd9d799ff8595124e451"} Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.809003 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" event={"ID":"01ab82b4-6104-416c-a69a-b942da8e5c21","Type":"ContainerStarted","Data":"b272539ffbdc9c2f782b3545e70e2aeb30b2bdc9248fb2dbe91338ac3f59b06a"} Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.809048 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" event={"ID":"01ab82b4-6104-416c-a69a-b942da8e5c21","Type":"ContainerStarted","Data":"bed30031d55363e7b3a1c1f5fd11eab1b82568e99697e58e8ebc0b77de6db828"} Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.809521 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.812497 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-btkxt" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.813062 4898 patch_prober.go:28] interesting pod/controller-manager-6dc964fb55-scb8h container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" start-of-body= Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.813099 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" podUID="01ab82b4-6104-416c-a69a-b942da8e5c21" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.814388 4898 generic.go:334] "Generic (PLEG): container finished" podID="f85f72a8-3887-4867-8a9c-649992ce23f1" containerID="6f2184df8da4b3bf69f4145756b368ff2efd7bf87ea92af146fe995c57cb7485" exitCode=0 Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.815591 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h97c9" event={"ID":"f85f72a8-3887-4867-8a9c-649992ce23f1","Type":"ContainerDied","Data":"6f2184df8da4b3bf69f4145756b368ff2efd7bf87ea92af146fe995c57cb7485"} Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.815623 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h97c9" event={"ID":"f85f72a8-3887-4867-8a9c-649992ce23f1","Type":"ContainerStarted","Data":"da11d51940a63fb9fd52ba5896a1fe2ba45d932b66b7a36000029b7816a483fc"} Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.821100 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-z5vf2" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.827291 4898 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" podStartSLOduration=6.827275061 podStartE2EDuration="6.827275061s" podCreationTimestamp="2026-03-13 13:59:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 13:59:59.826710677 +0000 UTC m=+234.828298926" watchObservedRunningTime="2026-03-13 13:59:59.827275061 +0000 UTC m=+234.828863300" Mar 13 13:59:59 crc kubenswrapper[4898]: I0313 13:59:59.869343 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-974qp"] Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.101099 4898 patch_prober.go:28] interesting pod/downloads-7954f5f757-cx59b container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.101404 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-cx59b" podUID="f4f26c0f-992a-4eb4-86d2-58e42a5b2b68" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.101134 4898 patch_prober.go:28] interesting pod/downloads-7954f5f757-cx59b container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.101768 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-cx59b" podUID="f4f26c0f-992a-4eb4-86d2-58e42a5b2b68" containerName="download-server" probeResult="failure" 
output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.134412 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8"] Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.134642 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8" podUID="f52c1025-32e7-4eba-8af4-5c5cce1918da" containerName="collect-profiles" containerID="cri-o://f1f6a6de92d72265f3f98ec19ac9f23972e5558307ce72451eab813ae76ff6a4" gracePeriod=30 Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.141976 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556840-vmqqn"] Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.142680 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.142701 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.142827 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556840-vmqqn" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.144615 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.145242 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556840-sfz5h"] Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.146020 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556840-sfz5h" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.149731 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556840-vmqqn"] Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.155114 4898 patch_prober.go:28] interesting pod/console-f9d7485db-7l2pm container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.155181 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-7l2pm" podUID="0ea2e803-34d0-429b-b943-ece0b9e38b63" containerName="console" probeResult="failure" output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.156164 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.156802 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.171744 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.179253 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556840-sfz5h"] Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.264821 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtpqf\" (UniqueName: 
\"kubernetes.io/projected/c222126e-abe0-43e6-95c8-cc6946c967ae-kube-api-access-xtpqf\") pod \"collect-profiles-29556840-sfz5h\" (UID: \"c222126e-abe0-43e6-95c8-cc6946c967ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556840-sfz5h" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.265025 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlbbz\" (UniqueName: \"kubernetes.io/projected/4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce-kube-api-access-dlbbz\") pod \"auto-csr-approver-29556840-vmqqn\" (UID: \"4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce\") " pod="openshift-infra/auto-csr-approver-29556840-vmqqn" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.265068 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c222126e-abe0-43e6-95c8-cc6946c967ae-secret-volume\") pod \"collect-profiles-29556840-sfz5h\" (UID: \"c222126e-abe0-43e6-95c8-cc6946c967ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556840-sfz5h" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.265156 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c222126e-abe0-43e6-95c8-cc6946c967ae-config-volume\") pod \"collect-profiles-29556840-sfz5h\" (UID: \"c222126e-abe0-43e6-95c8-cc6946c967ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556840-sfz5h" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.367681 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c222126e-abe0-43e6-95c8-cc6946c967ae-config-volume\") pod \"collect-profiles-29556840-sfz5h\" (UID: \"c222126e-abe0-43e6-95c8-cc6946c967ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556840-sfz5h" Mar 13 14:00:00 
crc kubenswrapper[4898]: I0313 14:00:00.367741 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtpqf\" (UniqueName: \"kubernetes.io/projected/c222126e-abe0-43e6-95c8-cc6946c967ae-kube-api-access-xtpqf\") pod \"collect-profiles-29556840-sfz5h\" (UID: \"c222126e-abe0-43e6-95c8-cc6946c967ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556840-sfz5h" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.367809 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlbbz\" (UniqueName: \"kubernetes.io/projected/4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce-kube-api-access-dlbbz\") pod \"auto-csr-approver-29556840-vmqqn\" (UID: \"4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce\") " pod="openshift-infra/auto-csr-approver-29556840-vmqqn" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.367829 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c222126e-abe0-43e6-95c8-cc6946c967ae-secret-volume\") pod \"collect-profiles-29556840-sfz5h\" (UID: \"c222126e-abe0-43e6-95c8-cc6946c967ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556840-sfz5h" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.369202 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c222126e-abe0-43e6-95c8-cc6946c967ae-config-volume\") pod \"collect-profiles-29556840-sfz5h\" (UID: \"c222126e-abe0-43e6-95c8-cc6946c967ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556840-sfz5h" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.378681 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c222126e-abe0-43e6-95c8-cc6946c967ae-secret-volume\") pod \"collect-profiles-29556840-sfz5h\" (UID: \"c222126e-abe0-43e6-95c8-cc6946c967ae\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29556840-sfz5h" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.392777 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlbbz\" (UniqueName: \"kubernetes.io/projected/4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce-kube-api-access-dlbbz\") pod \"auto-csr-approver-29556840-vmqqn\" (UID: \"4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce\") " pod="openshift-infra/auto-csr-approver-29556840-vmqqn" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.427019 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtpqf\" (UniqueName: \"kubernetes.io/projected/c222126e-abe0-43e6-95c8-cc6946c967ae-kube-api-access-xtpqf\") pod \"collect-profiles-29556840-sfz5h\" (UID: \"c222126e-abe0-43e6-95c8-cc6946c967ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556840-sfz5h" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.478269 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556840-vmqqn" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.494251 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556840-sfz5h" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.598224 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.600194 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.603287 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.609522 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.626281 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.629208 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 14:00:00 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld Mar 13 14:00:00 crc kubenswrapper[4898]: [+]process-running ok Mar 13 14:00:00 crc kubenswrapper[4898]: healthz check failed Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.629271 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.672625 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/189d7154-fefa-48d1-b98f-5f86a30682b2-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"189d7154-fefa-48d1-b98f-5f86a30682b2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.672667 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/189d7154-fefa-48d1-b98f-5f86a30682b2-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"189d7154-fefa-48d1-b98f-5f86a30682b2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.774700 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/189d7154-fefa-48d1-b98f-5f86a30682b2-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"189d7154-fefa-48d1-b98f-5f86a30682b2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.774743 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/189d7154-fefa-48d1-b98f-5f86a30682b2-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"189d7154-fefa-48d1-b98f-5f86a30682b2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.774845 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/189d7154-fefa-48d1-b98f-5f86a30682b2-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"189d7154-fefa-48d1-b98f-5f86a30682b2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.802729 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/189d7154-fefa-48d1-b98f-5f86a30682b2-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"189d7154-fefa-48d1-b98f-5f86a30682b2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.858594 4898 generic.go:334] "Generic (PLEG): container finished" 
podID="f52c1025-32e7-4eba-8af4-5c5cce1918da" containerID="f1f6a6de92d72265f3f98ec19ac9f23972e5558307ce72451eab813ae76ff6a4" exitCode=0 Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.858668 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8" event={"ID":"f52c1025-32e7-4eba-8af4-5c5cce1918da","Type":"ContainerDied","Data":"f1f6a6de92d72265f3f98ec19ac9f23972e5558307ce72451eab813ae76ff6a4"} Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.862015 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0b236b43-7ef1-4447-9182-2a37ee70fb95","Type":"ContainerStarted","Data":"3b906d35fdabd57736b30e79e8733ee02d43842fe161c7f4d822cd34e8ed4f5a"} Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.870594 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.871475 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" Mar 13 14:00:00 crc kubenswrapper[4898]: I0313 14:00:00.923772 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 13 14:00:01 crc kubenswrapper[4898]: I0313 14:00:01.006847 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.006824703 podStartE2EDuration="3.006824703s" podCreationTimestamp="2026-03-13 13:59:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:00:01.003541804 +0000 UTC m=+236.005130043" watchObservedRunningTime="2026-03-13 14:00:01.006824703 +0000 UTC m=+236.008412942"
Mar 13 14:00:01 crc kubenswrapper[4898]: I0313 14:00:01.617373 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 13 14:00:01 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld
Mar 13 14:00:01 crc kubenswrapper[4898]: [+]process-running ok
Mar 13 14:00:01 crc kubenswrapper[4898]: healthz check failed
Mar 13 14:00:01 crc kubenswrapper[4898]: I0313 14:00:01.617443 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 13 14:00:01 crc kubenswrapper[4898]: I0313 14:00:01.892258 4898 generic.go:334] "Generic (PLEG): container finished" podID="0b236b43-7ef1-4447-9182-2a37ee70fb95" containerID="3b906d35fdabd57736b30e79e8733ee02d43842fe161c7f4d822cd34e8ed4f5a" exitCode=0
Mar 13 14:00:01 crc kubenswrapper[4898]: I0313 14:00:01.893096 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0b236b43-7ef1-4447-9182-2a37ee70fb95","Type":"ContainerDied","Data":"3b906d35fdabd57736b30e79e8733ee02d43842fe161c7f4d822cd34e8ed4f5a"}
Mar 13 14:00:01 crc kubenswrapper[4898]: I0313 14:00:01.934536 4898 ???:1] "http: TLS handshake error from 192.168.126.11:49624: no serving certificate available for the kubelet"
Mar 13 14:00:02 crc kubenswrapper[4898]: I0313 14:00:02.612050 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 13 14:00:02 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld
Mar 13 14:00:02 crc kubenswrapper[4898]: [+]process-running ok
Mar 13 14:00:02 crc kubenswrapper[4898]: healthz check failed
Mar 13 14:00:02 crc kubenswrapper[4898]: I0313 14:00:02.612136 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 13 14:00:03 crc kubenswrapper[4898]: I0313 14:00:03.612421 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 13 14:00:03 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld
Mar 13 14:00:03 crc kubenswrapper[4898]: [+]process-running ok
Mar 13 14:00:03 crc kubenswrapper[4898]: healthz check failed
Mar 13 14:00:03 crc kubenswrapper[4898]: I0313 14:00:03.612488 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 13 14:00:04 crc kubenswrapper[4898]: I0313 14:00:04.612435 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 13 14:00:04 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld
Mar 13 14:00:04 crc kubenswrapper[4898]: [+]process-running ok
Mar 13 14:00:04 crc kubenswrapper[4898]: healthz check failed
Mar 13 14:00:04 crc kubenswrapper[4898]: I0313 14:00:04.612502 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 13 14:00:05 crc kubenswrapper[4898]: I0313 14:00:05.474429 4898 ???:1] "http: TLS handshake error from 192.168.126.11:49636: no serving certificate available for the kubelet"
Mar 13 14:00:05 crc kubenswrapper[4898]: I0313 14:00:05.511689 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-hqcs6"
Mar 13 14:00:05 crc kubenswrapper[4898]: I0313 14:00:05.612950 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 13 14:00:05 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld
Mar 13 14:00:05 crc kubenswrapper[4898]: [+]process-running ok
Mar 13 14:00:05 crc kubenswrapper[4898]: healthz check failed
Mar 13 14:00:05 crc kubenswrapper[4898]: I0313 14:00:05.613017 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 13 14:00:06 crc kubenswrapper[4898]: I0313 14:00:06.612957 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 13 14:00:06 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld
Mar 13 14:00:06 crc kubenswrapper[4898]: [+]process-running ok
Mar 13 14:00:06 crc kubenswrapper[4898]: healthz check failed
Mar 13 14:00:06 crc kubenswrapper[4898]: I0313 14:00:06.613314 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 13 14:00:07 crc kubenswrapper[4898]: W0313 14:00:07.499088 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod183d86e9_cd5c_45ed_a460_bb6169e07c72.slice/crio-3597e8f057d81527e3da3a21c0723e2cabe95896c1c2879fe09fef6825e9aab7 WatchSource:0}: Error finding container 3597e8f057d81527e3da3a21c0723e2cabe95896c1c2879fe09fef6825e9aab7: Status 404 returned error can't find the container with id 3597e8f057d81527e3da3a21c0723e2cabe95896c1c2879fe09fef6825e9aab7
Mar 13 14:00:07 crc kubenswrapper[4898]: I0313 14:00:07.537010 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 13 14:00:07 crc kubenswrapper[4898]: I0313 14:00:07.613223 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 13 14:00:07 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld
Mar 13 14:00:07 crc kubenswrapper[4898]: [+]process-running ok
Mar 13 14:00:07 crc kubenswrapper[4898]: healthz check failed
Mar 13 14:00:07 crc kubenswrapper[4898]: I0313 14:00:07.613290 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 13 14:00:07 crc kubenswrapper[4898]: I0313 14:00:07.676168 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b236b43-7ef1-4447-9182-2a37ee70fb95-kube-api-access\") pod \"0b236b43-7ef1-4447-9182-2a37ee70fb95\" (UID: \"0b236b43-7ef1-4447-9182-2a37ee70fb95\") "
Mar 13 14:00:07 crc kubenswrapper[4898]: I0313 14:00:07.676372 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b236b43-7ef1-4447-9182-2a37ee70fb95-kubelet-dir\") pod \"0b236b43-7ef1-4447-9182-2a37ee70fb95\" (UID: \"0b236b43-7ef1-4447-9182-2a37ee70fb95\") "
Mar 13 14:00:07 crc kubenswrapper[4898]: I0313 14:00:07.676873 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0b236b43-7ef1-4447-9182-2a37ee70fb95-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0b236b43-7ef1-4447-9182-2a37ee70fb95" (UID: "0b236b43-7ef1-4447-9182-2a37ee70fb95"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 14:00:07 crc kubenswrapper[4898]: I0313 14:00:07.677580 4898 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0b236b43-7ef1-4447-9182-2a37ee70fb95-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 13 14:00:07 crc kubenswrapper[4898]: I0313 14:00:07.687238 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b236b43-7ef1-4447-9182-2a37ee70fb95-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b236b43-7ef1-4447-9182-2a37ee70fb95" (UID: "0b236b43-7ef1-4447-9182-2a37ee70fb95"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:00:07 crc kubenswrapper[4898]: I0313 14:00:07.779126 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b236b43-7ef1-4447-9182-2a37ee70fb95-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 13 14:00:07 crc kubenswrapper[4898]: I0313 14:00:07.944365 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-974qp" event={"ID":"183d86e9-cd5c-45ed-a460-bb6169e07c72","Type":"ContainerStarted","Data":"3597e8f057d81527e3da3a21c0723e2cabe95896c1c2879fe09fef6825e9aab7"}
Mar 13 14:00:07 crc kubenswrapper[4898]: I0313 14:00:07.946074 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0b236b43-7ef1-4447-9182-2a37ee70fb95","Type":"ContainerDied","Data":"fc10576c94c994ab6a2d2cd723a2db57c1b36baa9540adf4ca4a9e2c7f87f4bd"}
Mar 13 14:00:07 crc kubenswrapper[4898]: I0313 14:00:07.946128 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc10576c94c994ab6a2d2cd723a2db57c1b36baa9540adf4ca4a9e2c7f87f4bd"
Mar 13 14:00:07 crc kubenswrapper[4898]: I0313 14:00:07.946128 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 13 14:00:08 crc kubenswrapper[4898]: I0313 14:00:08.613785 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 13 14:00:08 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld
Mar 13 14:00:08 crc kubenswrapper[4898]: [+]process-running ok
Mar 13 14:00:08 crc kubenswrapper[4898]: healthz check failed
Mar 13 14:00:08 crc kubenswrapper[4898]: I0313 14:00:08.613851 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 13 14:00:09 crc kubenswrapper[4898]: I0313 14:00:09.612115 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 13 14:00:09 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld
Mar 13 14:00:09 crc kubenswrapper[4898]: [+]process-running ok
Mar 13 14:00:09 crc kubenswrapper[4898]: healthz check failed
Mar 13 14:00:09 crc kubenswrapper[4898]: I0313 14:00:09.612412 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 13 14:00:09 crc kubenswrapper[4898]: I0313 14:00:09.909891 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs\") pod \"network-metrics-daemon-fwrwc\" (UID: \"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\") " pod="openshift-multus/network-metrics-daemon-fwrwc"
Mar 13 14:00:09 crc kubenswrapper[4898]: I0313 14:00:09.911785 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 13 14:00:09 crc kubenswrapper[4898]: I0313 14:00:09.927078 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869-metrics-certs\") pod \"network-metrics-daemon-fwrwc\" (UID: \"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869\") " pod="openshift-multus/network-metrics-daemon-fwrwc"
Mar 13 14:00:09 crc kubenswrapper[4898]: I0313 14:00:09.958523 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 13 14:00:09 crc kubenswrapper[4898]: I0313 14:00:09.965815 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fwrwc"
Mar 13 14:00:10 crc kubenswrapper[4898]: I0313 14:00:10.104891 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-cx59b"
Mar 13 14:00:10 crc kubenswrapper[4898]: I0313 14:00:10.145747 4898 patch_prober.go:28] interesting pod/console-f9d7485db-7l2pm container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body=
Mar 13 14:00:10 crc kubenswrapper[4898]: I0313 14:00:10.146121 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-7l2pm" podUID="0ea2e803-34d0-429b-b943-ece0b9e38b63" containerName="console" probeResult="failure" output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused"
Mar 13 14:00:10 crc kubenswrapper[4898]: I0313 14:00:10.613849 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 13 14:00:10 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld
Mar 13 14:00:10 crc kubenswrapper[4898]: [+]process-running ok
Mar 13 14:00:10 crc kubenswrapper[4898]: healthz check failed
Mar 13 14:00:10 crc kubenswrapper[4898]: I0313 14:00:10.614190 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 13 14:00:11 crc kubenswrapper[4898]: I0313 14:00:11.628189 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 13 14:00:11 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld
Mar 13 14:00:11 crc kubenswrapper[4898]: [+]process-running ok
Mar 13 14:00:11 crc kubenswrapper[4898]: healthz check failed
Mar 13 14:00:11 crc kubenswrapper[4898]: I0313 14:00:11.628527 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 13 14:00:12 crc kubenswrapper[4898]: I0313 14:00:12.195007 4898 ???:1] "http: TLS handshake error from 192.168.126.11:51012: no serving certificate available for the kubelet"
Mar 13 14:00:12 crc kubenswrapper[4898]: I0313 14:00:12.613559 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 13 14:00:12 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld
Mar 13 14:00:12 crc kubenswrapper[4898]: [+]process-running ok
Mar 13 14:00:12 crc kubenswrapper[4898]: healthz check failed
Mar 13 14:00:12 crc kubenswrapper[4898]: I0313 14:00:12.613655 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 13 14:00:12 crc kubenswrapper[4898]: I0313 14:00:12.925739 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6dc964fb55-scb8h"]
Mar 13 14:00:12 crc kubenswrapper[4898]: I0313 14:00:12.926013 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" podUID="01ab82b4-6104-416c-a69a-b942da8e5c21" containerName="controller-manager" containerID="cri-o://b272539ffbdc9c2f782b3545e70e2aeb30b2bdc9248fb2dbe91338ac3f59b06a" gracePeriod=30
Mar 13 14:00:12 crc kubenswrapper[4898]: I0313 14:00:12.946082 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq"]
Mar 13 14:00:12 crc kubenswrapper[4898]: I0313 14:00:12.946316 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq" podUID="07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd" containerName="route-controller-manager" containerID="cri-o://a636a0496339c0fb58170ff34b55715db6af8e4ac5ce6fd1db5551669c78c588" gracePeriod=30
Mar 13 14:00:13 crc kubenswrapper[4898]: I0313 14:00:13.612592 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 13 14:00:13 crc kubenswrapper[4898]: [-]has-synced failed: reason withheld
Mar 13 14:00:13 crc kubenswrapper[4898]: [+]process-running ok
Mar 13 14:00:13 crc kubenswrapper[4898]: healthz check failed
Mar 13 14:00:13 crc kubenswrapper[4898]: I0313 14:00:13.612660 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 13 14:00:13 crc kubenswrapper[4898]: I0313 14:00:13.978510 4898 generic.go:334] "Generic (PLEG): container finished" podID="07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd" containerID="a636a0496339c0fb58170ff34b55715db6af8e4ac5ce6fd1db5551669c78c588" exitCode=0
Mar 13 14:00:13 crc kubenswrapper[4898]: I0313 14:00:13.978577 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq" event={"ID":"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd","Type":"ContainerDied","Data":"a636a0496339c0fb58170ff34b55715db6af8e4ac5ce6fd1db5551669c78c588"}
Mar 13 14:00:14 crc kubenswrapper[4898]: I0313 14:00:14.438014 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8"
Mar 13 14:00:14 crc kubenswrapper[4898]: I0313 14:00:14.584581 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f52c1025-32e7-4eba-8af4-5c5cce1918da-config-volume\") pod \"f52c1025-32e7-4eba-8af4-5c5cce1918da\" (UID: \"f52c1025-32e7-4eba-8af4-5c5cce1918da\") "
Mar 13 14:00:14 crc kubenswrapper[4898]: I0313 14:00:14.585311 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f52c1025-32e7-4eba-8af4-5c5cce1918da-secret-volume\") pod \"f52c1025-32e7-4eba-8af4-5c5cce1918da\" (UID: \"f52c1025-32e7-4eba-8af4-5c5cce1918da\") "
Mar 13 14:00:14 crc kubenswrapper[4898]: I0313 14:00:14.585454 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqn5q\" (UniqueName: \"kubernetes.io/projected/f52c1025-32e7-4eba-8af4-5c5cce1918da-kube-api-access-bqn5q\") pod \"f52c1025-32e7-4eba-8af4-5c5cce1918da\" (UID: \"f52c1025-32e7-4eba-8af4-5c5cce1918da\") "
Mar 13 14:00:14 crc kubenswrapper[4898]: I0313 14:00:14.586986 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f52c1025-32e7-4eba-8af4-5c5cce1918da-config-volume" (OuterVolumeSpecName: "config-volume") pod "f52c1025-32e7-4eba-8af4-5c5cce1918da" (UID: "f52c1025-32e7-4eba-8af4-5c5cce1918da"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:00:14 crc kubenswrapper[4898]: I0313 14:00:14.592546 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f52c1025-32e7-4eba-8af4-5c5cce1918da-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f52c1025-32e7-4eba-8af4-5c5cce1918da" (UID: "f52c1025-32e7-4eba-8af4-5c5cce1918da"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:00:14 crc kubenswrapper[4898]: I0313 14:00:14.593481 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f52c1025-32e7-4eba-8af4-5c5cce1918da-kube-api-access-bqn5q" (OuterVolumeSpecName: "kube-api-access-bqn5q") pod "f52c1025-32e7-4eba-8af4-5c5cce1918da" (UID: "f52c1025-32e7-4eba-8af4-5c5cce1918da"). InnerVolumeSpecName "kube-api-access-bqn5q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:00:14 crc kubenswrapper[4898]: I0313 14:00:14.618422 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-6plhg"
Mar 13 14:00:14 crc kubenswrapper[4898]: I0313 14:00:14.621072 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-6plhg"
Mar 13 14:00:14 crc kubenswrapper[4898]: I0313 14:00:14.687669 4898 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f52c1025-32e7-4eba-8af4-5c5cce1918da-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 13 14:00:14 crc kubenswrapper[4898]: I0313 14:00:14.687703 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqn5q\" (UniqueName: \"kubernetes.io/projected/f52c1025-32e7-4eba-8af4-5c5cce1918da-kube-api-access-bqn5q\") on node \"crc\" DevicePath \"\""
Mar 13 14:00:14 crc kubenswrapper[4898]: I0313 14:00:14.687716 4898 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f52c1025-32e7-4eba-8af4-5c5cce1918da-config-volume\") on node \"crc\" DevicePath \"\""
Mar 13 14:00:15 crc kubenswrapper[4898]: I0313 14:00:14.996422 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8" event={"ID":"f52c1025-32e7-4eba-8af4-5c5cce1918da","Type":"ContainerDied","Data":"423e6617bc8cc210d91629b1d5580f9b4f8c3137b80892ad74315408fb41680c"}
Mar 13 14:00:15 crc kubenswrapper[4898]: I0313 14:00:14.996480 4898 scope.go:117] "RemoveContainer" containerID="f1f6a6de92d72265f3f98ec19ac9f23972e5558307ce72451eab813ae76ff6a4"
Mar 13 14:00:15 crc kubenswrapper[4898]: I0313 14:00:14.996604 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8"
Mar 13 14:00:15 crc kubenswrapper[4898]: I0313 14:00:15.001283 4898 generic.go:334] "Generic (PLEG): container finished" podID="01ab82b4-6104-416c-a69a-b942da8e5c21" containerID="b272539ffbdc9c2f782b3545e70e2aeb30b2bdc9248fb2dbe91338ac3f59b06a" exitCode=0
Mar 13 14:00:15 crc kubenswrapper[4898]: I0313 14:00:15.001359 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" event={"ID":"01ab82b4-6104-416c-a69a-b942da8e5c21","Type":"ContainerDied","Data":"b272539ffbdc9c2f782b3545e70e2aeb30b2bdc9248fb2dbe91338ac3f59b06a"}
Mar 13 14:00:15 crc kubenswrapper[4898]: I0313 14:00:15.025170 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8"]
Mar 13 14:00:15 crc kubenswrapper[4898]: I0313 14:00:15.031269 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556825-92fd8"]
Mar 13 14:00:15 crc kubenswrapper[4898]: I0313 14:00:15.747738 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f52c1025-32e7-4eba-8af4-5c5cce1918da" path="/var/lib/kubelet/pods/f52c1025-32e7-4eba-8af4-5c5cce1918da/volumes"
Mar 13 14:00:17 crc kubenswrapper[4898]: I0313 14:00:17.208945 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-6n228"
Mar 13 14:00:17 crc kubenswrapper[4898]: I0313 14:00:17.546565 4898 patch_prober.go:28] interesting pod/route-controller-manager-779788b65f-vkvqq container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.47:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 14:00:17 crc kubenswrapper[4898]: I0313 14:00:17.546636 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq" podUID="07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.47:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 14:00:18 crc kubenswrapper[4898]: I0313 14:00:18.092747 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 13 14:00:19 crc kubenswrapper[4898]: I0313 14:00:19.134124 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 14:00:19 crc kubenswrapper[4898]: I0313 14:00:19.134216 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 14:00:20 crc kubenswrapper[4898]: I0313 14:00:20.080311 4898 patch_prober.go:28] interesting pod/controller-manager-6dc964fb55-scb8h container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 14:00:20 crc kubenswrapper[4898]: I0313 14:00:20.080629 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" podUID="01ab82b4-6104-416c-a69a-b942da8e5c21" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 14:00:20 crc kubenswrapper[4898]: I0313 14:00:20.168722 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-7l2pm"
Mar 13 14:00:20 crc kubenswrapper[4898]: I0313 14:00:20.172709 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-7l2pm"
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.839951 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h"
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.846148 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq"
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.877440 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-64446bcfb4-56ccg"]
Mar 13 14:00:26 crc kubenswrapper[4898]: E0313 14:00:26.877967 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b236b43-7ef1-4447-9182-2a37ee70fb95" containerName="pruner"
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.878003 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b236b43-7ef1-4447-9182-2a37ee70fb95" containerName="pruner"
Mar 13 14:00:26 crc kubenswrapper[4898]: E0313 14:00:26.878018 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f52c1025-32e7-4eba-8af4-5c5cce1918da" containerName="collect-profiles"
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.878025 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f52c1025-32e7-4eba-8af4-5c5cce1918da" containerName="collect-profiles"
Mar 13 14:00:26 crc kubenswrapper[4898]: E0313 14:00:26.878036 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01ab82b4-6104-416c-a69a-b942da8e5c21" containerName="controller-manager"
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.878041 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="01ab82b4-6104-416c-a69a-b942da8e5c21" containerName="controller-manager"
Mar 13 14:00:26 crc kubenswrapper[4898]: E0313 14:00:26.878053 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd" containerName="route-controller-manager"
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.878060 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd" containerName="route-controller-manager"
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.878194 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b236b43-7ef1-4447-9182-2a37ee70fb95" containerName="pruner"
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.878204 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd" containerName="route-controller-manager"
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.878243 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="01ab82b4-6104-416c-a69a-b942da8e5c21" containerName="controller-manager"
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.878253 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f52c1025-32e7-4eba-8af4-5c5cce1918da" containerName="collect-profiles"
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.878712 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64446bcfb4-56ccg"]
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.878822 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg"
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.961272 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/01ab82b4-6104-416c-a69a-b942da8e5c21-proxy-ca-bundles\") pod \"01ab82b4-6104-416c-a69a-b942da8e5c21\" (UID: \"01ab82b4-6104-416c-a69a-b942da8e5c21\") "
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.961335 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/01ab82b4-6104-416c-a69a-b942da8e5c21-client-ca\") pod \"01ab82b4-6104-416c-a69a-b942da8e5c21\" (UID: \"01ab82b4-6104-416c-a69a-b942da8e5c21\") "
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.961379 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-client-ca\") pod \"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd\" (UID: \"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd\") "
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.961402 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab82b4-6104-416c-a69a-b942da8e5c21-config\") pod \"01ab82b4-6104-416c-a69a-b942da8e5c21\" (UID: \"01ab82b4-6104-416c-a69a-b942da8e5c21\") "
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.961475 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6t65\" (UniqueName: \"kubernetes.io/projected/01ab82b4-6104-416c-a69a-b942da8e5c21-kube-api-access-p6t65\") pod \"01ab82b4-6104-416c-a69a-b942da8e5c21\" (UID: \"01ab82b4-6104-416c-a69a-b942da8e5c21\") "
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.961495 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-serving-cert\") pod \"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd\" (UID: \"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd\") "
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.961523 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4gbm\" (UniqueName: \"kubernetes.io/projected/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-kube-api-access-n4gbm\") pod \"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd\" (UID: \"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd\") "
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.961550 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-config\") pod \"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd\" (UID: \"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd\") "
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.961593 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab82b4-6104-416c-a69a-b942da8e5c21-serving-cert\") pod \"01ab82b4-6104-416c-a69a-b942da8e5c21\" (UID: \"01ab82b4-6104-416c-a69a-b942da8e5c21\") "
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.961773 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9fa4a89-d754-4f84-80be-a552772613dc-serving-cert\") pod \"controller-manager-64446bcfb4-56ccg\" (UID: \"a9fa4a89-d754-4f84-80be-a552772613dc\") " pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg"
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.961809 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9fa4a89-d754-4f84-80be-a552772613dc-config\") pod \"controller-manager-64446bcfb4-56ccg\" (UID: \"a9fa4a89-d754-4f84-80be-a552772613dc\") " pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg"
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.961887 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk7gg\" (UniqueName: \"kubernetes.io/projected/a9fa4a89-d754-4f84-80be-a552772613dc-kube-api-access-jk7gg\") pod \"controller-manager-64446bcfb4-56ccg\" (UID: \"a9fa4a89-d754-4f84-80be-a552772613dc\") " pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg"
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.961950 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9fa4a89-d754-4f84-80be-a552772613dc-client-ca\") pod \"controller-manager-64446bcfb4-56ccg\" (UID: \"a9fa4a89-d754-4f84-80be-a552772613dc\") " pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg"
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.961971 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9fa4a89-d754-4f84-80be-a552772613dc-proxy-ca-bundles\") pod \"controller-manager-64446bcfb4-56ccg\" (UID: \"a9fa4a89-d754-4f84-80be-a552772613dc\") " pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg"
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.963402 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-client-ca" (OuterVolumeSpecName: "client-ca") pod "07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd" (UID: "07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.963447 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab82b4-6104-416c-a69a-b942da8e5c21-client-ca" (OuterVolumeSpecName: "client-ca") pod "01ab82b4-6104-416c-a69a-b942da8e5c21" (UID: "01ab82b4-6104-416c-a69a-b942da8e5c21"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.963502 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab82b4-6104-416c-a69a-b942da8e5c21-config" (OuterVolumeSpecName: "config") pod "01ab82b4-6104-416c-a69a-b942da8e5c21" (UID: "01ab82b4-6104-416c-a69a-b942da8e5c21"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.963542 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab82b4-6104-416c-a69a-b942da8e5c21-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "01ab82b4-6104-416c-a69a-b942da8e5c21" (UID: "01ab82b4-6104-416c-a69a-b942da8e5c21"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.963866 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-config" (OuterVolumeSpecName: "config") pod "07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd" (UID: "07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd"). InnerVolumeSpecName "config".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.967504 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-kube-api-access-n4gbm" (OuterVolumeSpecName: "kube-api-access-n4gbm") pod "07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd" (UID: "07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd"). InnerVolumeSpecName "kube-api-access-n4gbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.967633 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab82b4-6104-416c-a69a-b942da8e5c21-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab82b4-6104-416c-a69a-b942da8e5c21" (UID: "01ab82b4-6104-416c-a69a-b942da8e5c21"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.967800 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab82b4-6104-416c-a69a-b942da8e5c21-kube-api-access-p6t65" (OuterVolumeSpecName: "kube-api-access-p6t65") pod "01ab82b4-6104-416c-a69a-b942da8e5c21" (UID: "01ab82b4-6104-416c-a69a-b942da8e5c21"). InnerVolumeSpecName "kube-api-access-p6t65". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:00:26 crc kubenswrapper[4898]: I0313 14:00:26.968349 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd" (UID: "07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.064056 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk7gg\" (UniqueName: \"kubernetes.io/projected/a9fa4a89-d754-4f84-80be-a552772613dc-kube-api-access-jk7gg\") pod \"controller-manager-64446bcfb4-56ccg\" (UID: \"a9fa4a89-d754-4f84-80be-a552772613dc\") " pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg" Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.064460 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9fa4a89-d754-4f84-80be-a552772613dc-client-ca\") pod \"controller-manager-64446bcfb4-56ccg\" (UID: \"a9fa4a89-d754-4f84-80be-a552772613dc\") " pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg" Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.064650 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9fa4a89-d754-4f84-80be-a552772613dc-proxy-ca-bundles\") pod \"controller-manager-64446bcfb4-56ccg\" (UID: \"a9fa4a89-d754-4f84-80be-a552772613dc\") " pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg" Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.066322 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9fa4a89-d754-4f84-80be-a552772613dc-client-ca\") pod \"controller-manager-64446bcfb4-56ccg\" (UID: \"a9fa4a89-d754-4f84-80be-a552772613dc\") " pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg" Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.066552 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9fa4a89-d754-4f84-80be-a552772613dc-proxy-ca-bundles\") pod 
\"controller-manager-64446bcfb4-56ccg\" (UID: \"a9fa4a89-d754-4f84-80be-a552772613dc\") " pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg" Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.067198 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9fa4a89-d754-4f84-80be-a552772613dc-serving-cert\") pod \"controller-manager-64446bcfb4-56ccg\" (UID: \"a9fa4a89-d754-4f84-80be-a552772613dc\") " pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg" Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.067486 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9fa4a89-d754-4f84-80be-a552772613dc-config\") pod \"controller-manager-64446bcfb4-56ccg\" (UID: \"a9fa4a89-d754-4f84-80be-a552772613dc\") " pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg" Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.071686 4898 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.071919 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab82b4-6104-416c-a69a-b942da8e5c21-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.072059 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6t65\" (UniqueName: \"kubernetes.io/projected/01ab82b4-6104-416c-a69a-b942da8e5c21-kube-api-access-p6t65\") on node \"crc\" DevicePath \"\"" Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.072205 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.072319 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4gbm\" (UniqueName: \"kubernetes.io/projected/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-kube-api-access-n4gbm\") on node \"crc\" DevicePath \"\"" Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.072456 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.072574 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab82b4-6104-416c-a69a-b942da8e5c21-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.072699 4898 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/01ab82b4-6104-416c-a69a-b942da8e5c21-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.072831 4898 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/01ab82b4-6104-416c-a69a-b942da8e5c21-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.069723 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9fa4a89-d754-4f84-80be-a552772613dc-config\") pod \"controller-manager-64446bcfb4-56ccg\" (UID: \"a9fa4a89-d754-4f84-80be-a552772613dc\") " pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg" Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.073394 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.073619 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dc964fb55-scb8h" event={"ID":"01ab82b4-6104-416c-a69a-b942da8e5c21","Type":"ContainerDied","Data":"bed30031d55363e7b3a1c1f5fd11eab1b82568e99697e58e8ebc0b77de6db828"} Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.075078 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq" event={"ID":"07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd","Type":"ContainerDied","Data":"a9e681aa51d97234d007d99a849ac6425a4e895b1487edcb5c9a6fe14935144f"} Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.075257 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq" Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.083663 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9fa4a89-d754-4f84-80be-a552772613dc-serving-cert\") pod \"controller-manager-64446bcfb4-56ccg\" (UID: \"a9fa4a89-d754-4f84-80be-a552772613dc\") " pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg" Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.092340 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk7gg\" (UniqueName: \"kubernetes.io/projected/a9fa4a89-d754-4f84-80be-a552772613dc-kube-api-access-jk7gg\") pod \"controller-manager-64446bcfb4-56ccg\" (UID: \"a9fa4a89-d754-4f84-80be-a552772613dc\") " pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg" Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.124631 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-6dc964fb55-scb8h"] Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.128433 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6dc964fb55-scb8h"] Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.140665 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq"] Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.143302 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq"] Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.199741 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg" Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.546589 4898 patch_prober.go:28] interesting pod/route-controller-manager-779788b65f-vkvqq container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.47:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.546676 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-779788b65f-vkvqq" podUID="07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.47:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.747270 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab82b4-6104-416c-a69a-b942da8e5c21" 
path="/var/lib/kubelet/pods/01ab82b4-6104-416c-a69a-b942da8e5c21/volumes" Mar 13 14:00:27 crc kubenswrapper[4898]: I0313 14:00:27.747870 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd" path="/var/lib/kubelet/pods/07d38b69-0d54-45ed-9c1c-bcf67b8ca5fd/volumes" Mar 13 14:00:27 crc kubenswrapper[4898]: E0313 14:00:27.794122 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 13 14:00:27 crc kubenswrapper[4898]: E0313 14:00:27.794360 4898 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 13 14:00:27 crc kubenswrapper[4898]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 13 14:00:27 crc kubenswrapper[4898]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-c7p5d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29556838-h7pkr_openshift-infra(aa1ed4c8-e4bd-4352-bee3-404f16244ea3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context 
canceled Mar 13 14:00:27 crc kubenswrapper[4898]: > logger="UnhandledError" Mar 13 14:00:27 crc kubenswrapper[4898]: E0313 14:00:27.795537 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29556838-h7pkr" podUID="aa1ed4c8-e4bd-4352-bee3-404f16244ea3" Mar 13 14:00:28 crc kubenswrapper[4898]: E0313 14:00:28.081522 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29556838-h7pkr" podUID="aa1ed4c8-e4bd-4352-bee3-404f16244ea3" Mar 13 14:00:29 crc kubenswrapper[4898]: E0313 14:00:29.796148 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 13 14:00:29 crc kubenswrapper[4898]: E0313 14:00:29.796311 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5nn9n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-xh84s_openshift-marketplace(4ae77efc-55ca-4eee-8817-9c21d0bafa6e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 14:00:29 crc kubenswrapper[4898]: E0313 14:00:29.797530 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-xh84s" podUID="4ae77efc-55ca-4eee-8817-9c21d0bafa6e" Mar 13 14:00:30 crc 
kubenswrapper[4898]: I0313 14:00:30.717457 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x4nxn" Mar 13 14:00:31 crc kubenswrapper[4898]: E0313 14:00:31.736479 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-xh84s" podUID="4ae77efc-55ca-4eee-8817-9c21d0bafa6e" Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.785526 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k"] Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.796477 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k" Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.800555 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.803270 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.813094 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.813623 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.822262 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 14:00:31 crc kubenswrapper[4898]: 
I0313 14:00:31.822420 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.827564 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k"] Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.828039 4898 scope.go:117] "RemoveContainer" containerID="b272539ffbdc9c2f782b3545e70e2aeb30b2bdc9248fb2dbe91338ac3f59b06a" Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.851562 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c46150e0-fd12-4e99-8de9-82630b55487b-client-ca\") pod \"route-controller-manager-786d64999b-pd42k\" (UID: \"c46150e0-fd12-4e99-8de9-82630b55487b\") " pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k" Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.851863 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lszm\" (UniqueName: \"kubernetes.io/projected/c46150e0-fd12-4e99-8de9-82630b55487b-kube-api-access-9lszm\") pod \"route-controller-manager-786d64999b-pd42k\" (UID: \"c46150e0-fd12-4e99-8de9-82630b55487b\") " pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k" Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.852023 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c46150e0-fd12-4e99-8de9-82630b55487b-config\") pod \"route-controller-manager-786d64999b-pd42k\" (UID: \"c46150e0-fd12-4e99-8de9-82630b55487b\") " pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k" Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.852205 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c46150e0-fd12-4e99-8de9-82630b55487b-serving-cert\") pod \"route-controller-manager-786d64999b-pd42k\" (UID: \"c46150e0-fd12-4e99-8de9-82630b55487b\") " pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k" Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.954007 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lszm\" (UniqueName: \"kubernetes.io/projected/c46150e0-fd12-4e99-8de9-82630b55487b-kube-api-access-9lszm\") pod \"route-controller-manager-786d64999b-pd42k\" (UID: \"c46150e0-fd12-4e99-8de9-82630b55487b\") " pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k" Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.954358 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c46150e0-fd12-4e99-8de9-82630b55487b-config\") pod \"route-controller-manager-786d64999b-pd42k\" (UID: \"c46150e0-fd12-4e99-8de9-82630b55487b\") " pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k" Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.957816 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c46150e0-fd12-4e99-8de9-82630b55487b-serving-cert\") pod \"route-controller-manager-786d64999b-pd42k\" (UID: \"c46150e0-fd12-4e99-8de9-82630b55487b\") " pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k" Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.957850 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c46150e0-fd12-4e99-8de9-82630b55487b-client-ca\") pod \"route-controller-manager-786d64999b-pd42k\" (UID: 
\"c46150e0-fd12-4e99-8de9-82630b55487b\") " pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k" Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.957692 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c46150e0-fd12-4e99-8de9-82630b55487b-config\") pod \"route-controller-manager-786d64999b-pd42k\" (UID: \"c46150e0-fd12-4e99-8de9-82630b55487b\") " pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k" Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.958862 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c46150e0-fd12-4e99-8de9-82630b55487b-client-ca\") pod \"route-controller-manager-786d64999b-pd42k\" (UID: \"c46150e0-fd12-4e99-8de9-82630b55487b\") " pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k" Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.964844 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c46150e0-fd12-4e99-8de9-82630b55487b-serving-cert\") pod \"route-controller-manager-786d64999b-pd42k\" (UID: \"c46150e0-fd12-4e99-8de9-82630b55487b\") " pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k" Mar 13 14:00:31 crc kubenswrapper[4898]: I0313 14:00:31.974557 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lszm\" (UniqueName: \"kubernetes.io/projected/c46150e0-fd12-4e99-8de9-82630b55487b-kube-api-access-9lszm\") pod \"route-controller-manager-786d64999b-pd42k\" (UID: \"c46150e0-fd12-4e99-8de9-82630b55487b\") " pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k" Mar 13 14:00:32 crc kubenswrapper[4898]: I0313 14:00:32.005565 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29556840-sfz5h"] Mar 13 14:00:32 crc kubenswrapper[4898]: I0313 14:00:32.048124 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-btkxt"] Mar 13 14:00:32 crc kubenswrapper[4898]: I0313 14:00:32.152375 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k" Mar 13 14:00:32 crc kubenswrapper[4898]: E0313 14:00:32.220103 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 13 14:00:32 crc kubenswrapper[4898]: E0313 14:00:32.220271 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5x728,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-twh8h_openshift-marketplace(8f81bcfc-3c35-48e8-a584-961351e8c0e2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 14:00:32 crc kubenswrapper[4898]: E0313 14:00:32.221830 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-twh8h" podUID="8f81bcfc-3c35-48e8-a584-961351e8c0e2" Mar 13 14:00:32 crc 
kubenswrapper[4898]: I0313 14:00:32.275414 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556840-vmqqn"] Mar 13 14:00:32 crc kubenswrapper[4898]: E0313 14:00:32.554057 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 13 14:00:32 crc kubenswrapper[4898]: E0313 14:00:32.554246 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x7vmd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToL
ogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-ppq6v_openshift-marketplace(a990881e-0caf-4096-a372-4cdad69006c1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 14:00:32 crc kubenswrapper[4898]: E0313 14:00:32.555605 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-ppq6v" podUID="a990881e-0caf-4096-a372-4cdad69006c1" Mar 13 14:00:32 crc kubenswrapper[4898]: E0313 14:00:32.739047 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 13 14:00:32 crc kubenswrapper[4898]: E0313 14:00:32.739204 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zhlq2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-dvvz2_openshift-marketplace(43acaee8-efc8-4156-b28c-b493f241ac53): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 14:00:32 crc kubenswrapper[4898]: E0313 14:00:32.740407 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-dvvz2" podUID="43acaee8-efc8-4156-b28c-b493f241ac53" Mar 13 14:00:32 crc 
kubenswrapper[4898]: I0313 14:00:32.916738 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-64446bcfb4-56ccg"] Mar 13 14:00:33 crc kubenswrapper[4898]: I0313 14:00:33.017758 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k"] Mar 13 14:00:33 crc kubenswrapper[4898]: I0313 14:00:33.198957 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 13 14:00:33 crc kubenswrapper[4898]: I0313 14:00:33.200068 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 14:00:33 crc kubenswrapper[4898]: I0313 14:00:33.207818 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 13 14:00:33 crc kubenswrapper[4898]: I0313 14:00:33.278108 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/111e79bc-00ab-488b-8d9d-862ce8581fa9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"111e79bc-00ab-488b-8d9d-862ce8581fa9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 14:00:33 crc kubenswrapper[4898]: I0313 14:00:33.278184 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/111e79bc-00ab-488b-8d9d-862ce8581fa9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"111e79bc-00ab-488b-8d9d-862ce8581fa9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 14:00:33 crc kubenswrapper[4898]: I0313 14:00:33.379429 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/111e79bc-00ab-488b-8d9d-862ce8581fa9-kubelet-dir\") pod \"revision-pruner-9-crc\" 
(UID: \"111e79bc-00ab-488b-8d9d-862ce8581fa9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 14:00:33 crc kubenswrapper[4898]: I0313 14:00:33.379508 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/111e79bc-00ab-488b-8d9d-862ce8581fa9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"111e79bc-00ab-488b-8d9d-862ce8581fa9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 14:00:33 crc kubenswrapper[4898]: I0313 14:00:33.379558 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/111e79bc-00ab-488b-8d9d-862ce8581fa9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"111e79bc-00ab-488b-8d9d-862ce8581fa9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 14:00:33 crc kubenswrapper[4898]: I0313 14:00:33.412132 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/111e79bc-00ab-488b-8d9d-862ce8581fa9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"111e79bc-00ab-488b-8d9d-862ce8581fa9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 14:00:33 crc kubenswrapper[4898]: I0313 14:00:33.530988 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 14:00:34 crc kubenswrapper[4898]: E0313 14:00:34.990149 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-twh8h" podUID="8f81bcfc-3c35-48e8-a584-961351e8c0e2" Mar 13 14:00:34 crc kubenswrapper[4898]: E0313 14:00:34.990174 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-dvvz2" podUID="43acaee8-efc8-4156-b28c-b493f241ac53" Mar 13 14:00:34 crc kubenswrapper[4898]: E0313 14:00:34.990385 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-ppq6v" podUID="a990881e-0caf-4096-a372-4cdad69006c1" Mar 13 14:00:34 crc kubenswrapper[4898]: W0313 14:00:34.996734 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc222126e_abe0_43e6_95c8_cc6946c967ae.slice/crio-a52e531bb6ba827a038762b2905cd91f8d32e0f4df6accde072de13447061bee WatchSource:0}: Error finding container a52e531bb6ba827a038762b2905cd91f8d32e0f4df6accde072de13447061bee: Status 404 returned error can't find the container with id a52e531bb6ba827a038762b2905cd91f8d32e0f4df6accde072de13447061bee Mar 13 14:00:35 crc kubenswrapper[4898]: W0313 14:00:35.002663 4898 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7794a943_5fec_485e_86bf_f104ed6ae070.slice/crio-9107b24c316c7c4b1a47858e8c7bbd33ee4b48e192b020a209e38129a6fd6f89 WatchSource:0}: Error finding container 9107b24c316c7c4b1a47858e8c7bbd33ee4b48e192b020a209e38129a6fd6f89: Status 404 returned error can't find the container with id 9107b24c316c7c4b1a47858e8c7bbd33ee4b48e192b020a209e38129a6fd6f89 Mar 13 14:00:35 crc kubenswrapper[4898]: I0313 14:00:35.035098 4898 scope.go:117] "RemoveContainer" containerID="a636a0496339c0fb58170ff34b55715db6af8e4ac5ce6fd1db5551669c78c588" Mar 13 14:00:35 crc kubenswrapper[4898]: I0313 14:00:35.116529 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-btkxt" event={"ID":"7794a943-5fec-485e-86bf-f104ed6ae070","Type":"ContainerStarted","Data":"9107b24c316c7c4b1a47858e8c7bbd33ee4b48e192b020a209e38129a6fd6f89"} Mar 13 14:00:35 crc kubenswrapper[4898]: I0313 14:00:35.127448 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556840-vmqqn" event={"ID":"4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce","Type":"ContainerStarted","Data":"4cdc004944646e848df2358e59a867264f56b2a1a5573319599edf0e300fc922"} Mar 13 14:00:35 crc kubenswrapper[4898]: I0313 14:00:35.134287 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556840-sfz5h" event={"ID":"c222126e-abe0-43e6-95c8-cc6946c967ae","Type":"ContainerStarted","Data":"a52e531bb6ba827a038762b2905cd91f8d32e0f4df6accde072de13447061bee"} Mar 13 14:00:35 crc kubenswrapper[4898]: I0313 14:00:35.250128 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 13 14:00:35 crc kubenswrapper[4898]: W0313 14:00:35.258207 4898 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-pod189d7154_fefa_48d1_b98f_5f86a30682b2.slice/crio-687b75fd7d1c7201c0a5fe3f1524aea3bd6ecd8ac0281603109401585ffb43a0 WatchSource:0}: Error finding container 687b75fd7d1c7201c0a5fe3f1524aea3bd6ecd8ac0281603109401585ffb43a0: Status 404 returned error can't find the container with id 687b75fd7d1c7201c0a5fe3f1524aea3bd6ecd8ac0281603109401585ffb43a0 Mar 13 14:00:35 crc kubenswrapper[4898]: I0313 14:00:35.389014 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fwrwc"] Mar 13 14:00:35 crc kubenswrapper[4898]: W0313 14:00:35.393259 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d1f9d54_7cbb_4233_b3ee_b8d5dfa42869.slice/crio-80e9d51772a8debf3118899299439977e87466dd200b3a2b95ff04013f9825c1 WatchSource:0}: Error finding container 80e9d51772a8debf3118899299439977e87466dd200b3a2b95ff04013f9825c1: Status 404 returned error can't find the container with id 80e9d51772a8debf3118899299439977e87466dd200b3a2b95ff04013f9825c1 Mar 13 14:00:35 crc kubenswrapper[4898]: I0313 14:00:35.517046 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-64446bcfb4-56ccg"] Mar 13 14:00:35 crc kubenswrapper[4898]: W0313 14:00:35.524012 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9fa4a89_d754_4f84_80be_a552772613dc.slice/crio-752fe552b8c17ef86982e87196efe9daa90c3035620e4b9c932c71cb6e722835 WatchSource:0}: Error finding container 752fe552b8c17ef86982e87196efe9daa90c3035620e4b9c932c71cb6e722835: Status 404 returned error can't find the container with id 752fe552b8c17ef86982e87196efe9daa90c3035620e4b9c932c71cb6e722835 Mar 13 14:00:35 crc kubenswrapper[4898]: I0313 14:00:35.551042 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k"] Mar 13 14:00:35 crc kubenswrapper[4898]: W0313 14:00:35.558877 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc46150e0_fd12_4e99_8de9_82630b55487b.slice/crio-4555d9ea560dda62fa3bb4e1189ec322eb21486dee3eb5d0e98385448c4e0a25 WatchSource:0}: Error finding container 4555d9ea560dda62fa3bb4e1189ec322eb21486dee3eb5d0e98385448c4e0a25: Status 404 returned error can't find the container with id 4555d9ea560dda62fa3bb4e1189ec322eb21486dee3eb5d0e98385448c4e0a25 Mar 13 14:00:35 crc kubenswrapper[4898]: I0313 14:00:35.628804 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.148201 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fwrwc" event={"ID":"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869","Type":"ContainerStarted","Data":"e4f9bc6b836d4b96018e048cdc85f5c276e4cfb89dda989403045d6c5ce31f83"} Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.148256 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fwrwc" event={"ID":"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869","Type":"ContainerStarted","Data":"80e9d51772a8debf3118899299439977e87466dd200b3a2b95ff04013f9825c1"} Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.149832 4898 generic.go:334] "Generic (PLEG): container finished" podID="c222126e-abe0-43e6-95c8-cc6946c967ae" containerID="dac7072f1900557a02d6c49c5a63ec387e9ac4b9e0b548d071acd63216fda826" exitCode=0 Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.149877 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556840-sfz5h" 
event={"ID":"c222126e-abe0-43e6-95c8-cc6946c967ae","Type":"ContainerDied","Data":"dac7072f1900557a02d6c49c5a63ec387e9ac4b9e0b548d071acd63216fda826"} Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.153285 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"111e79bc-00ab-488b-8d9d-862ce8581fa9","Type":"ContainerStarted","Data":"b6436c5b6d8f152e7a64185eb0822e10182ef56f7e00b5c3cd7e12ec9274c737"} Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.153341 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"111e79bc-00ab-488b-8d9d-862ce8581fa9","Type":"ContainerStarted","Data":"f600f28d004fed11e321fdd1224ecfbcb4d5947d8d946b747518b734300fe345"} Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.156583 4898 generic.go:334] "Generic (PLEG): container finished" podID="183d86e9-cd5c-45ed-a460-bb6169e07c72" containerID="7888e10b86e2b6b4b1a521af790a300e700cb557e90ae4215808993511904248" exitCode=0 Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.156952 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-974qp" event={"ID":"183d86e9-cd5c-45ed-a460-bb6169e07c72","Type":"ContainerDied","Data":"7888e10b86e2b6b4b1a521af790a300e700cb557e90ae4215808993511904248"} Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.164698 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"189d7154-fefa-48d1-b98f-5f86a30682b2","Type":"ContainerStarted","Data":"e650d82e1f53899ecfc7c509cbbcdd64f6c94d3e168b7097ea78182782e3bef2"} Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.164754 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"189d7154-fefa-48d1-b98f-5f86a30682b2","Type":"ContainerStarted","Data":"687b75fd7d1c7201c0a5fe3f1524aea3bd6ecd8ac0281603109401585ffb43a0"} Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.170080 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k" event={"ID":"c46150e0-fd12-4e99-8de9-82630b55487b","Type":"ContainerStarted","Data":"f5ee2f06681290cd6019466610f90492ac1d27b61c3786e195c3276f8b9bc87f"} Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.170130 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k" event={"ID":"c46150e0-fd12-4e99-8de9-82630b55487b","Type":"ContainerStarted","Data":"4555d9ea560dda62fa3bb4e1189ec322eb21486dee3eb5d0e98385448c4e0a25"} Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.170250 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k" podUID="c46150e0-fd12-4e99-8de9-82630b55487b" containerName="route-controller-manager" containerID="cri-o://f5ee2f06681290cd6019466610f90492ac1d27b61c3786e195c3276f8b9bc87f" gracePeriod=30 Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.170593 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k" Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.182600 4898 generic.go:334] "Generic (PLEG): container finished" podID="7794a943-5fec-485e-86bf-f104ed6ae070" containerID="b05ab9f2e4156ffc544ae5bf8d297fc15f45604caf9f75b4b1a59e033d78a2fc" exitCode=0 Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.182709 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-btkxt" 
event={"ID":"7794a943-5fec-485e-86bf-f104ed6ae070","Type":"ContainerDied","Data":"b05ab9f2e4156ffc544ae5bf8d297fc15f45604caf9f75b4b1a59e033d78a2fc"} Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.185023 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=36.185009494 podStartE2EDuration="36.185009494s" podCreationTimestamp="2026-03-13 14:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:00:36.182944855 +0000 UTC m=+271.184533104" watchObservedRunningTime="2026-03-13 14:00:36.185009494 +0000 UTC m=+271.186597743" Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.186388 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg" event={"ID":"a9fa4a89-d754-4f84-80be-a552772613dc","Type":"ContainerStarted","Data":"5450739f432e17c50d2eee20e629b8170cfc52fa713a03177856d3eacd247c1a"} Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.186427 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg" event={"ID":"a9fa4a89-d754-4f84-80be-a552772613dc","Type":"ContainerStarted","Data":"752fe552b8c17ef86982e87196efe9daa90c3035620e4b9c932c71cb6e722835"} Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.186543 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg" podUID="a9fa4a89-d754-4f84-80be-a552772613dc" containerName="controller-manager" containerID="cri-o://5450739f432e17c50d2eee20e629b8170cfc52fa713a03177856d3eacd247c1a" gracePeriod=30 Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.187166 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg" Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.199912 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg" Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.252512 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg" podStartSLOduration=24.252497334 podStartE2EDuration="24.252497334s" podCreationTimestamp="2026-03-13 14:00:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:00:36.24897334 +0000 UTC m=+271.250561609" watchObservedRunningTime="2026-03-13 14:00:36.252497334 +0000 UTC m=+271.254085573" Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.275770 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k" podStartSLOduration=24.275749129 podStartE2EDuration="24.275749129s" podCreationTimestamp="2026-03-13 14:00:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:00:36.275284058 +0000 UTC m=+271.276872337" watchObservedRunningTime="2026-03-13 14:00:36.275749129 +0000 UTC m=+271.277337368" Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.409103 4898 patch_prober.go:28] interesting pod/route-controller-manager-786d64999b-pd42k container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.61:8443/healthz\": read tcp 10.217.0.2:41770->10.217.0.61:8443: read: connection reset by peer" start-of-body= Mar 13 14:00:36 crc kubenswrapper[4898]: I0313 14:00:36.409152 4898 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k" podUID="c46150e0-fd12-4e99-8de9-82630b55487b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.61:8443/healthz\": read tcp 10.217.0.2:41770->10.217.0.61:8443: read: connection reset by peer" Mar 13 14:00:37 crc kubenswrapper[4898]: E0313 14:00:37.144842 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 13 14:00:37 crc kubenswrapper[4898]: E0313 14:00:37.145036 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-22rm7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,Ru
nAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-hn9sl_openshift-marketplace(b8bc0c30-71e1-41d2-8991-1ce9d85d50a1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 14:00:37 crc kubenswrapper[4898]: E0313 14:00:37.146643 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-hn9sl" podUID="b8bc0c30-71e1-41d2-8991-1ce9d85d50a1" Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.192698 4898 generic.go:334] "Generic (PLEG): container finished" podID="a9fa4a89-d754-4f84-80be-a552772613dc" containerID="5450739f432e17c50d2eee20e629b8170cfc52fa713a03177856d3eacd247c1a" exitCode=0 Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.192755 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg" event={"ID":"a9fa4a89-d754-4f84-80be-a552772613dc","Type":"ContainerDied","Data":"5450739f432e17c50d2eee20e629b8170cfc52fa713a03177856d3eacd247c1a"} Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.193970 4898 generic.go:334] "Generic (PLEG): container finished" podID="189d7154-fefa-48d1-b98f-5f86a30682b2" containerID="e650d82e1f53899ecfc7c509cbbcdd64f6c94d3e168b7097ea78182782e3bef2" exitCode=0 Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.194029 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"189d7154-fefa-48d1-b98f-5f86a30682b2","Type":"ContainerDied","Data":"e650d82e1f53899ecfc7c509cbbcdd64f6c94d3e168b7097ea78182782e3bef2"} Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.196282 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-786d64999b-pd42k_c46150e0-fd12-4e99-8de9-82630b55487b/route-controller-manager/0.log" Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.196410 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k" event={"ID":"c46150e0-fd12-4e99-8de9-82630b55487b","Type":"ContainerDied","Data":"f5ee2f06681290cd6019466610f90492ac1d27b61c3786e195c3276f8b9bc87f"} Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.196389 4898 generic.go:334] "Generic (PLEG): container finished" podID="c46150e0-fd12-4e99-8de9-82630b55487b" containerID="f5ee2f06681290cd6019466610f90492ac1d27b61c3786e195c3276f8b9bc87f" exitCode=255 Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.198942 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fwrwc" event={"ID":"9d1f9d54-7cbb-4233-b3ee-b8d5dfa42869","Type":"ContainerStarted","Data":"055dbff8a6f94d2747c73a8e2c33b0297f8dcaa3ef6f92b25fd67ee7af230e94"} Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.200072 4898 generic.go:334] "Generic (PLEG): container finished" podID="111e79bc-00ab-488b-8d9d-862ce8581fa9" containerID="b6436c5b6d8f152e7a64185eb0822e10182ef56f7e00b5c3cd7e12ec9274c737" exitCode=0 Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.200172 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"111e79bc-00ab-488b-8d9d-862ce8581fa9","Type":"ContainerDied","Data":"b6436c5b6d8f152e7a64185eb0822e10182ef56f7e00b5c3cd7e12ec9274c737"} Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.200368 4898 
patch_prober.go:28] interesting pod/controller-manager-64446bcfb4-56ccg container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" start-of-body= Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.200447 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg" podUID="a9fa4a89-d754-4f84-80be-a552772613dc" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.795596 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.797049 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.811517 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.836990 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/77480be5-9488-434e-8105-0fc9237cae46-var-lock\") pod \"installer-9-crc\" (UID: \"77480be5-9488-434e-8105-0fc9237cae46\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.837129 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/77480be5-9488-434e-8105-0fc9237cae46-kube-api-access\") pod \"installer-9-crc\" (UID: \"77480be5-9488-434e-8105-0fc9237cae46\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 
14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.837176 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/77480be5-9488-434e-8105-0fc9237cae46-kubelet-dir\") pod \"installer-9-crc\" (UID: \"77480be5-9488-434e-8105-0fc9237cae46\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.938876 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/77480be5-9488-434e-8105-0fc9237cae46-kube-api-access\") pod \"installer-9-crc\" (UID: \"77480be5-9488-434e-8105-0fc9237cae46\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.939072 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/77480be5-9488-434e-8105-0fc9237cae46-kubelet-dir\") pod \"installer-9-crc\" (UID: \"77480be5-9488-434e-8105-0fc9237cae46\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.939245 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/77480be5-9488-434e-8105-0fc9237cae46-var-lock\") pod \"installer-9-crc\" (UID: \"77480be5-9488-434e-8105-0fc9237cae46\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.939285 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/77480be5-9488-434e-8105-0fc9237cae46-kubelet-dir\") pod \"installer-9-crc\" (UID: \"77480be5-9488-434e-8105-0fc9237cae46\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.939390 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/77480be5-9488-434e-8105-0fc9237cae46-var-lock\") pod \"installer-9-crc\" (UID: \"77480be5-9488-434e-8105-0fc9237cae46\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 14:00:37 crc kubenswrapper[4898]: I0313 14:00:37.969747 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/77480be5-9488-434e-8105-0fc9237cae46-kube-api-access\") pod \"installer-9-crc\" (UID: \"77480be5-9488-434e-8105-0fc9237cae46\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 14:00:38 crc kubenswrapper[4898]: I0313 14:00:38.130632 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 13 14:00:38 crc kubenswrapper[4898]: I0313 14:00:38.227391 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-fwrwc" podStartSLOduration=207.22737001 podStartE2EDuration="3m27.22737001s" podCreationTimestamp="2026-03-13 13:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:00:38.224066711 +0000 UTC m=+273.225654950" watchObservedRunningTime="2026-03-13 14:00:38.22737001 +0000 UTC m=+273.228958249" Mar 13 14:00:39 crc kubenswrapper[4898]: E0313 14:00:39.107208 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-hn9sl" podUID="b8bc0c30-71e1-41d2-8991-1ce9d85d50a1" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.155148 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.161661 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556840-sfz5h" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.168236 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.220273 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556840-sfz5h" event={"ID":"c222126e-abe0-43e6-95c8-cc6946c967ae","Type":"ContainerDied","Data":"a52e531bb6ba827a038762b2905cd91f8d32e0f4df6accde072de13447061bee"} Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.220323 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a52e531bb6ba827a038762b2905cd91f8d32e0f4df6accde072de13447061bee" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.220379 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556840-sfz5h" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.225179 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"111e79bc-00ab-488b-8d9d-862ce8581fa9","Type":"ContainerDied","Data":"f600f28d004fed11e321fdd1224ecfbcb4d5947d8d946b747518b734300fe345"} Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.225220 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f600f28d004fed11e321fdd1224ecfbcb4d5947d8d946b747518b734300fe345" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.225231 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.226949 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"189d7154-fefa-48d1-b98f-5f86a30682b2","Type":"ContainerDied","Data":"687b75fd7d1c7201c0a5fe3f1524aea3bd6ecd8ac0281603109401585ffb43a0"} Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.226967 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="687b75fd7d1c7201c0a5fe3f1524aea3bd6ecd8ac0281603109401585ffb43a0" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.227016 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.259822 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c222126e-abe0-43e6-95c8-cc6946c967ae-secret-volume\") pod \"c222126e-abe0-43e6-95c8-cc6946c967ae\" (UID: \"c222126e-abe0-43e6-95c8-cc6946c967ae\") " Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.259874 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/189d7154-fefa-48d1-b98f-5f86a30682b2-kube-api-access\") pod \"189d7154-fefa-48d1-b98f-5f86a30682b2\" (UID: \"189d7154-fefa-48d1-b98f-5f86a30682b2\") " Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.259939 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/111e79bc-00ab-488b-8d9d-862ce8581fa9-kube-api-access\") pod \"111e79bc-00ab-488b-8d9d-862ce8581fa9\" (UID: \"111e79bc-00ab-488b-8d9d-862ce8581fa9\") " Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.259989 4898 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/189d7154-fefa-48d1-b98f-5f86a30682b2-kubelet-dir\") pod \"189d7154-fefa-48d1-b98f-5f86a30682b2\" (UID: \"189d7154-fefa-48d1-b98f-5f86a30682b2\") " Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.260033 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/111e79bc-00ab-488b-8d9d-862ce8581fa9-kubelet-dir\") pod \"111e79bc-00ab-488b-8d9d-862ce8581fa9\" (UID: \"111e79bc-00ab-488b-8d9d-862ce8581fa9\") " Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.260064 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtpqf\" (UniqueName: \"kubernetes.io/projected/c222126e-abe0-43e6-95c8-cc6946c967ae-kube-api-access-xtpqf\") pod \"c222126e-abe0-43e6-95c8-cc6946c967ae\" (UID: \"c222126e-abe0-43e6-95c8-cc6946c967ae\") " Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.260091 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c222126e-abe0-43e6-95c8-cc6946c967ae-config-volume\") pod \"c222126e-abe0-43e6-95c8-cc6946c967ae\" (UID: \"c222126e-abe0-43e6-95c8-cc6946c967ae\") " Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.260142 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/111e79bc-00ab-488b-8d9d-862ce8581fa9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "111e79bc-00ab-488b-8d9d-862ce8581fa9" (UID: "111e79bc-00ab-488b-8d9d-862ce8581fa9"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.260211 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/189d7154-fefa-48d1-b98f-5f86a30682b2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "189d7154-fefa-48d1-b98f-5f86a30682b2" (UID: "189d7154-fefa-48d1-b98f-5f86a30682b2"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.260469 4898 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/111e79bc-00ab-488b-8d9d-862ce8581fa9-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.260494 4898 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/189d7154-fefa-48d1-b98f-5f86a30682b2-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.260794 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c222126e-abe0-43e6-95c8-cc6946c967ae-config-volume" (OuterVolumeSpecName: "config-volume") pod "c222126e-abe0-43e6-95c8-cc6946c967ae" (UID: "c222126e-abe0-43e6-95c8-cc6946c967ae"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.265094 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/189d7154-fefa-48d1-b98f-5f86a30682b2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "189d7154-fefa-48d1-b98f-5f86a30682b2" (UID: "189d7154-fefa-48d1-b98f-5f86a30682b2"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.265873 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c222126e-abe0-43e6-95c8-cc6946c967ae-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c222126e-abe0-43e6-95c8-cc6946c967ae" (UID: "c222126e-abe0-43e6-95c8-cc6946c967ae"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.267649 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/111e79bc-00ab-488b-8d9d-862ce8581fa9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "111e79bc-00ab-488b-8d9d-862ce8581fa9" (UID: "111e79bc-00ab-488b-8d9d-862ce8581fa9"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.277087 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c222126e-abe0-43e6-95c8-cc6946c967ae-kube-api-access-xtpqf" (OuterVolumeSpecName: "kube-api-access-xtpqf") pod "c222126e-abe0-43e6-95c8-cc6946c967ae" (UID: "c222126e-abe0-43e6-95c8-cc6946c967ae"). InnerVolumeSpecName "kube-api-access-xtpqf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.362085 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtpqf\" (UniqueName: \"kubernetes.io/projected/c222126e-abe0-43e6-95c8-cc6946c967ae-kube-api-access-xtpqf\") on node \"crc\" DevicePath \"\"" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.362119 4898 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c222126e-abe0-43e6-95c8-cc6946c967ae-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.362128 4898 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c222126e-abe0-43e6-95c8-cc6946c967ae-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.362137 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/189d7154-fefa-48d1-b98f-5f86a30682b2-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.362146 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/111e79bc-00ab-488b-8d9d-862ce8581fa9-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.603163 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-786d64999b-pd42k_c46150e0-fd12-4e99-8de9-82630b55487b/route-controller-manager/0.log" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.603400 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.613455 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.650770 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.665616 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lszm\" (UniqueName: \"kubernetes.io/projected/c46150e0-fd12-4e99-8de9-82630b55487b-kube-api-access-9lszm\") pod \"c46150e0-fd12-4e99-8de9-82630b55487b\" (UID: \"c46150e0-fd12-4e99-8de9-82630b55487b\") " Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.665677 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9fa4a89-d754-4f84-80be-a552772613dc-client-ca\") pod \"a9fa4a89-d754-4f84-80be-a552772613dc\" (UID: \"a9fa4a89-d754-4f84-80be-a552772613dc\") " Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.665703 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9fa4a89-d754-4f84-80be-a552772613dc-config\") pod \"a9fa4a89-d754-4f84-80be-a552772613dc\" (UID: \"a9fa4a89-d754-4f84-80be-a552772613dc\") " Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.665722 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c46150e0-fd12-4e99-8de9-82630b55487b-serving-cert\") pod \"c46150e0-fd12-4e99-8de9-82630b55487b\" (UID: \"c46150e0-fd12-4e99-8de9-82630b55487b\") " Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.665767 4898 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c46150e0-fd12-4e99-8de9-82630b55487b-config\") pod \"c46150e0-fd12-4e99-8de9-82630b55487b\" (UID: \"c46150e0-fd12-4e99-8de9-82630b55487b\") " Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.665810 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jk7gg\" (UniqueName: \"kubernetes.io/projected/a9fa4a89-d754-4f84-80be-a552772613dc-kube-api-access-jk7gg\") pod \"a9fa4a89-d754-4f84-80be-a552772613dc\" (UID: \"a9fa4a89-d754-4f84-80be-a552772613dc\") " Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.665828 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9fa4a89-d754-4f84-80be-a552772613dc-serving-cert\") pod \"a9fa4a89-d754-4f84-80be-a552772613dc\" (UID: \"a9fa4a89-d754-4f84-80be-a552772613dc\") " Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.665843 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c46150e0-fd12-4e99-8de9-82630b55487b-client-ca\") pod \"c46150e0-fd12-4e99-8de9-82630b55487b\" (UID: \"c46150e0-fd12-4e99-8de9-82630b55487b\") " Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.665877 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9fa4a89-d754-4f84-80be-a552772613dc-proxy-ca-bundles\") pod \"a9fa4a89-d754-4f84-80be-a552772613dc\" (UID: \"a9fa4a89-d754-4f84-80be-a552772613dc\") " Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.666800 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c46150e0-fd12-4e99-8de9-82630b55487b-client-ca" (OuterVolumeSpecName: "client-ca") pod "c46150e0-fd12-4e99-8de9-82630b55487b" (UID: "c46150e0-fd12-4e99-8de9-82630b55487b"). 
InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.666825 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c46150e0-fd12-4e99-8de9-82630b55487b-config" (OuterVolumeSpecName: "config") pod "c46150e0-fd12-4e99-8de9-82630b55487b" (UID: "c46150e0-fd12-4e99-8de9-82630b55487b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.667278 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9fa4a89-d754-4f84-80be-a552772613dc-config" (OuterVolumeSpecName: "config") pod "a9fa4a89-d754-4f84-80be-a552772613dc" (UID: "a9fa4a89-d754-4f84-80be-a552772613dc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.667400 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9fa4a89-d754-4f84-80be-a552772613dc-client-ca" (OuterVolumeSpecName: "client-ca") pod "a9fa4a89-d754-4f84-80be-a552772613dc" (UID: "a9fa4a89-d754-4f84-80be-a552772613dc"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.667637 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c46150e0-fd12-4e99-8de9-82630b55487b-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.667665 4898 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c46150e0-fd12-4e99-8de9-82630b55487b-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.667682 4898 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9fa4a89-d754-4f84-80be-a552772613dc-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.667695 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9fa4a89-d754-4f84-80be-a552772613dc-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.670620 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9fa4a89-d754-4f84-80be-a552772613dc-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a9fa4a89-d754-4f84-80be-a552772613dc" (UID: "a9fa4a89-d754-4f84-80be-a552772613dc"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.670764 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c46150e0-fd12-4e99-8de9-82630b55487b-kube-api-access-9lszm" (OuterVolumeSpecName: "kube-api-access-9lszm") pod "c46150e0-fd12-4e99-8de9-82630b55487b" (UID: "c46150e0-fd12-4e99-8de9-82630b55487b"). InnerVolumeSpecName "kube-api-access-9lszm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.671025 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c46150e0-fd12-4e99-8de9-82630b55487b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c46150e0-fd12-4e99-8de9-82630b55487b" (UID: "c46150e0-fd12-4e99-8de9-82630b55487b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.671422 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9fa4a89-d754-4f84-80be-a552772613dc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a9fa4a89-d754-4f84-80be-a552772613dc" (UID: "a9fa4a89-d754-4f84-80be-a552772613dc"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.672107 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9fa4a89-d754-4f84-80be-a552772613dc-kube-api-access-jk7gg" (OuterVolumeSpecName: "kube-api-access-jk7gg") pod "a9fa4a89-d754-4f84-80be-a552772613dc" (UID: "a9fa4a89-d754-4f84-80be-a552772613dc"). InnerVolumeSpecName "kube-api-access-jk7gg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.768767 4898 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a9fa4a89-d754-4f84-80be-a552772613dc-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.768790 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lszm\" (UniqueName: \"kubernetes.io/projected/c46150e0-fd12-4e99-8de9-82630b55487b-kube-api-access-9lszm\") on node \"crc\" DevicePath \"\"" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.768799 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c46150e0-fd12-4e99-8de9-82630b55487b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.768810 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jk7gg\" (UniqueName: \"kubernetes.io/projected/a9fa4a89-d754-4f84-80be-a552772613dc-kube-api-access-jk7gg\") on node \"crc\" DevicePath \"\"" Mar 13 14:00:39 crc kubenswrapper[4898]: I0313 14:00:39.768818 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9fa4a89-d754-4f84-80be-a552772613dc-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 14:00:40 crc kubenswrapper[4898]: I0313 14:00:40.234487 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"77480be5-9488-434e-8105-0fc9237cae46","Type":"ContainerStarted","Data":"dd7ab09c8cacba65b6094ca53d6f8610851fabd3f291b2f1e2fc1acdbaf4b50f"} Mar 13 14:00:40 crc kubenswrapper[4898]: I0313 14:00:40.240105 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-786d64999b-pd42k_c46150e0-fd12-4e99-8de9-82630b55487b/route-controller-manager/0.log" Mar 13 14:00:40 crc kubenswrapper[4898]: I0313 14:00:40.240183 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k" event={"ID":"c46150e0-fd12-4e99-8de9-82630b55487b","Type":"ContainerDied","Data":"4555d9ea560dda62fa3bb4e1189ec322eb21486dee3eb5d0e98385448c4e0a25"} Mar 13 14:00:40 crc kubenswrapper[4898]: I0313 14:00:40.240216 4898 scope.go:117] "RemoveContainer" containerID="f5ee2f06681290cd6019466610f90492ac1d27b61c3786e195c3276f8b9bc87f" Mar 13 14:00:40 crc kubenswrapper[4898]: I0313 14:00:40.240254 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k" Mar 13 14:00:40 crc kubenswrapper[4898]: I0313 14:00:40.243229 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg" event={"ID":"a9fa4a89-d754-4f84-80be-a552772613dc","Type":"ContainerDied","Data":"752fe552b8c17ef86982e87196efe9daa90c3035620e4b9c932c71cb6e722835"} Mar 13 14:00:40 crc kubenswrapper[4898]: I0313 14:00:40.243356 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-64446bcfb4-56ccg" Mar 13 14:00:40 crc kubenswrapper[4898]: I0313 14:00:40.249353 4898 generic.go:334] "Generic (PLEG): container finished" podID="f85f72a8-3887-4867-8a9c-649992ce23f1" containerID="8a033fa272e9b0ae10a8a39302b03fd524ffa35265e29f3d0f8c05e19edc4d0d" exitCode=0 Mar 13 14:00:40 crc kubenswrapper[4898]: I0313 14:00:40.249423 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h97c9" event={"ID":"f85f72a8-3887-4867-8a9c-649992ce23f1","Type":"ContainerDied","Data":"8a033fa272e9b0ae10a8a39302b03fd524ffa35265e29f3d0f8c05e19edc4d0d"} Mar 13 14:00:40 crc kubenswrapper[4898]: I0313 14:00:40.283732 4898 scope.go:117] "RemoveContainer" containerID="5450739f432e17c50d2eee20e629b8170cfc52fa713a03177856d3eacd247c1a" Mar 13 14:00:40 crc kubenswrapper[4898]: I0313 14:00:40.286275 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k"] Mar 13 14:00:40 crc kubenswrapper[4898]: I0313 14:00:40.289259 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-786d64999b-pd42k"] Mar 13 14:00:40 crc kubenswrapper[4898]: I0313 14:00:40.301535 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-64446bcfb4-56ccg"] Mar 13 14:00:40 crc kubenswrapper[4898]: I0313 14:00:40.305068 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-64446bcfb4-56ccg"] Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.259394 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"77480be5-9488-434e-8105-0fc9237cae46","Type":"ContainerStarted","Data":"e6def40c01eb99d1e2a262735e66a81eae49e5b4d7cd72298170f434700dfefb"} Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 
14:00:41.286219 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=4.286194427 podStartE2EDuration="4.286194427s" podCreationTimestamp="2026-03-13 14:00:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:00:41.284607809 +0000 UTC m=+276.286196068" watchObservedRunningTime="2026-03-13 14:00:41.286194427 +0000 UTC m=+276.287782666" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.750687 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9fa4a89-d754-4f84-80be-a552772613dc" path="/var/lib/kubelet/pods/a9fa4a89-d754-4f84-80be-a552772613dc/volumes" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.751866 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c46150e0-fd12-4e99-8de9-82630b55487b" path="/var/lib/kubelet/pods/c46150e0-fd12-4e99-8de9-82630b55487b/volumes" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.790747 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-67c9866bb9-9276f"] Mar 13 14:00:41 crc kubenswrapper[4898]: E0313 14:00:41.793284 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c46150e0-fd12-4e99-8de9-82630b55487b" containerName="route-controller-manager" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.793310 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c46150e0-fd12-4e99-8de9-82630b55487b" containerName="route-controller-manager" Mar 13 14:00:41 crc kubenswrapper[4898]: E0313 14:00:41.793324 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="189d7154-fefa-48d1-b98f-5f86a30682b2" containerName="pruner" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.793329 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="189d7154-fefa-48d1-b98f-5f86a30682b2" 
containerName="pruner" Mar 13 14:00:41 crc kubenswrapper[4898]: E0313 14:00:41.793338 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9fa4a89-d754-4f84-80be-a552772613dc" containerName="controller-manager" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.793344 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9fa4a89-d754-4f84-80be-a552772613dc" containerName="controller-manager" Mar 13 14:00:41 crc kubenswrapper[4898]: E0313 14:00:41.793355 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c222126e-abe0-43e6-95c8-cc6946c967ae" containerName="collect-profiles" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.793361 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c222126e-abe0-43e6-95c8-cc6946c967ae" containerName="collect-profiles" Mar 13 14:00:41 crc kubenswrapper[4898]: E0313 14:00:41.793371 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="111e79bc-00ab-488b-8d9d-862ce8581fa9" containerName="pruner" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.793378 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="111e79bc-00ab-488b-8d9d-862ce8581fa9" containerName="pruner" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.793470 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="111e79bc-00ab-488b-8d9d-862ce8581fa9" containerName="pruner" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.793479 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9fa4a89-d754-4f84-80be-a552772613dc" containerName="controller-manager" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.793490 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="c46150e0-fd12-4e99-8de9-82630b55487b" containerName="route-controller-manager" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.793502 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="189d7154-fefa-48d1-b98f-5f86a30682b2" 
containerName="pruner" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.793510 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="c222126e-abe0-43e6-95c8-cc6946c967ae" containerName="collect-profiles" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.793859 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b"] Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.794350 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.795196 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.798314 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.798491 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.799215 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.799322 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.799435 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b"] Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.799709 4898 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.799768 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.799823 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.799941 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.800037 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.800063 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.799855 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.803932 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.805442 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-67c9866bb9-9276f"] Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.806657 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.898350 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-client-ca\") pod 
\"controller-manager-67c9866bb9-9276f\" (UID: \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\") " pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.898383 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-proxy-ca-bundles\") pod \"controller-manager-67c9866bb9-9276f\" (UID: \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\") " pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.898418 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnwjn\" (UniqueName: \"kubernetes.io/projected/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-kube-api-access-mnwjn\") pod \"controller-manager-67c9866bb9-9276f\" (UID: \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\") " pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.898631 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48b4d86a-9b94-4913-a35f-fd5e449ca40b-config\") pod \"route-controller-manager-7bcbd84b47-vdd2b\" (UID: \"48b4d86a-9b94-4913-a35f-fd5e449ca40b\") " pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.898648 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsdjp\" (UniqueName: \"kubernetes.io/projected/48b4d86a-9b94-4913-a35f-fd5e449ca40b-kube-api-access-gsdjp\") pod \"route-controller-manager-7bcbd84b47-vdd2b\" (UID: \"48b4d86a-9b94-4913-a35f-fd5e449ca40b\") " pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" Mar 13 14:00:41 crc 
kubenswrapper[4898]: I0313 14:00:41.898689 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/48b4d86a-9b94-4913-a35f-fd5e449ca40b-client-ca\") pod \"route-controller-manager-7bcbd84b47-vdd2b\" (UID: \"48b4d86a-9b94-4913-a35f-fd5e449ca40b\") " pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.898721 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-serving-cert\") pod \"controller-manager-67c9866bb9-9276f\" (UID: \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\") " pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.898753 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-config\") pod \"controller-manager-67c9866bb9-9276f\" (UID: \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\") " pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" Mar 13 14:00:41 crc kubenswrapper[4898]: I0313 14:00:41.898770 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48b4d86a-9b94-4913-a35f-fd5e449ca40b-serving-cert\") pod \"route-controller-manager-7bcbd84b47-vdd2b\" (UID: \"48b4d86a-9b94-4913-a35f-fd5e449ca40b\") " pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.000357 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-config\") pod 
\"controller-manager-67c9866bb9-9276f\" (UID: \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\") " pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.000398 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48b4d86a-9b94-4913-a35f-fd5e449ca40b-serving-cert\") pod \"route-controller-manager-7bcbd84b47-vdd2b\" (UID: \"48b4d86a-9b94-4913-a35f-fd5e449ca40b\") " pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.000428 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-proxy-ca-bundles\") pod \"controller-manager-67c9866bb9-9276f\" (UID: \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\") " pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.000445 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-client-ca\") pod \"controller-manager-67c9866bb9-9276f\" (UID: \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\") " pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.000477 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnwjn\" (UniqueName: \"kubernetes.io/projected/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-kube-api-access-mnwjn\") pod \"controller-manager-67c9866bb9-9276f\" (UID: \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\") " pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.000499 4898 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48b4d86a-9b94-4913-a35f-fd5e449ca40b-config\") pod \"route-controller-manager-7bcbd84b47-vdd2b\" (UID: \"48b4d86a-9b94-4913-a35f-fd5e449ca40b\") " pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.000517 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsdjp\" (UniqueName: \"kubernetes.io/projected/48b4d86a-9b94-4913-a35f-fd5e449ca40b-kube-api-access-gsdjp\") pod \"route-controller-manager-7bcbd84b47-vdd2b\" (UID: \"48b4d86a-9b94-4913-a35f-fd5e449ca40b\") " pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.000563 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/48b4d86a-9b94-4913-a35f-fd5e449ca40b-client-ca\") pod \"route-controller-manager-7bcbd84b47-vdd2b\" (UID: \"48b4d86a-9b94-4913-a35f-fd5e449ca40b\") " pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.000605 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-serving-cert\") pod \"controller-manager-67c9866bb9-9276f\" (UID: \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\") " pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.002376 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-client-ca\") pod \"controller-manager-67c9866bb9-9276f\" (UID: \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\") " 
pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.002601 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48b4d86a-9b94-4913-a35f-fd5e449ca40b-config\") pod \"route-controller-manager-7bcbd84b47-vdd2b\" (UID: \"48b4d86a-9b94-4913-a35f-fd5e449ca40b\") " pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.002663 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-proxy-ca-bundles\") pod \"controller-manager-67c9866bb9-9276f\" (UID: \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\") " pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.002957 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/48b4d86a-9b94-4913-a35f-fd5e449ca40b-client-ca\") pod \"route-controller-manager-7bcbd84b47-vdd2b\" (UID: \"48b4d86a-9b94-4913-a35f-fd5e449ca40b\") " pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.003512 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-config\") pod \"controller-manager-67c9866bb9-9276f\" (UID: \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\") " pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.009583 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48b4d86a-9b94-4913-a35f-fd5e449ca40b-serving-cert\") pod 
\"route-controller-manager-7bcbd84b47-vdd2b\" (UID: \"48b4d86a-9b94-4913-a35f-fd5e449ca40b\") " pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.015795 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-serving-cert\") pod \"controller-manager-67c9866bb9-9276f\" (UID: \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\") " pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.025256 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsdjp\" (UniqueName: \"kubernetes.io/projected/48b4d86a-9b94-4913-a35f-fd5e449ca40b-kube-api-access-gsdjp\") pod \"route-controller-manager-7bcbd84b47-vdd2b\" (UID: \"48b4d86a-9b94-4913-a35f-fd5e449ca40b\") " pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.033930 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnwjn\" (UniqueName: \"kubernetes.io/projected/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-kube-api-access-mnwjn\") pod \"controller-manager-67c9866bb9-9276f\" (UID: \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\") " pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.164522 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.168886 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.289243 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556840-vmqqn" event={"ID":"4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce","Type":"ContainerStarted","Data":"ce4b9269a12a5818cb6b78f9abcf90162aaab004f9bc6b1371639c51781f053a"} Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.295547 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h97c9" event={"ID":"f85f72a8-3887-4867-8a9c-649992ce23f1","Type":"ContainerStarted","Data":"cd39a5b62c38cd0e4291eb452dc66dbfa73d20085df0f04f24d87193b1c4faf8"} Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.304719 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556840-vmqqn" podStartSLOduration=35.715431971 podStartE2EDuration="42.304705776s" podCreationTimestamp="2026-03-13 14:00:00 +0000 UTC" firstStartedPulling="2026-03-13 14:00:35.039014403 +0000 UTC m=+270.040602652" lastFinishedPulling="2026-03-13 14:00:41.628288198 +0000 UTC m=+276.629876457" observedRunningTime="2026-03-13 14:00:42.301872468 +0000 UTC m=+277.303460707" watchObservedRunningTime="2026-03-13 14:00:42.304705776 +0000 UTC m=+277.306294015" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.328068 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h97c9" podStartSLOduration=10.185673494 podStartE2EDuration="44.328052703s" podCreationTimestamp="2026-03-13 13:59:58 +0000 UTC" firstStartedPulling="2026-03-13 14:00:07.48136777 +0000 UTC m=+242.482956019" lastFinishedPulling="2026-03-13 14:00:41.623746969 +0000 UTC m=+276.625335228" observedRunningTime="2026-03-13 14:00:42.326932816 +0000 UTC m=+277.328521075" watchObservedRunningTime="2026-03-13 14:00:42.328052703 +0000 UTC 
m=+277.329640942" Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.374979 4898 csr.go:261] certificate signing request csr-ps7l6 is approved, waiting to be issued Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.378469 4898 csr.go:257] certificate signing request csr-ps7l6 is issued Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.603021 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b"] Mar 13 14:00:42 crc kubenswrapper[4898]: W0313 14:00:42.611222 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48b4d86a_9b94_4913_a35f_fd5e449ca40b.slice/crio-13758db88f06f5e481700996753a013ebf1c45a10dc6dbacc0a63821fd99241f WatchSource:0}: Error finding container 13758db88f06f5e481700996753a013ebf1c45a10dc6dbacc0a63821fd99241f: Status 404 returned error can't find the container with id 13758db88f06f5e481700996753a013ebf1c45a10dc6dbacc0a63821fd99241f Mar 13 14:00:42 crc kubenswrapper[4898]: I0313 14:00:42.671018 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-67c9866bb9-9276f"] Mar 13 14:00:43 crc kubenswrapper[4898]: I0313 14:00:43.302499 4898 generic.go:334] "Generic (PLEG): container finished" podID="4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce" containerID="ce4b9269a12a5818cb6b78f9abcf90162aaab004f9bc6b1371639c51781f053a" exitCode=0 Mar 13 14:00:43 crc kubenswrapper[4898]: I0313 14:00:43.302597 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556840-vmqqn" event={"ID":"4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce","Type":"ContainerDied","Data":"ce4b9269a12a5818cb6b78f9abcf90162aaab004f9bc6b1371639c51781f053a"} Mar 13 14:00:43 crc kubenswrapper[4898]: I0313 14:00:43.304869 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" event={"ID":"48b4d86a-9b94-4913-a35f-fd5e449ca40b","Type":"ContainerStarted","Data":"22bb704ae349a00bd3ff54a939a69467adddc106eb993e03597abddaa460ee01"} Mar 13 14:00:43 crc kubenswrapper[4898]: I0313 14:00:43.304915 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" event={"ID":"48b4d86a-9b94-4913-a35f-fd5e449ca40b","Type":"ContainerStarted","Data":"13758db88f06f5e481700996753a013ebf1c45a10dc6dbacc0a63821fd99241f"} Mar 13 14:00:43 crc kubenswrapper[4898]: I0313 14:00:43.305112 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" Mar 13 14:00:43 crc kubenswrapper[4898]: I0313 14:00:43.306578 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" event={"ID":"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0","Type":"ContainerStarted","Data":"d5ad3dcf9d418253bf096607969322ddb59d415decdc8a397e768a53284c42b7"} Mar 13 14:00:43 crc kubenswrapper[4898]: I0313 14:00:43.306633 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" event={"ID":"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0","Type":"ContainerStarted","Data":"586304a4e2dd4eea4580413987b917096a6c4f3326c8482af26767ff95bd2378"} Mar 13 14:00:43 crc kubenswrapper[4898]: I0313 14:00:43.306653 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" Mar 13 14:00:43 crc kubenswrapper[4898]: I0313 14:00:43.307776 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556838-h7pkr" 
event={"ID":"aa1ed4c8-e4bd-4352-bee3-404f16244ea3","Type":"ContainerStarted","Data":"f3acfddb5fa32ce7ed2202cdd792a2b1d7de4b1d204fbdc39e6814928f1b0f60"} Mar 13 14:00:43 crc kubenswrapper[4898]: I0313 14:00:43.339073 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" Mar 13 14:00:43 crc kubenswrapper[4898]: I0313 14:00:43.343519 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" Mar 13 14:00:43 crc kubenswrapper[4898]: I0313 14:00:43.351000 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556838-h7pkr" podStartSLOduration=112.810475586 podStartE2EDuration="2m43.350991238s" podCreationTimestamp="2026-03-13 13:58:00 +0000 UTC" firstStartedPulling="2026-03-13 13:59:52.149855785 +0000 UTC m=+227.151444024" lastFinishedPulling="2026-03-13 14:00:42.690371437 +0000 UTC m=+277.691959676" observedRunningTime="2026-03-13 14:00:43.347859334 +0000 UTC m=+278.349447573" watchObservedRunningTime="2026-03-13 14:00:43.350991238 +0000 UTC m=+278.352579477" Mar 13 14:00:43 crc kubenswrapper[4898]: I0313 14:00:43.371267 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" podStartSLOduration=10.371250132 podStartE2EDuration="10.371250132s" podCreationTimestamp="2026-03-13 14:00:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:00:43.367313198 +0000 UTC m=+278.368901457" watchObservedRunningTime="2026-03-13 14:00:43.371250132 +0000 UTC m=+278.372838371" Mar 13 14:00:43 crc kubenswrapper[4898]: I0313 14:00:43.380663 4898 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 
+0000 UTC, rotation deadline is 2026-12-27 20:25:45.788217123 +0000 UTC Mar 13 14:00:43 crc kubenswrapper[4898]: I0313 14:00:43.380704 4898 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6942h25m2.407515676s for next certificate rotation Mar 13 14:00:43 crc kubenswrapper[4898]: I0313 14:00:43.390784 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" podStartSLOduration=11.390769297 podStartE2EDuration="11.390769297s" podCreationTimestamp="2026-03-13 14:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:00:43.3887817 +0000 UTC m=+278.390369949" watchObservedRunningTime="2026-03-13 14:00:43.390769297 +0000 UTC m=+278.392357536" Mar 13 14:00:44 crc kubenswrapper[4898]: I0313 14:00:44.318086 4898 generic.go:334] "Generic (PLEG): container finished" podID="aa1ed4c8-e4bd-4352-bee3-404f16244ea3" containerID="f3acfddb5fa32ce7ed2202cdd792a2b1d7de4b1d204fbdc39e6814928f1b0f60" exitCode=0 Mar 13 14:00:44 crc kubenswrapper[4898]: I0313 14:00:44.318186 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556838-h7pkr" event={"ID":"aa1ed4c8-e4bd-4352-bee3-404f16244ea3","Type":"ContainerDied","Data":"f3acfddb5fa32ce7ed2202cdd792a2b1d7de4b1d204fbdc39e6814928f1b0f60"} Mar 13 14:00:44 crc kubenswrapper[4898]: I0313 14:00:44.380965 4898 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-27 14:23:21.897099379 +0000 UTC Mar 13 14:00:44 crc kubenswrapper[4898]: I0313 14:00:44.381014 4898 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6216h22m37.516088167s for next certificate rotation Mar 13 14:00:44 crc kubenswrapper[4898]: I0313 14:00:44.600979 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556840-vmqqn" Mar 13 14:00:44 crc kubenswrapper[4898]: I0313 14:00:44.635169 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlbbz\" (UniqueName: \"kubernetes.io/projected/4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce-kube-api-access-dlbbz\") pod \"4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce\" (UID: \"4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce\") " Mar 13 14:00:44 crc kubenswrapper[4898]: I0313 14:00:44.643268 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce-kube-api-access-dlbbz" (OuterVolumeSpecName: "kube-api-access-dlbbz") pod "4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce" (UID: "4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce"). InnerVolumeSpecName "kube-api-access-dlbbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:00:44 crc kubenswrapper[4898]: I0313 14:00:44.741392 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlbbz\" (UniqueName: \"kubernetes.io/projected/4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce-kube-api-access-dlbbz\") on node \"crc\" DevicePath \"\"" Mar 13 14:00:45 crc kubenswrapper[4898]: I0313 14:00:45.340601 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556840-vmqqn" Mar 13 14:00:45 crc kubenswrapper[4898]: I0313 14:00:45.340669 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556840-vmqqn" event={"ID":"4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce","Type":"ContainerDied","Data":"4cdc004944646e848df2358e59a867264f56b2a1a5573319599edf0e300fc922"} Mar 13 14:00:45 crc kubenswrapper[4898]: I0313 14:00:45.340719 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cdc004944646e848df2358e59a867264f56b2a1a5573319599edf0e300fc922" Mar 13 14:00:45 crc kubenswrapper[4898]: I0313 14:00:45.628062 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556838-h7pkr" Mar 13 14:00:45 crc kubenswrapper[4898]: I0313 14:00:45.755761 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7p5d\" (UniqueName: \"kubernetes.io/projected/aa1ed4c8-e4bd-4352-bee3-404f16244ea3-kube-api-access-c7p5d\") pod \"aa1ed4c8-e4bd-4352-bee3-404f16244ea3\" (UID: \"aa1ed4c8-e4bd-4352-bee3-404f16244ea3\") " Mar 13 14:00:45 crc kubenswrapper[4898]: I0313 14:00:45.765885 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa1ed4c8-e4bd-4352-bee3-404f16244ea3-kube-api-access-c7p5d" (OuterVolumeSpecName: "kube-api-access-c7p5d") pod "aa1ed4c8-e4bd-4352-bee3-404f16244ea3" (UID: "aa1ed4c8-e4bd-4352-bee3-404f16244ea3"). InnerVolumeSpecName "kube-api-access-c7p5d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:00:45 crc kubenswrapper[4898]: I0313 14:00:45.857678 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7p5d\" (UniqueName: \"kubernetes.io/projected/aa1ed4c8-e4bd-4352-bee3-404f16244ea3-kube-api-access-c7p5d\") on node \"crc\" DevicePath \"\"" Mar 13 14:00:46 crc kubenswrapper[4898]: I0313 14:00:46.347428 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556838-h7pkr" event={"ID":"aa1ed4c8-e4bd-4352-bee3-404f16244ea3","Type":"ContainerDied","Data":"a7c95316f7425af660b27292d15441df4c43eb94ff232136664abdd1d5a272eb"} Mar 13 14:00:46 crc kubenswrapper[4898]: I0313 14:00:46.347986 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7c95316f7425af660b27292d15441df4c43eb94ff232136664abdd1d5a272eb" Mar 13 14:00:46 crc kubenswrapper[4898]: I0313 14:00:46.347622 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556838-h7pkr" Mar 13 14:00:46 crc kubenswrapper[4898]: I0313 14:00:46.349925 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xh84s" event={"ID":"4ae77efc-55ca-4eee-8817-9c21d0bafa6e","Type":"ContainerStarted","Data":"804bb12e6a7adbbf4efa1f3ec85b57e7a62babe514eec20e4355f14bea3bdf4a"} Mar 13 14:00:47 crc kubenswrapper[4898]: I0313 14:00:47.368620 4898 generic.go:334] "Generic (PLEG): container finished" podID="4ae77efc-55ca-4eee-8817-9c21d0bafa6e" containerID="804bb12e6a7adbbf4efa1f3ec85b57e7a62babe514eec20e4355f14bea3bdf4a" exitCode=0 Mar 13 14:00:47 crc kubenswrapper[4898]: I0313 14:00:47.368690 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xh84s" event={"ID":"4ae77efc-55ca-4eee-8817-9c21d0bafa6e","Type":"ContainerDied","Data":"804bb12e6a7adbbf4efa1f3ec85b57e7a62babe514eec20e4355f14bea3bdf4a"} Mar 13 14:00:47 crc 
kubenswrapper[4898]: I0313 14:00:47.368740 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xh84s" event={"ID":"4ae77efc-55ca-4eee-8817-9c21d0bafa6e","Type":"ContainerStarted","Data":"b5e3c3eb492b54ad0a87f08620309d02bb7883094244c1b703d4a244c8429e87"}
Mar 13 14:00:47 crc kubenswrapper[4898]: I0313 14:00:47.392106 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xh84s" podStartSLOduration=2.335606893 podStartE2EDuration="51.392070569s" podCreationTimestamp="2026-03-13 13:59:56 +0000 UTC" firstStartedPulling="2026-03-13 13:59:57.712988819 +0000 UTC m=+232.714577048" lastFinishedPulling="2026-03-13 14:00:46.769452485 +0000 UTC m=+281.771040724" observedRunningTime="2026-03-13 14:00:47.388279058 +0000 UTC m=+282.389867317" watchObservedRunningTime="2026-03-13 14:00:47.392070569 +0000 UTC m=+282.393658808"
Mar 13 14:00:48 crc kubenswrapper[4898]: I0313 14:00:48.384166 4898 generic.go:334] "Generic (PLEG): container finished" podID="43acaee8-efc8-4156-b28c-b493f241ac53" containerID="166addc84a00d6cd28f5d4a11eaa406e638fc978d5ab44d8f9525754ee76c77b" exitCode=0
Mar 13 14:00:48 crc kubenswrapper[4898]: I0313 14:00:48.384245 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dvvz2" event={"ID":"43acaee8-efc8-4156-b28c-b493f241ac53","Type":"ContainerDied","Data":"166addc84a00d6cd28f5d4a11eaa406e638fc978d5ab44d8f9525754ee76c77b"}
Mar 13 14:00:48 crc kubenswrapper[4898]: I0313 14:00:48.426860 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h97c9"
Mar 13 14:00:48 crc kubenswrapper[4898]: I0313 14:00:48.426954 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h97c9"
Mar 13 14:00:48 crc kubenswrapper[4898]: I0313 14:00:48.675688 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h97c9"
Mar 13 14:00:49 crc kubenswrapper[4898]: I0313 14:00:49.135120 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 14:00:49 crc kubenswrapper[4898]: I0313 14:00:49.135589 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 14:00:49 crc kubenswrapper[4898]: I0313 14:00:49.135664 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj"
Mar 13 14:00:49 crc kubenswrapper[4898]: I0313 14:00:49.138201 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56"} pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 13 14:00:49 crc kubenswrapper[4898]: I0313 14:00:49.138310 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" containerID="cri-o://8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56" gracePeriod=600
Mar 13 14:00:49 crc kubenswrapper[4898]: I0313 14:00:49.425352 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h97c9"
Mar 13 14:00:50 crc kubenswrapper[4898]: I0313 14:00:50.396095 4898 generic.go:334] "Generic (PLEG): container finished" podID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerID="8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56" exitCode=0
Mar 13 14:00:50 crc kubenswrapper[4898]: I0313 14:00:50.396170 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerDied","Data":"8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56"}
Mar 13 14:00:54 crc kubenswrapper[4898]: I0313 14:00:54.419157 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"ef8034867c7dd4fe3e16f610be3edcf45ba0ba5b7440cc5634ef7ce86e520b52"}
Mar 13 14:00:56 crc kubenswrapper[4898]: I0313 14:00:56.435494 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-974qp" event={"ID":"183d86e9-cd5c-45ed-a460-bb6169e07c72","Type":"ContainerStarted","Data":"4e3c2fc49e38fd08a1405311e1eced3f11241c9df7c680ba46b64e3f946fea47"}
Mar 13 14:00:56 crc kubenswrapper[4898]: I0313 14:00:56.441383 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-btkxt" event={"ID":"7794a943-5fec-485e-86bf-f104ed6ae070","Type":"ContainerStarted","Data":"67b3d784e0ee63e0bd8e175cf0b8537e1ca35f7833ddbd4c0468016c4030500b"}
Mar 13 14:00:56 crc kubenswrapper[4898]: I0313 14:00:56.452942 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dvvz2" event={"ID":"43acaee8-efc8-4156-b28c-b493f241ac53","Type":"ContainerStarted","Data":"8a40593eea81d6a95d388d6b35cd414db22d496cba0a3b511f6c3c4af3e4b8ec"}
Mar 13 14:00:56 crc kubenswrapper[4898]: I0313 14:00:56.478762 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dvvz2" podStartSLOduration=2.426755128 podStartE2EDuration="1m0.478736125s" podCreationTimestamp="2026-03-13 13:59:56 +0000 UTC" firstStartedPulling="2026-03-13 13:59:57.712081718 +0000 UTC m=+232.713669957" lastFinishedPulling="2026-03-13 14:00:55.764062715 +0000 UTC m=+290.765650954" observedRunningTime="2026-03-13 14:00:56.473200693 +0000 UTC m=+291.474788952" watchObservedRunningTime="2026-03-13 14:00:56.478736125 +0000 UTC m=+291.480324364"
Mar 13 14:00:56 crc kubenswrapper[4898]: I0313 14:00:56.804492 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xh84s"
Mar 13 14:00:56 crc kubenswrapper[4898]: I0313 14:00:56.804563 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xh84s"
Mar 13 14:00:56 crc kubenswrapper[4898]: I0313 14:00:56.838273 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xh84s"
Mar 13 14:00:57 crc kubenswrapper[4898]: I0313 14:00:57.459995 4898 generic.go:334] "Generic (PLEG): container finished" podID="183d86e9-cd5c-45ed-a460-bb6169e07c72" containerID="4e3c2fc49e38fd08a1405311e1eced3f11241c9df7c680ba46b64e3f946fea47" exitCode=0
Mar 13 14:00:57 crc kubenswrapper[4898]: I0313 14:00:57.460046 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-974qp" event={"ID":"183d86e9-cd5c-45ed-a460-bb6169e07c72","Type":"ContainerDied","Data":"4e3c2fc49e38fd08a1405311e1eced3f11241c9df7c680ba46b64e3f946fea47"}
Mar 13 14:00:57 crc kubenswrapper[4898]: I0313 14:00:57.462631 4898 generic.go:334] "Generic (PLEG): container finished" podID="7794a943-5fec-485e-86bf-f104ed6ae070" containerID="67b3d784e0ee63e0bd8e175cf0b8537e1ca35f7833ddbd4c0468016c4030500b" exitCode=0
Mar 13 14:00:57 crc kubenswrapper[4898]: I0313 14:00:57.462681 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-btkxt" event={"ID":"7794a943-5fec-485e-86bf-f104ed6ae070","Type":"ContainerDied","Data":"67b3d784e0ee63e0bd8e175cf0b8537e1ca35f7833ddbd4c0468016c4030500b"}
Mar 13 14:00:57 crc kubenswrapper[4898]: I0313 14:00:57.466189 4898 generic.go:334] "Generic (PLEG): container finished" podID="8f81bcfc-3c35-48e8-a584-961351e8c0e2" containerID="2aacfa448de7533468427cf155ae3ec5563cff1d3313d0b6259a3abd6879e336" exitCode=0
Mar 13 14:00:57 crc kubenswrapper[4898]: I0313 14:00:57.466253 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twh8h" event={"ID":"8f81bcfc-3c35-48e8-a584-961351e8c0e2","Type":"ContainerDied","Data":"2aacfa448de7533468427cf155ae3ec5563cff1d3313d0b6259a3abd6879e336"}
Mar 13 14:00:57 crc kubenswrapper[4898]: I0313 14:00:57.469659 4898 generic.go:334] "Generic (PLEG): container finished" podID="a990881e-0caf-4096-a372-4cdad69006c1" containerID="4c76c67750c09f06873437edbd5c079a177fa11ace696be58cb2d354d275db9e" exitCode=0
Mar 13 14:00:57 crc kubenswrapper[4898]: I0313 14:00:57.469693 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ppq6v" event={"ID":"a990881e-0caf-4096-a372-4cdad69006c1","Type":"ContainerDied","Data":"4c76c67750c09f06873437edbd5c079a177fa11ace696be58cb2d354d275db9e"}
Mar 13 14:00:57 crc kubenswrapper[4898]: I0313 14:00:57.472741 4898 generic.go:334] "Generic (PLEG): container finished" podID="b8bc0c30-71e1-41d2-8991-1ce9d85d50a1" containerID="aa63312252f658ceca7c77dd0dfed144856961d5306b45331a156f811c5eef73" exitCode=0
Mar 13 14:00:57 crc kubenswrapper[4898]: I0313 14:00:57.472993 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hn9sl" event={"ID":"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1","Type":"ContainerDied","Data":"aa63312252f658ceca7c77dd0dfed144856961d5306b45331a156f811c5eef73"}
Mar 13 14:00:57 crc kubenswrapper[4898]: I0313 14:00:57.530413 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xh84s"
Mar 13 14:00:57 crc kubenswrapper[4898]: E0313 14:00:57.545557 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod183d86e9_cd5c_45ed_a460_bb6169e07c72.slice/crio-conmon-4e3c2fc49e38fd08a1405311e1eced3f11241c9df7c680ba46b64e3f946fea47.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda990881e_0caf_4096_a372_4cdad69006c1.slice/crio-conmon-4c76c67750c09f06873437edbd5c079a177fa11ace696be58cb2d354d275db9e.scope\": RecentStats: unable to find data in memory cache]"
Mar 13 14:00:58 crc kubenswrapper[4898]: I0313 14:00:58.293097 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xh84s"]
Mar 13 14:00:58 crc kubenswrapper[4898]: I0313 14:00:58.812458 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-djn5q"]
Mar 13 14:00:59 crc kubenswrapper[4898]: I0313 14:00:59.483089 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xh84s" podUID="4ae77efc-55ca-4eee-8817-9c21d0bafa6e" containerName="registry-server" containerID="cri-o://b5e3c3eb492b54ad0a87f08620309d02bb7883094244c1b703d4a244c8429e87" gracePeriod=2
Mar 13 14:01:00 crc kubenswrapper[4898]: I0313 14:01:00.490978 4898 generic.go:334] "Generic (PLEG): container finished" podID="4ae77efc-55ca-4eee-8817-9c21d0bafa6e" containerID="b5e3c3eb492b54ad0a87f08620309d02bb7883094244c1b703d4a244c8429e87" exitCode=0
Mar 13 14:01:00 crc kubenswrapper[4898]: I0313 14:01:00.491036 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xh84s" event={"ID":"4ae77efc-55ca-4eee-8817-9c21d0bafa6e","Type":"ContainerDied","Data":"b5e3c3eb492b54ad0a87f08620309d02bb7883094244c1b703d4a244c8429e87"}
Mar 13 14:01:01 crc kubenswrapper[4898]: I0313 14:01:01.752366 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xh84s"
Mar 13 14:01:01 crc kubenswrapper[4898]: I0313 14:01:01.882512 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ae77efc-55ca-4eee-8817-9c21d0bafa6e-utilities\") pod \"4ae77efc-55ca-4eee-8817-9c21d0bafa6e\" (UID: \"4ae77efc-55ca-4eee-8817-9c21d0bafa6e\") "
Mar 13 14:01:01 crc kubenswrapper[4898]: I0313 14:01:01.882615 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ae77efc-55ca-4eee-8817-9c21d0bafa6e-catalog-content\") pod \"4ae77efc-55ca-4eee-8817-9c21d0bafa6e\" (UID: \"4ae77efc-55ca-4eee-8817-9c21d0bafa6e\") "
Mar 13 14:01:01 crc kubenswrapper[4898]: I0313 14:01:01.882678 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nn9n\" (UniqueName: \"kubernetes.io/projected/4ae77efc-55ca-4eee-8817-9c21d0bafa6e-kube-api-access-5nn9n\") pod \"4ae77efc-55ca-4eee-8817-9c21d0bafa6e\" (UID: \"4ae77efc-55ca-4eee-8817-9c21d0bafa6e\") "
Mar 13 14:01:01 crc kubenswrapper[4898]: I0313 14:01:01.884285 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ae77efc-55ca-4eee-8817-9c21d0bafa6e-utilities" (OuterVolumeSpecName: "utilities") pod "4ae77efc-55ca-4eee-8817-9c21d0bafa6e" (UID: "4ae77efc-55ca-4eee-8817-9c21d0bafa6e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:01:01 crc kubenswrapper[4898]: I0313 14:01:01.888274 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ae77efc-55ca-4eee-8817-9c21d0bafa6e-kube-api-access-5nn9n" (OuterVolumeSpecName: "kube-api-access-5nn9n") pod "4ae77efc-55ca-4eee-8817-9c21d0bafa6e" (UID: "4ae77efc-55ca-4eee-8817-9c21d0bafa6e"). InnerVolumeSpecName "kube-api-access-5nn9n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:01:01 crc kubenswrapper[4898]: I0313 14:01:01.941113 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ae77efc-55ca-4eee-8817-9c21d0bafa6e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4ae77efc-55ca-4eee-8817-9c21d0bafa6e" (UID: "4ae77efc-55ca-4eee-8817-9c21d0bafa6e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:01:01 crc kubenswrapper[4898]: I0313 14:01:01.984111 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ae77efc-55ca-4eee-8817-9c21d0bafa6e-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 14:01:01 crc kubenswrapper[4898]: I0313 14:01:01.984154 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nn9n\" (UniqueName: \"kubernetes.io/projected/4ae77efc-55ca-4eee-8817-9c21d0bafa6e-kube-api-access-5nn9n\") on node \"crc\" DevicePath \"\""
Mar 13 14:01:01 crc kubenswrapper[4898]: I0313 14:01:01.984171 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ae77efc-55ca-4eee-8817-9c21d0bafa6e-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 14:01:02 crc kubenswrapper[4898]: I0313 14:01:02.505126 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xh84s" event={"ID":"4ae77efc-55ca-4eee-8817-9c21d0bafa6e","Type":"ContainerDied","Data":"45fc69d27eaeb1e52f659215a5860c090893736d0fa5aca134749f73422aadc9"}
Mar 13 14:01:02 crc kubenswrapper[4898]: I0313 14:01:02.505180 4898 scope.go:117] "RemoveContainer" containerID="b5e3c3eb492b54ad0a87f08620309d02bb7883094244c1b703d4a244c8429e87"
Mar 13 14:01:02 crc kubenswrapper[4898]: I0313 14:01:02.505275 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xh84s"
Mar 13 14:01:02 crc kubenswrapper[4898]: I0313 14:01:02.508547 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-974qp" event={"ID":"183d86e9-cd5c-45ed-a460-bb6169e07c72","Type":"ContainerStarted","Data":"b28ca2f5572caf9aa06fca178d1a31d55764b021494704172c96d7af68b09635"}
Mar 13 14:01:02 crc kubenswrapper[4898]: I0313 14:01:02.534023 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xh84s"]
Mar 13 14:01:02 crc kubenswrapper[4898]: I0313 14:01:02.538797 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xh84s"]
Mar 13 14:01:02 crc kubenswrapper[4898]: I0313 14:01:02.885263 4898 scope.go:117] "RemoveContainer" containerID="804bb12e6a7adbbf4efa1f3ec85b57e7a62babe514eec20e4355f14bea3bdf4a"
Mar 13 14:01:03 crc kubenswrapper[4898]: I0313 14:01:03.084060 4898 scope.go:117] "RemoveContainer" containerID="359074a54fdd2abe01e5471c8009872f5ca05eb132b157ad005435e3bc55c0f9"
Mar 13 14:01:03 crc kubenswrapper[4898]: I0313 14:01:03.531331 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-974qp" podStartSLOduration=42.014150816 podStartE2EDuration="1m4.531315054s" podCreationTimestamp="2026-03-13 13:59:59 +0000 UTC" firstStartedPulling="2026-03-13 14:00:39.103100053 +0000 UTC m=+274.104688292" lastFinishedPulling="2026-03-13 14:01:01.620264291 +0000 UTC m=+296.621852530" observedRunningTime="2026-03-13 14:01:03.52948117 +0000 UTC m=+298.531069409" watchObservedRunningTime="2026-03-13 14:01:03.531315054 +0000 UTC m=+298.532903293"
Mar 13 14:01:03 crc kubenswrapper[4898]: I0313 14:01:03.746494 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ae77efc-55ca-4eee-8817-9c21d0bafa6e" path="/var/lib/kubelet/pods/4ae77efc-55ca-4eee-8817-9c21d0bafa6e/volumes"
Mar 13 14:01:04 crc kubenswrapper[4898]: I0313 14:01:04.521886 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ppq6v" event={"ID":"a990881e-0caf-4096-a372-4cdad69006c1","Type":"ContainerStarted","Data":"34f99ef9389fe4ac9ebfa6a1f58e54f8c62e5652260b00330822919d355fffdc"}
Mar 13 14:01:04 crc kubenswrapper[4898]: I0313 14:01:04.526185 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hn9sl" event={"ID":"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1","Type":"ContainerStarted","Data":"34eef6e95bd6c6e5a38318f6b3a75e9c8807c801636ab61d5466b4cf0037730b"}
Mar 13 14:01:04 crc kubenswrapper[4898]: I0313 14:01:04.528339 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-btkxt" event={"ID":"7794a943-5fec-485e-86bf-f104ed6ae070","Type":"ContainerStarted","Data":"72013575671e67bc50654bdc19c7f86358a2ef9f58c688c055736ec45a15a182"}
Mar 13 14:01:04 crc kubenswrapper[4898]: I0313 14:01:04.530329 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twh8h" event={"ID":"8f81bcfc-3c35-48e8-a584-961351e8c0e2","Type":"ContainerStarted","Data":"c0d126f66fb80fd38ad4cce383bbe14103ead798e0605b7596e1d4e7e5d8dd4c"}
Mar 13 14:01:04 crc kubenswrapper[4898]: I0313 14:01:04.544948 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ppq6v" podStartSLOduration=2.181939927 podStartE2EDuration="1m8.544929887s" podCreationTimestamp="2026-03-13 13:59:56 +0000 UTC" firstStartedPulling="2026-03-13 13:59:57.718445189 +0000 UTC m=+232.720033428" lastFinishedPulling="2026-03-13 14:01:04.081435149 +0000 UTC m=+299.083023388" observedRunningTime="2026-03-13 14:01:04.541626728 +0000 UTC m=+299.543214987" watchObservedRunningTime="2026-03-13 14:01:04.544929887 +0000 UTC m=+299.546518136"
Mar 13 14:01:04 crc kubenswrapper[4898]: I0313 14:01:04.556974 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-btkxt" podStartSLOduration=40.991937029 podStartE2EDuration="1m5.556958474s" podCreationTimestamp="2026-03-13 13:59:59 +0000 UTC" firstStartedPulling="2026-03-13 14:00:39.103140254 +0000 UTC m=+274.104728523" lastFinishedPulling="2026-03-13 14:01:03.668161729 +0000 UTC m=+298.669749968" observedRunningTime="2026-03-13 14:01:04.555323054 +0000 UTC m=+299.556911293" watchObservedRunningTime="2026-03-13 14:01:04.556958474 +0000 UTC m=+299.558546713"
Mar 13 14:01:04 crc kubenswrapper[4898]: I0313 14:01:04.582205 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-twh8h" podStartSLOduration=3.712099006 podStartE2EDuration="1m9.582185555s" podCreationTimestamp="2026-03-13 13:59:55 +0000 UTC" firstStartedPulling="2026-03-13 13:59:57.705931321 +0000 UTC m=+232.707519560" lastFinishedPulling="2026-03-13 14:01:03.57601787 +0000 UTC m=+298.577606109" observedRunningTime="2026-03-13 14:01:04.580959376 +0000 UTC m=+299.582547635" watchObservedRunningTime="2026-03-13 14:01:04.582185555 +0000 UTC m=+299.583773784"
Mar 13 14:01:06 crc kubenswrapper[4898]: I0313 14:01:06.236960 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-twh8h"
Mar 13 14:01:06 crc kubenswrapper[4898]: I0313 14:01:06.237032 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-twh8h"
Mar 13 14:01:06 crc kubenswrapper[4898]: I0313 14:01:06.284343 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-twh8h"
Mar 13 14:01:06 crc kubenswrapper[4898]: I0313 14:01:06.310647 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hn9sl" podStartSLOduration=4.545949332 podStartE2EDuration="1m8.310625312s" podCreationTimestamp="2026-03-13 13:59:58 +0000 UTC" firstStartedPulling="2026-03-13 13:59:59.783067736 +0000 UTC m=+234.784655975" lastFinishedPulling="2026-03-13 14:01:03.547743716 +0000 UTC m=+298.549331955" observedRunningTime="2026-03-13 14:01:04.613515113 +0000 UTC m=+299.615103372" watchObservedRunningTime="2026-03-13 14:01:06.310625312 +0000 UTC m=+301.312213551"
Mar 13 14:01:06 crc kubenswrapper[4898]: I0313 14:01:06.441804 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dvvz2"
Mar 13 14:01:06 crc kubenswrapper[4898]: I0313 14:01:06.442545 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dvvz2"
Mar 13 14:01:06 crc kubenswrapper[4898]: I0313 14:01:06.489705 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dvvz2"
Mar 13 14:01:06 crc kubenswrapper[4898]: I0313 14:01:06.615086 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ppq6v"
Mar 13 14:01:06 crc kubenswrapper[4898]: I0313 14:01:06.615144 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ppq6v"
Mar 13 14:01:06 crc kubenswrapper[4898]: I0313 14:01:06.616402 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dvvz2"
Mar 13 14:01:06 crc kubenswrapper[4898]: I0313 14:01:06.663617 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ppq6v"
Mar 13 14:01:08 crc kubenswrapper[4898]: I0313 14:01:08.838974 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hn9sl"
Mar 13 14:01:08 crc kubenswrapper[4898]: I0313 14:01:08.839059 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hn9sl"
Mar 13 14:01:08 crc kubenswrapper[4898]: I0313 14:01:08.904505 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hn9sl"
Mar 13 14:01:09 crc kubenswrapper[4898]: I0313 14:01:09.408024 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-974qp"
Mar 13 14:01:09 crc kubenswrapper[4898]: I0313 14:01:09.408105 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-974qp"
Mar 13 14:01:09 crc kubenswrapper[4898]: I0313 14:01:09.620479 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hn9sl"
Mar 13 14:01:09 crc kubenswrapper[4898]: I0313 14:01:09.813512 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-btkxt"
Mar 13 14:01:09 crc kubenswrapper[4898]: I0313 14:01:09.813985 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-btkxt"
Mar 13 14:01:10 crc kubenswrapper[4898]: I0313 14:01:10.465159 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-974qp" podUID="183d86e9-cd5c-45ed-a460-bb6169e07c72" containerName="registry-server" probeResult="failure" output=<
Mar 13 14:01:10 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s
Mar 13 14:01:10 crc kubenswrapper[4898]: >
Mar 13 14:01:10 crc kubenswrapper[4898]: I0313 14:01:10.866978 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-btkxt" podUID="7794a943-5fec-485e-86bf-f104ed6ae070" containerName="registry-server" probeResult="failure" output=<
Mar 13 14:01:10 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s
Mar 13 14:01:10 crc kubenswrapper[4898]: >
Mar 13 14:01:11 crc kubenswrapper[4898]: I0313 14:01:11.896391 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hn9sl"]
Mar 13 14:01:11 crc kubenswrapper[4898]: I0313 14:01:11.898722 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hn9sl" podUID="b8bc0c30-71e1-41d2-8991-1ce9d85d50a1" containerName="registry-server" containerID="cri-o://34eef6e95bd6c6e5a38318f6b3a75e9c8807c801636ab61d5466b4cf0037730b" gracePeriod=2
Mar 13 14:01:12 crc kubenswrapper[4898]: I0313 14:01:12.925306 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-67c9866bb9-9276f"]
Mar 13 14:01:12 crc kubenswrapper[4898]: I0313 14:01:12.925693 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" podUID="d57ce87c-0f10-48ca-b5ee-09b139d9ebd0" containerName="controller-manager" containerID="cri-o://d5ad3dcf9d418253bf096607969322ddb59d415decdc8a397e768a53284c42b7" gracePeriod=30
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.020841 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b"]
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.021348 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" podUID="48b4d86a-9b94-4913-a35f-fd5e449ca40b" containerName="route-controller-manager" containerID="cri-o://22bb704ae349a00bd3ff54a939a69467adddc106eb993e03597abddaa460ee01" gracePeriod=30
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.433178 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b"
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.476065 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f"
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.518531 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hn9sl"
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.561094 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48b4d86a-9b94-4913-a35f-fd5e449ca40b-config\") pod \"48b4d86a-9b94-4913-a35f-fd5e449ca40b\" (UID: \"48b4d86a-9b94-4913-a35f-fd5e449ca40b\") "
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.561150 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48b4d86a-9b94-4913-a35f-fd5e449ca40b-serving-cert\") pod \"48b4d86a-9b94-4913-a35f-fd5e449ca40b\" (UID: \"48b4d86a-9b94-4913-a35f-fd5e449ca40b\") "
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.561177 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-serving-cert\") pod \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\" (UID: \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\") "
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.561263 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-client-ca\") pod \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\" (UID: \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\") "
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.561307 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnwjn\" (UniqueName: \"kubernetes.io/projected/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-kube-api-access-mnwjn\") pod \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\" (UID: \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\") "
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.561335 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsdjp\" (UniqueName: \"kubernetes.io/projected/48b4d86a-9b94-4913-a35f-fd5e449ca40b-kube-api-access-gsdjp\") pod \"48b4d86a-9b94-4913-a35f-fd5e449ca40b\" (UID: \"48b4d86a-9b94-4913-a35f-fd5e449ca40b\") "
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.561374 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22rm7\" (UniqueName: \"kubernetes.io/projected/b8bc0c30-71e1-41d2-8991-1ce9d85d50a1-kube-api-access-22rm7\") pod \"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1\" (UID: \"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1\") "
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.561398 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-config\") pod \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\" (UID: \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\") "
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.561420 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-proxy-ca-bundles\") pod \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\" (UID: \"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0\") "
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.561443 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/48b4d86a-9b94-4913-a35f-fd5e449ca40b-client-ca\") pod \"48b4d86a-9b94-4913-a35f-fd5e449ca40b\" (UID: \"48b4d86a-9b94-4913-a35f-fd5e449ca40b\") "
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.562261 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48b4d86a-9b94-4913-a35f-fd5e449ca40b-client-ca" (OuterVolumeSpecName: "client-ca") pod "48b4d86a-9b94-4913-a35f-fd5e449ca40b" (UID: "48b4d86a-9b94-4913-a35f-fd5e449ca40b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.562895 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-config" (OuterVolumeSpecName: "config") pod "d57ce87c-0f10-48ca-b5ee-09b139d9ebd0" (UID: "d57ce87c-0f10-48ca-b5ee-09b139d9ebd0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.563041 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d57ce87c-0f10-48ca-b5ee-09b139d9ebd0" (UID: "d57ce87c-0f10-48ca-b5ee-09b139d9ebd0"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.563240 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48b4d86a-9b94-4913-a35f-fd5e449ca40b-config" (OuterVolumeSpecName: "config") pod "48b4d86a-9b94-4913-a35f-fd5e449ca40b" (UID: "48b4d86a-9b94-4913-a35f-fd5e449ca40b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.563713 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-client-ca" (OuterVolumeSpecName: "client-ca") pod "d57ce87c-0f10-48ca-b5ee-09b139d9ebd0" (UID: "d57ce87c-0f10-48ca-b5ee-09b139d9ebd0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.566573 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48b4d86a-9b94-4913-a35f-fd5e449ca40b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "48b4d86a-9b94-4913-a35f-fd5e449ca40b" (UID: "48b4d86a-9b94-4913-a35f-fd5e449ca40b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.566600 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d57ce87c-0f10-48ca-b5ee-09b139d9ebd0" (UID: "d57ce87c-0f10-48ca-b5ee-09b139d9ebd0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.566599 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48b4d86a-9b94-4913-a35f-fd5e449ca40b-kube-api-access-gsdjp" (OuterVolumeSpecName: "kube-api-access-gsdjp") pod "48b4d86a-9b94-4913-a35f-fd5e449ca40b" (UID: "48b4d86a-9b94-4913-a35f-fd5e449ca40b"). InnerVolumeSpecName "kube-api-access-gsdjp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.566836 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8bc0c30-71e1-41d2-8991-1ce9d85d50a1-kube-api-access-22rm7" (OuterVolumeSpecName: "kube-api-access-22rm7") pod "b8bc0c30-71e1-41d2-8991-1ce9d85d50a1" (UID: "b8bc0c30-71e1-41d2-8991-1ce9d85d50a1"). InnerVolumeSpecName "kube-api-access-22rm7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.566915 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-kube-api-access-mnwjn" (OuterVolumeSpecName: "kube-api-access-mnwjn") pod "d57ce87c-0f10-48ca-b5ee-09b139d9ebd0" (UID: "d57ce87c-0f10-48ca-b5ee-09b139d9ebd0"). InnerVolumeSpecName "kube-api-access-mnwjn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.590767 4898 generic.go:334] "Generic (PLEG): container finished" podID="48b4d86a-9b94-4913-a35f-fd5e449ca40b" containerID="22bb704ae349a00bd3ff54a939a69467adddc106eb993e03597abddaa460ee01" exitCode=0
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.590801 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b"
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.590831 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" event={"ID":"48b4d86a-9b94-4913-a35f-fd5e449ca40b","Type":"ContainerDied","Data":"22bb704ae349a00bd3ff54a939a69467adddc106eb993e03597abddaa460ee01"}
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.590854 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b" event={"ID":"48b4d86a-9b94-4913-a35f-fd5e449ca40b","Type":"ContainerDied","Data":"13758db88f06f5e481700996753a013ebf1c45a10dc6dbacc0a63821fd99241f"}
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.590870 4898 scope.go:117] "RemoveContainer" containerID="22bb704ae349a00bd3ff54a939a69467adddc106eb993e03597abddaa460ee01"
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.594237 4898 generic.go:334] "Generic (PLEG): container finished" podID="b8bc0c30-71e1-41d2-8991-1ce9d85d50a1" containerID="34eef6e95bd6c6e5a38318f6b3a75e9c8807c801636ab61d5466b4cf0037730b" exitCode=0
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.594276 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hn9sl"
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.594300 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hn9sl" event={"ID":"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1","Type":"ContainerDied","Data":"34eef6e95bd6c6e5a38318f6b3a75e9c8807c801636ab61d5466b4cf0037730b"}
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.594323 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hn9sl" event={"ID":"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1","Type":"ContainerDied","Data":"0b8d238e1855df1df599d5c20b2f8c47368ca041ea02bd9d799ff8595124e451"}
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.595476 4898 generic.go:334] "Generic (PLEG): container finished" podID="d57ce87c-0f10-48ca-b5ee-09b139d9ebd0" containerID="d5ad3dcf9d418253bf096607969322ddb59d415decdc8a397e768a53284c42b7" exitCode=0
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.595566 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" event={"ID":"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0","Type":"ContainerDied","Data":"d5ad3dcf9d418253bf096607969322ddb59d415decdc8a397e768a53284c42b7"}
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.595749 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" event={"ID":"d57ce87c-0f10-48ca-b5ee-09b139d9ebd0","Type":"ContainerDied","Data":"586304a4e2dd4eea4580413987b917096a6c4f3326c8482af26767ff95bd2378"}
Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.595662 4898 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-controller-manager/controller-manager-67c9866bb9-9276f" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.607401 4898 scope.go:117] "RemoveContainer" containerID="22bb704ae349a00bd3ff54a939a69467adddc106eb993e03597abddaa460ee01" Mar 13 14:01:13 crc kubenswrapper[4898]: E0313 14:01:13.608130 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22bb704ae349a00bd3ff54a939a69467adddc106eb993e03597abddaa460ee01\": container with ID starting with 22bb704ae349a00bd3ff54a939a69467adddc106eb993e03597abddaa460ee01 not found: ID does not exist" containerID="22bb704ae349a00bd3ff54a939a69467adddc106eb993e03597abddaa460ee01" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.608176 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22bb704ae349a00bd3ff54a939a69467adddc106eb993e03597abddaa460ee01"} err="failed to get container status \"22bb704ae349a00bd3ff54a939a69467adddc106eb993e03597abddaa460ee01\": rpc error: code = NotFound desc = could not find container \"22bb704ae349a00bd3ff54a939a69467adddc106eb993e03597abddaa460ee01\": container with ID starting with 22bb704ae349a00bd3ff54a939a69467adddc106eb993e03597abddaa460ee01 not found: ID does not exist" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.608205 4898 scope.go:117] "RemoveContainer" containerID="34eef6e95bd6c6e5a38318f6b3a75e9c8807c801636ab61d5466b4cf0037730b" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.621347 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b"] Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.625707 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bcbd84b47-vdd2b"] Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.626462 4898 scope.go:117] 
"RemoveContainer" containerID="aa63312252f658ceca7c77dd0dfed144856961d5306b45331a156f811c5eef73" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.629740 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-67c9866bb9-9276f"] Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.632992 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-67c9866bb9-9276f"] Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.638912 4898 scope.go:117] "RemoveContainer" containerID="2528aa8f68ede75859cc4272c7396a63edf906831259593563c01d94a61ac7d1" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.650672 4898 scope.go:117] "RemoveContainer" containerID="34eef6e95bd6c6e5a38318f6b3a75e9c8807c801636ab61d5466b4cf0037730b" Mar 13 14:01:13 crc kubenswrapper[4898]: E0313 14:01:13.651009 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34eef6e95bd6c6e5a38318f6b3a75e9c8807c801636ab61d5466b4cf0037730b\": container with ID starting with 34eef6e95bd6c6e5a38318f6b3a75e9c8807c801636ab61d5466b4cf0037730b not found: ID does not exist" containerID="34eef6e95bd6c6e5a38318f6b3a75e9c8807c801636ab61d5466b4cf0037730b" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.651046 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34eef6e95bd6c6e5a38318f6b3a75e9c8807c801636ab61d5466b4cf0037730b"} err="failed to get container status \"34eef6e95bd6c6e5a38318f6b3a75e9c8807c801636ab61d5466b4cf0037730b\": rpc error: code = NotFound desc = could not find container \"34eef6e95bd6c6e5a38318f6b3a75e9c8807c801636ab61d5466b4cf0037730b\": container with ID starting with 34eef6e95bd6c6e5a38318f6b3a75e9c8807c801636ab61d5466b4cf0037730b not found: ID does not exist" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.651078 4898 scope.go:117] "RemoveContainer" 
containerID="aa63312252f658ceca7c77dd0dfed144856961d5306b45331a156f811c5eef73" Mar 13 14:01:13 crc kubenswrapper[4898]: E0313 14:01:13.651339 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa63312252f658ceca7c77dd0dfed144856961d5306b45331a156f811c5eef73\": container with ID starting with aa63312252f658ceca7c77dd0dfed144856961d5306b45331a156f811c5eef73 not found: ID does not exist" containerID="aa63312252f658ceca7c77dd0dfed144856961d5306b45331a156f811c5eef73" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.651370 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa63312252f658ceca7c77dd0dfed144856961d5306b45331a156f811c5eef73"} err="failed to get container status \"aa63312252f658ceca7c77dd0dfed144856961d5306b45331a156f811c5eef73\": rpc error: code = NotFound desc = could not find container \"aa63312252f658ceca7c77dd0dfed144856961d5306b45331a156f811c5eef73\": container with ID starting with aa63312252f658ceca7c77dd0dfed144856961d5306b45331a156f811c5eef73 not found: ID does not exist" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.651388 4898 scope.go:117] "RemoveContainer" containerID="2528aa8f68ede75859cc4272c7396a63edf906831259593563c01d94a61ac7d1" Mar 13 14:01:13 crc kubenswrapper[4898]: E0313 14:01:13.651829 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2528aa8f68ede75859cc4272c7396a63edf906831259593563c01d94a61ac7d1\": container with ID starting with 2528aa8f68ede75859cc4272c7396a63edf906831259593563c01d94a61ac7d1 not found: ID does not exist" containerID="2528aa8f68ede75859cc4272c7396a63edf906831259593563c01d94a61ac7d1" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.651863 4898 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2528aa8f68ede75859cc4272c7396a63edf906831259593563c01d94a61ac7d1"} err="failed to get container status \"2528aa8f68ede75859cc4272c7396a63edf906831259593563c01d94a61ac7d1\": rpc error: code = NotFound desc = could not find container \"2528aa8f68ede75859cc4272c7396a63edf906831259593563c01d94a61ac7d1\": container with ID starting with 2528aa8f68ede75859cc4272c7396a63edf906831259593563c01d94a61ac7d1 not found: ID does not exist" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.651886 4898 scope.go:117] "RemoveContainer" containerID="d5ad3dcf9d418253bf096607969322ddb59d415decdc8a397e768a53284c42b7" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.662204 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8bc0c30-71e1-41d2-8991-1ce9d85d50a1-catalog-content\") pod \"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1\" (UID: \"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1\") " Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.662321 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8bc0c30-71e1-41d2-8991-1ce9d85d50a1-utilities\") pod \"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1\" (UID: \"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1\") " Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.662510 4898 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.662528 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnwjn\" (UniqueName: \"kubernetes.io/projected/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-kube-api-access-mnwjn\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.662537 4898 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-gsdjp\" (UniqueName: \"kubernetes.io/projected/48b4d86a-9b94-4913-a35f-fd5e449ca40b-kube-api-access-gsdjp\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.662547 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22rm7\" (UniqueName: \"kubernetes.io/projected/b8bc0c30-71e1-41d2-8991-1ce9d85d50a1-kube-api-access-22rm7\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.662556 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.662563 4898 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.662571 4898 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/48b4d86a-9b94-4913-a35f-fd5e449ca40b-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.662581 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48b4d86a-9b94-4913-a35f-fd5e449ca40b-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.662589 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48b4d86a-9b94-4913-a35f-fd5e449ca40b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.662596 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0-serving-cert\") on node \"crc\" DevicePath 
\"\"" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.663314 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8bc0c30-71e1-41d2-8991-1ce9d85d50a1-utilities" (OuterVolumeSpecName: "utilities") pod "b8bc0c30-71e1-41d2-8991-1ce9d85d50a1" (UID: "b8bc0c30-71e1-41d2-8991-1ce9d85d50a1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.663366 4898 scope.go:117] "RemoveContainer" containerID="d5ad3dcf9d418253bf096607969322ddb59d415decdc8a397e768a53284c42b7" Mar 13 14:01:13 crc kubenswrapper[4898]: E0313 14:01:13.664032 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5ad3dcf9d418253bf096607969322ddb59d415decdc8a397e768a53284c42b7\": container with ID starting with d5ad3dcf9d418253bf096607969322ddb59d415decdc8a397e768a53284c42b7 not found: ID does not exist" containerID="d5ad3dcf9d418253bf096607969322ddb59d415decdc8a397e768a53284c42b7" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.664071 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5ad3dcf9d418253bf096607969322ddb59d415decdc8a397e768a53284c42b7"} err="failed to get container status \"d5ad3dcf9d418253bf096607969322ddb59d415decdc8a397e768a53284c42b7\": rpc error: code = NotFound desc = could not find container \"d5ad3dcf9d418253bf096607969322ddb59d415decdc8a397e768a53284c42b7\": container with ID starting with d5ad3dcf9d418253bf096607969322ddb59d415decdc8a397e768a53284c42b7 not found: ID does not exist" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.684393 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8bc0c30-71e1-41d2-8991-1ce9d85d50a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b8bc0c30-71e1-41d2-8991-1ce9d85d50a1" (UID: 
"b8bc0c30-71e1-41d2-8991-1ce9d85d50a1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.749944 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48b4d86a-9b94-4913-a35f-fd5e449ca40b" path="/var/lib/kubelet/pods/48b4d86a-9b94-4913-a35f-fd5e449ca40b/volumes" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.751251 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d57ce87c-0f10-48ca-b5ee-09b139d9ebd0" path="/var/lib/kubelet/pods/d57ce87c-0f10-48ca-b5ee-09b139d9ebd0/volumes" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.764536 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8bc0c30-71e1-41d2-8991-1ce9d85d50a1-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.764599 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8bc0c30-71e1-41d2-8991-1ce9d85d50a1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.920145 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hn9sl"] Mar 13 14:01:13 crc kubenswrapper[4898]: I0313 14:01:13.927809 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hn9sl"] Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.830269 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf"] Mar 13 14:01:14 crc kubenswrapper[4898]: E0313 14:01:14.831287 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ae77efc-55ca-4eee-8817-9c21d0bafa6e" containerName="registry-server" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.831301 4898 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="4ae77efc-55ca-4eee-8817-9c21d0bafa6e" containerName="registry-server" Mar 13 14:01:14 crc kubenswrapper[4898]: E0313 14:01:14.831320 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce" containerName="oc" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.831327 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce" containerName="oc" Mar 13 14:01:14 crc kubenswrapper[4898]: E0313 14:01:14.831338 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ae77efc-55ca-4eee-8817-9c21d0bafa6e" containerName="extract-content" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.831345 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae77efc-55ca-4eee-8817-9c21d0bafa6e" containerName="extract-content" Mar 13 14:01:14 crc kubenswrapper[4898]: E0313 14:01:14.831356 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8bc0c30-71e1-41d2-8991-1ce9d85d50a1" containerName="extract-utilities" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.831362 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8bc0c30-71e1-41d2-8991-1ce9d85d50a1" containerName="extract-utilities" Mar 13 14:01:14 crc kubenswrapper[4898]: E0313 14:01:14.831370 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d57ce87c-0f10-48ca-b5ee-09b139d9ebd0" containerName="controller-manager" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.831376 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="d57ce87c-0f10-48ca-b5ee-09b139d9ebd0" containerName="controller-manager" Mar 13 14:01:14 crc kubenswrapper[4898]: E0313 14:01:14.831383 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa1ed4c8-e4bd-4352-bee3-404f16244ea3" containerName="oc" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.831388 4898 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="aa1ed4c8-e4bd-4352-bee3-404f16244ea3" containerName="oc" Mar 13 14:01:14 crc kubenswrapper[4898]: E0313 14:01:14.831397 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ae77efc-55ca-4eee-8817-9c21d0bafa6e" containerName="extract-utilities" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.831403 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae77efc-55ca-4eee-8817-9c21d0bafa6e" containerName="extract-utilities" Mar 13 14:01:14 crc kubenswrapper[4898]: E0313 14:01:14.831410 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8bc0c30-71e1-41d2-8991-1ce9d85d50a1" containerName="extract-content" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.831416 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8bc0c30-71e1-41d2-8991-1ce9d85d50a1" containerName="extract-content" Mar 13 14:01:14 crc kubenswrapper[4898]: E0313 14:01:14.831430 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8bc0c30-71e1-41d2-8991-1ce9d85d50a1" containerName="registry-server" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.831436 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8bc0c30-71e1-41d2-8991-1ce9d85d50a1" containerName="registry-server" Mar 13 14:01:14 crc kubenswrapper[4898]: E0313 14:01:14.831444 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48b4d86a-9b94-4913-a35f-fd5e449ca40b" containerName="route-controller-manager" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.831450 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="48b4d86a-9b94-4913-a35f-fd5e449ca40b" containerName="route-controller-manager" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.831542 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8bc0c30-71e1-41d2-8991-1ce9d85d50a1" containerName="registry-server" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.831555 4898 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="4ae77efc-55ca-4eee-8817-9c21d0bafa6e" containerName="registry-server" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.831566 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="d57ce87c-0f10-48ca-b5ee-09b139d9ebd0" containerName="controller-manager" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.831574 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="48b4d86a-9b94-4913-a35f-fd5e449ca40b" containerName="route-controller-manager" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.831582 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce" containerName="oc" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.831590 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa1ed4c8-e4bd-4352-bee3-404f16244ea3" containerName="oc" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.832045 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.834046 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w"] Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.834649 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.835212 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.836357 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.836542 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.837749 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.838069 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.838316 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.838479 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.838639 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.839435 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.841269 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.841505 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w"] Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.843026 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf"] Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.844418 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.844695 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.845063 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.881768 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0561a31b-c67c-4410-8845-d47e4533be0a-config\") pod \"route-controller-manager-76556fcf77-lhw2w\" (UID: \"0561a31b-c67c-4410-8845-d47e4533be0a\") " pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.881853 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-config\") pod \"controller-manager-db7fb4ff9-qlmhf\" (UID: \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\") " pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.881981 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h6km\" (UniqueName: \"kubernetes.io/projected/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-kube-api-access-9h6km\") pod 
\"controller-manager-db7fb4ff9-qlmhf\" (UID: \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\") " pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.882012 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0561a31b-c67c-4410-8845-d47e4533be0a-client-ca\") pod \"route-controller-manager-76556fcf77-lhw2w\" (UID: \"0561a31b-c67c-4410-8845-d47e4533be0a\") " pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.882040 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6mvk\" (UniqueName: \"kubernetes.io/projected/0561a31b-c67c-4410-8845-d47e4533be0a-kube-api-access-h6mvk\") pod \"route-controller-manager-76556fcf77-lhw2w\" (UID: \"0561a31b-c67c-4410-8845-d47e4533be0a\") " pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.882061 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-serving-cert\") pod \"controller-manager-db7fb4ff9-qlmhf\" (UID: \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\") " pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.882095 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-client-ca\") pod \"controller-manager-db7fb4ff9-qlmhf\" (UID: \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\") " pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 
14:01:14.882121 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0561a31b-c67c-4410-8845-d47e4533be0a-serving-cert\") pod \"route-controller-manager-76556fcf77-lhw2w\" (UID: \"0561a31b-c67c-4410-8845-d47e4533be0a\") " pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.882138 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-proxy-ca-bundles\") pod \"controller-manager-db7fb4ff9-qlmhf\" (UID: \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\") " pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.983386 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-config\") pod \"controller-manager-db7fb4ff9-qlmhf\" (UID: \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\") " pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.983439 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h6km\" (UniqueName: \"kubernetes.io/projected/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-kube-api-access-9h6km\") pod \"controller-manager-db7fb4ff9-qlmhf\" (UID: \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\") " pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.983461 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0561a31b-c67c-4410-8845-d47e4533be0a-client-ca\") pod \"route-controller-manager-76556fcf77-lhw2w\" (UID: 
\"0561a31b-c67c-4410-8845-d47e4533be0a\") " pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.983484 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6mvk\" (UniqueName: \"kubernetes.io/projected/0561a31b-c67c-4410-8845-d47e4533be0a-kube-api-access-h6mvk\") pod \"route-controller-manager-76556fcf77-lhw2w\" (UID: \"0561a31b-c67c-4410-8845-d47e4533be0a\") " pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.983504 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-serving-cert\") pod \"controller-manager-db7fb4ff9-qlmhf\" (UID: \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\") " pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.983532 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-client-ca\") pod \"controller-manager-db7fb4ff9-qlmhf\" (UID: \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\") " pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.983554 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0561a31b-c67c-4410-8845-d47e4533be0a-serving-cert\") pod \"route-controller-manager-76556fcf77-lhw2w\" (UID: \"0561a31b-c67c-4410-8845-d47e4533be0a\") " pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.983569 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-proxy-ca-bundles\") pod \"controller-manager-db7fb4ff9-qlmhf\" (UID: \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\") " pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.983601 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0561a31b-c67c-4410-8845-d47e4533be0a-config\") pod \"route-controller-manager-76556fcf77-lhw2w\" (UID: \"0561a31b-c67c-4410-8845-d47e4533be0a\") " pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.984687 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0561a31b-c67c-4410-8845-d47e4533be0a-client-ca\") pod \"route-controller-manager-76556fcf77-lhw2w\" (UID: \"0561a31b-c67c-4410-8845-d47e4533be0a\") " pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.985074 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0561a31b-c67c-4410-8845-d47e4533be0a-config\") pod \"route-controller-manager-76556fcf77-lhw2w\" (UID: \"0561a31b-c67c-4410-8845-d47e4533be0a\") " pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.985104 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-config\") pod \"controller-manager-db7fb4ff9-qlmhf\" (UID: \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\") " pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 
14:01:14.986073 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-proxy-ca-bundles\") pod \"controller-manager-db7fb4ff9-qlmhf\" (UID: \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\") " pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.989295 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-client-ca\") pod \"controller-manager-db7fb4ff9-qlmhf\" (UID: \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\") " pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.991659 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-serving-cert\") pod \"controller-manager-db7fb4ff9-qlmhf\" (UID: \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\") " pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.996919 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0561a31b-c67c-4410-8845-d47e4533be0a-serving-cert\") pod \"route-controller-manager-76556fcf77-lhw2w\" (UID: \"0561a31b-c67c-4410-8845-d47e4533be0a\") " pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w" Mar 13 14:01:14 crc kubenswrapper[4898]: I0313 14:01:14.999784 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h6km\" (UniqueName: \"kubernetes.io/projected/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-kube-api-access-9h6km\") pod \"controller-manager-db7fb4ff9-qlmhf\" (UID: \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\") " 
pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf" Mar 13 14:01:15 crc kubenswrapper[4898]: I0313 14:01:15.017007 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6mvk\" (UniqueName: \"kubernetes.io/projected/0561a31b-c67c-4410-8845-d47e4533be0a-kube-api-access-h6mvk\") pod \"route-controller-manager-76556fcf77-lhw2w\" (UID: \"0561a31b-c67c-4410-8845-d47e4533be0a\") " pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w" Mar 13 14:01:15 crc kubenswrapper[4898]: I0313 14:01:15.201828 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf" Mar 13 14:01:15 crc kubenswrapper[4898]: I0313 14:01:15.218254 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w" Mar 13 14:01:15 crc kubenswrapper[4898]: I0313 14:01:15.466859 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf"] Mar 13 14:01:15 crc kubenswrapper[4898]: W0313 14:01:15.473069 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a72588b_5ed5_4ab7_bbe0_c0b6e08eedbf.slice/crio-ba22e3087795c8be037e21c8d47195aba889300c63f1adc91b0790f8732ad1d9 WatchSource:0}: Error finding container ba22e3087795c8be037e21c8d47195aba889300c63f1adc91b0790f8732ad1d9: Status 404 returned error can't find the container with id ba22e3087795c8be037e21c8d47195aba889300c63f1adc91b0790f8732ad1d9 Mar 13 14:01:15 crc kubenswrapper[4898]: I0313 14:01:15.509311 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w"] Mar 13 14:01:15 crc kubenswrapper[4898]: I0313 14:01:15.616346 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w" event={"ID":"0561a31b-c67c-4410-8845-d47e4533be0a","Type":"ContainerStarted","Data":"3f35c6846af9efc90831b47db53412c7613bf81bc4a1bb77055f1c9c4645cef8"} Mar 13 14:01:15 crc kubenswrapper[4898]: I0313 14:01:15.618743 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf" event={"ID":"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf","Type":"ContainerStarted","Data":"ba22e3087795c8be037e21c8d47195aba889300c63f1adc91b0790f8732ad1d9"} Mar 13 14:01:15 crc kubenswrapper[4898]: I0313 14:01:15.749411 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8bc0c30-71e1-41d2-8991-1ce9d85d50a1" path="/var/lib/kubelet/pods/b8bc0c30-71e1-41d2-8991-1ce9d85d50a1/volumes" Mar 13 14:01:16 crc kubenswrapper[4898]: I0313 14:01:16.286564 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-twh8h" Mar 13 14:01:16 crc kubenswrapper[4898]: I0313 14:01:16.629711 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w" event={"ID":"0561a31b-c67c-4410-8845-d47e4533be0a","Type":"ContainerStarted","Data":"11185c5dd36acbad20bedb23b2365654ca410b78e2e18b38a6ef92209812ef97"} Mar 13 14:01:16 crc kubenswrapper[4898]: I0313 14:01:16.630353 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w" Mar 13 14:01:16 crc kubenswrapper[4898]: I0313 14:01:16.631383 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf" event={"ID":"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf","Type":"ContainerStarted","Data":"1c8b4c66a938ec22ac3bf1cf597a602d6bd12a58cd3600512849d5a7fae14989"} Mar 13 14:01:16 crc kubenswrapper[4898]: I0313 14:01:16.631672 4898 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf" Mar 13 14:01:16 crc kubenswrapper[4898]: I0313 14:01:16.639213 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w" Mar 13 14:01:16 crc kubenswrapper[4898]: I0313 14:01:16.640683 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf" Mar 13 14:01:16 crc kubenswrapper[4898]: I0313 14:01:16.655553 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w" podStartSLOduration=3.655529929 podStartE2EDuration="3.655529929s" podCreationTimestamp="2026-03-13 14:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:01:16.652672141 +0000 UTC m=+311.654260390" watchObservedRunningTime="2026-03-13 14:01:16.655529929 +0000 UTC m=+311.657118178" Mar 13 14:01:16 crc kubenswrapper[4898]: I0313 14:01:16.668525 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ppq6v" Mar 13 14:01:16 crc kubenswrapper[4898]: I0313 14:01:16.704093 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf" podStartSLOduration=4.704073357 podStartE2EDuration="4.704073357s" podCreationTimestamp="2026-03-13 14:01:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:01:16.703292698 +0000 UTC m=+311.704880957" watchObservedRunningTime="2026-03-13 14:01:16.704073357 +0000 UTC m=+311.705661616" Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 
14:01:17.863293 4898 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.864336 4898 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.864474 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.864581 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140" gracePeriod=15 Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.864691 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364" gracePeriod=15 Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.864700 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9" gracePeriod=15 Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.864675 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" 
containerID="cri-o://c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026" gracePeriod=15 Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.864724 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3" gracePeriod=15 Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.865727 4898 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 13 14:01:17 crc kubenswrapper[4898]: E0313 14:01:17.865878 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.865891 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 13 14:01:17 crc kubenswrapper[4898]: E0313 14:01:17.865941 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.865974 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 14:01:17 crc kubenswrapper[4898]: E0313 14:01:17.865986 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.865994 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 13 14:01:17 crc kubenswrapper[4898]: E0313 14:01:17.866004 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.867729 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 14:01:17 crc kubenswrapper[4898]: E0313 14:01:17.867789 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.867797 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 13 14:01:17 crc kubenswrapper[4898]: E0313 14:01:17.867820 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.867827 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 13 14:01:17 crc kubenswrapper[4898]: E0313 14:01:17.867838 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.867845 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 13 14:01:17 crc kubenswrapper[4898]: E0313 14:01:17.867853 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.867859 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 
14:01:17.867972 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.867984 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.867994 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.868001 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.868007 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.868015 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.868022 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 13 14:01:17 crc kubenswrapper[4898]: E0313 14:01:17.868110 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.868120 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 14:01:17 crc kubenswrapper[4898]: E0313 14:01:17.868133 4898 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.868143 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.868224 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.868235 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.891552 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ppq6v"] Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.892016 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ppq6v" podUID="a990881e-0caf-4096-a372-4cdad69006c1" containerName="registry-server" containerID="cri-o://34f99ef9389fe4ac9ebfa6a1f58e54f8c62e5652260b00330822919d355fffdc" gracePeriod=2 Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.898991 4898 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.924320 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 
14:01:17.924397 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.924418 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.924476 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.924526 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.924552 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 
14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.924587 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 14:01:17 crc kubenswrapper[4898]: I0313 14:01:17.924620 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.026426 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.026471 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.026489 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.026503 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.026510 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.026542 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.026522 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.026566 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.026597 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.026622 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.026647 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.026656 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.026710 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.026665 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.026699 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.026711 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.320865 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ppq6v" Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.434738 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a990881e-0caf-4096-a372-4cdad69006c1-utilities\") pod \"a990881e-0caf-4096-a372-4cdad69006c1\" (UID: \"a990881e-0caf-4096-a372-4cdad69006c1\") " Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.435058 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7vmd\" (UniqueName: \"kubernetes.io/projected/a990881e-0caf-4096-a372-4cdad69006c1-kube-api-access-x7vmd\") pod \"a990881e-0caf-4096-a372-4cdad69006c1\" (UID: \"a990881e-0caf-4096-a372-4cdad69006c1\") " Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.435134 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a990881e-0caf-4096-a372-4cdad69006c1-catalog-content\") pod \"a990881e-0caf-4096-a372-4cdad69006c1\" (UID: \"a990881e-0caf-4096-a372-4cdad69006c1\") " Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.435570 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a990881e-0caf-4096-a372-4cdad69006c1-utilities" (OuterVolumeSpecName: "utilities") pod "a990881e-0caf-4096-a372-4cdad69006c1" (UID: "a990881e-0caf-4096-a372-4cdad69006c1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.445769 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a990881e-0caf-4096-a372-4cdad69006c1-kube-api-access-x7vmd" (OuterVolumeSpecName: "kube-api-access-x7vmd") pod "a990881e-0caf-4096-a372-4cdad69006c1" (UID: "a990881e-0caf-4096-a372-4cdad69006c1"). InnerVolumeSpecName "kube-api-access-x7vmd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.497429 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a990881e-0caf-4096-a372-4cdad69006c1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a990881e-0caf-4096-a372-4cdad69006c1" (UID: "a990881e-0caf-4096-a372-4cdad69006c1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.537041 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a990881e-0caf-4096-a372-4cdad69006c1-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.537286 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7vmd\" (UniqueName: \"kubernetes.io/projected/a990881e-0caf-4096-a372-4cdad69006c1-kube-api-access-x7vmd\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.537376 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a990881e-0caf-4096-a372-4cdad69006c1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.644051 4898 generic.go:334] "Generic (PLEG): container finished" podID="77480be5-9488-434e-8105-0fc9237cae46" containerID="e6def40c01eb99d1e2a262735e66a81eae49e5b4d7cd72298170f434700dfefb" exitCode=0 Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.644168 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"77480be5-9488-434e-8105-0fc9237cae46","Type":"ContainerDied","Data":"e6def40c01eb99d1e2a262735e66a81eae49e5b4d7cd72298170f434700dfefb"} Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.647168 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.648803 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.649822 4898 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9" exitCode=0 Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.649926 4898 scope.go:117] "RemoveContainer" containerID="a040ae8921298d7b8331367fe146b46d942b7564e3952ee6a0d2e2befd7a1ec3" Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.649945 4898 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026" exitCode=0 Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.650113 4898 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364" exitCode=0 Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.650217 4898 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3" exitCode=2 Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.653631 4898 generic.go:334] "Generic (PLEG): container finished" podID="a990881e-0caf-4096-a372-4cdad69006c1" containerID="34f99ef9389fe4ac9ebfa6a1f58e54f8c62e5652260b00330822919d355fffdc" exitCode=0 Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.653733 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ppq6v" 
event={"ID":"a990881e-0caf-4096-a372-4cdad69006c1","Type":"ContainerDied","Data":"34f99ef9389fe4ac9ebfa6a1f58e54f8c62e5652260b00330822919d355fffdc"} Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.653809 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ppq6v" event={"ID":"a990881e-0caf-4096-a372-4cdad69006c1","Type":"ContainerDied","Data":"3acac09dab7fc6e01d8b6bf7a368fc3881544da372e2f3a95826c1fc007510c2"} Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.653865 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ppq6v" Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.705486 4898 scope.go:117] "RemoveContainer" containerID="34f99ef9389fe4ac9ebfa6a1f58e54f8c62e5652260b00330822919d355fffdc" Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.727770 4898 scope.go:117] "RemoveContainer" containerID="4c76c67750c09f06873437edbd5c079a177fa11ace696be58cb2d354d275db9e" Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.745674 4898 scope.go:117] "RemoveContainer" containerID="3f6d76254e697191c7e800bb760967dc1adfad9f4667e33eaf85c00c3d7a9263" Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.766181 4898 scope.go:117] "RemoveContainer" containerID="34f99ef9389fe4ac9ebfa6a1f58e54f8c62e5652260b00330822919d355fffdc" Mar 13 14:01:18 crc kubenswrapper[4898]: E0313 14:01:18.766828 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34f99ef9389fe4ac9ebfa6a1f58e54f8c62e5652260b00330822919d355fffdc\": container with ID starting with 34f99ef9389fe4ac9ebfa6a1f58e54f8c62e5652260b00330822919d355fffdc not found: ID does not exist" containerID="34f99ef9389fe4ac9ebfa6a1f58e54f8c62e5652260b00330822919d355fffdc" Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.766876 4898 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"34f99ef9389fe4ac9ebfa6a1f58e54f8c62e5652260b00330822919d355fffdc"} err="failed to get container status \"34f99ef9389fe4ac9ebfa6a1f58e54f8c62e5652260b00330822919d355fffdc\": rpc error: code = NotFound desc = could not find container \"34f99ef9389fe4ac9ebfa6a1f58e54f8c62e5652260b00330822919d355fffdc\": container with ID starting with 34f99ef9389fe4ac9ebfa6a1f58e54f8c62e5652260b00330822919d355fffdc not found: ID does not exist" Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.766964 4898 scope.go:117] "RemoveContainer" containerID="4c76c67750c09f06873437edbd5c079a177fa11ace696be58cb2d354d275db9e" Mar 13 14:01:18 crc kubenswrapper[4898]: E0313 14:01:18.767423 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c76c67750c09f06873437edbd5c079a177fa11ace696be58cb2d354d275db9e\": container with ID starting with 4c76c67750c09f06873437edbd5c079a177fa11ace696be58cb2d354d275db9e not found: ID does not exist" containerID="4c76c67750c09f06873437edbd5c079a177fa11ace696be58cb2d354d275db9e" Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.767506 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c76c67750c09f06873437edbd5c079a177fa11ace696be58cb2d354d275db9e"} err="failed to get container status \"4c76c67750c09f06873437edbd5c079a177fa11ace696be58cb2d354d275db9e\": rpc error: code = NotFound desc = could not find container \"4c76c67750c09f06873437edbd5c079a177fa11ace696be58cb2d354d275db9e\": container with ID starting with 4c76c67750c09f06873437edbd5c079a177fa11ace696be58cb2d354d275db9e not found: ID does not exist" Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.767555 4898 scope.go:117] "RemoveContainer" containerID="3f6d76254e697191c7e800bb760967dc1adfad9f4667e33eaf85c00c3d7a9263" Mar 13 14:01:18 crc kubenswrapper[4898]: E0313 14:01:18.768018 4898 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"3f6d76254e697191c7e800bb760967dc1adfad9f4667e33eaf85c00c3d7a9263\": container with ID starting with 3f6d76254e697191c7e800bb760967dc1adfad9f4667e33eaf85c00c3d7a9263 not found: ID does not exist" containerID="3f6d76254e697191c7e800bb760967dc1adfad9f4667e33eaf85c00c3d7a9263" Mar 13 14:01:18 crc kubenswrapper[4898]: I0313 14:01:18.768052 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f6d76254e697191c7e800bb760967dc1adfad9f4667e33eaf85c00c3d7a9263"} err="failed to get container status \"3f6d76254e697191c7e800bb760967dc1adfad9f4667e33eaf85c00c3d7a9263\": rpc error: code = NotFound desc = could not find container \"3f6d76254e697191c7e800bb760967dc1adfad9f4667e33eaf85c00c3d7a9263\": container with ID starting with 3f6d76254e697191c7e800bb760967dc1adfad9f4667e33eaf85c00c3d7a9263 not found: ID does not exist" Mar 13 14:01:19 crc kubenswrapper[4898]: I0313 14:01:19.451882 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-974qp" Mar 13 14:01:19 crc kubenswrapper[4898]: I0313 14:01:19.505274 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-974qp" Mar 13 14:01:19 crc kubenswrapper[4898]: I0313 14:01:19.663846 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 13 14:01:19 crc kubenswrapper[4898]: I0313 14:01:19.860040 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-btkxt" Mar 13 14:01:19 crc kubenswrapper[4898]: I0313 14:01:19.903036 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-btkxt" Mar 13 14:01:19 crc kubenswrapper[4898]: I0313 14:01:19.983343 4898 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.057584 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/77480be5-9488-434e-8105-0fc9237cae46-kubelet-dir\") pod \"77480be5-9488-434e-8105-0fc9237cae46\" (UID: \"77480be5-9488-434e-8105-0fc9237cae46\") " Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.057703 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/77480be5-9488-434e-8105-0fc9237cae46-var-lock\") pod \"77480be5-9488-434e-8105-0fc9237cae46\" (UID: \"77480be5-9488-434e-8105-0fc9237cae46\") " Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.057714 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/77480be5-9488-434e-8105-0fc9237cae46-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "77480be5-9488-434e-8105-0fc9237cae46" (UID: "77480be5-9488-434e-8105-0fc9237cae46"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.057738 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/77480be5-9488-434e-8105-0fc9237cae46-kube-api-access\") pod \"77480be5-9488-434e-8105-0fc9237cae46\" (UID: \"77480be5-9488-434e-8105-0fc9237cae46\") " Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.057831 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/77480be5-9488-434e-8105-0fc9237cae46-var-lock" (OuterVolumeSpecName: "var-lock") pod "77480be5-9488-434e-8105-0fc9237cae46" (UID: "77480be5-9488-434e-8105-0fc9237cae46"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.058142 4898 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/77480be5-9488-434e-8105-0fc9237cae46-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.058165 4898 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/77480be5-9488-434e-8105-0fc9237cae46-var-lock\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.082209 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77480be5-9488-434e-8105-0fc9237cae46-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "77480be5-9488-434e-8105-0fc9237cae46" (UID: "77480be5-9488-434e-8105-0fc9237cae46"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.158718 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/77480be5-9488-434e-8105-0fc9237cae46-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.236124 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.237087 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.360963 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.361003 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.361034 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.361116 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.361169 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.361176 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.361330 4898 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.361348 4898 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.361357 4898 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.673684 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.674326 4898 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140" exitCode=0 Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.674412 4898 scope.go:117] "RemoveContainer" containerID="58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.674510 4898 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.676519 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.676862 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"77480be5-9488-434e-8105-0fc9237cae46","Type":"ContainerDied","Data":"dd7ab09c8cacba65b6094ca53d6f8610851fabd3f291b2f1e2fc1acdbaf4b50f"} Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.676961 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd7ab09c8cacba65b6094ca53d6f8610851fabd3f291b2f1e2fc1acdbaf4b50f" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.692643 4898 scope.go:117] "RemoveContainer" containerID="c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.709719 4898 scope.go:117] "RemoveContainer" containerID="48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.726195 4898 scope.go:117] "RemoveContainer" containerID="c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.738296 4898 scope.go:117] "RemoveContainer" containerID="b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.751977 4898 scope.go:117] "RemoveContainer" containerID="5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.772796 4898 scope.go:117] "RemoveContainer" containerID="58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9" Mar 13 14:01:20 crc kubenswrapper[4898]: E0313 14:01:20.773427 4898 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\": container with ID starting with 58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9 not found: ID does not exist" containerID="58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.773514 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9"} err="failed to get container status \"58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\": rpc error: code = NotFound desc = could not find container \"58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9\": container with ID starting with 58657703efd5d66740cb17c4bde23f3d5377f197fe751ded7d1b0674b2aac5a9 not found: ID does not exist" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.773547 4898 scope.go:117] "RemoveContainer" containerID="c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026" Mar 13 14:01:20 crc kubenswrapper[4898]: E0313 14:01:20.773981 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\": container with ID starting with c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026 not found: ID does not exist" containerID="c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.774012 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026"} err="failed to get container status \"c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\": rpc error: code = NotFound desc = could 
not find container \"c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026\": container with ID starting with c001363a263867cbf943756e629fe4786db3fa32b491eb69c1a2de0c220c2026 not found: ID does not exist" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.774040 4898 scope.go:117] "RemoveContainer" containerID="48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364" Mar 13 14:01:20 crc kubenswrapper[4898]: E0313 14:01:20.774366 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\": container with ID starting with 48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364 not found: ID does not exist" containerID="48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.774394 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364"} err="failed to get container status \"48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\": rpc error: code = NotFound desc = could not find container \"48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364\": container with ID starting with 48e0fb0fd6b0e6fbd8f84367e587269361e4b1daa36ac5ab80be6898b50f5364 not found: ID does not exist" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.774411 4898 scope.go:117] "RemoveContainer" containerID="c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3" Mar 13 14:01:20 crc kubenswrapper[4898]: E0313 14:01:20.774709 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\": container with ID starting with c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3 not found: 
ID does not exist" containerID="c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.774741 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3"} err="failed to get container status \"c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\": rpc error: code = NotFound desc = could not find container \"c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3\": container with ID starting with c3402bee90fb2f1fd4b05cd7ad82de9339fe65de14a1add9874ca36af679acc3 not found: ID does not exist" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.774759 4898 scope.go:117] "RemoveContainer" containerID="b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140" Mar 13 14:01:20 crc kubenswrapper[4898]: E0313 14:01:20.775120 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\": container with ID starting with b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140 not found: ID does not exist" containerID="b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.775146 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140"} err="failed to get container status \"b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\": rpc error: code = NotFound desc = could not find container \"b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140\": container with ID starting with b389a5a331cfa6d5b0b435ef1b14a2cc4348ee7d19204dc298a1ffb3fdf00140 not found: ID does not exist" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.775163 4898 
scope.go:117] "RemoveContainer" containerID="5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3" Mar 13 14:01:20 crc kubenswrapper[4898]: E0313 14:01:20.775453 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\": container with ID starting with 5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3 not found: ID does not exist" containerID="5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3" Mar 13 14:01:20 crc kubenswrapper[4898]: I0313 14:01:20.775481 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3"} err="failed to get container status \"5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\": rpc error: code = NotFound desc = could not find container \"5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3\": container with ID starting with 5b926f61dc7340676b14938f1949572712b67b2286e35b88226587c8856446e3 not found: ID does not exist" Mar 13 14:01:21 crc kubenswrapper[4898]: I0313 14:01:21.752411 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 13 14:01:22 crc kubenswrapper[4898]: E0313 14:01:22.743922 4898 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.201:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-6n228" volumeName="registry-storage" Mar 13 14:01:22 crc kubenswrapper[4898]: E0313 
14:01:22.902569 4898 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.201:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 14:01:22 crc kubenswrapper[4898]: I0313 14:01:22.902467 4898 status_manager.go:851] "Failed to get status for pod" podUID="a990881e-0caf-4096-a372-4cdad69006c1" pod="openshift-marketplace/community-operators-ppq6v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ppq6v\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:22 crc kubenswrapper[4898]: I0313 14:01:22.903142 4898 status_manager.go:851] "Failed to get status for pod" podUID="77480be5-9488-434e-8105-0fc9237cae46" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:22 crc kubenswrapper[4898]: I0313 14:01:22.903354 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 14:01:22 crc kubenswrapper[4898]: I0313 14:01:22.903352 4898 status_manager.go:851] "Failed to get status for pod" podUID="a990881e-0caf-4096-a372-4cdad69006c1" pod="openshift-marketplace/community-operators-ppq6v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ppq6v\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:22 crc kubenswrapper[4898]: I0313 14:01:22.903671 4898 status_manager.go:851] "Failed to get status for pod" podUID="7794a943-5fec-485e-86bf-f104ed6ae070" pod="openshift-marketplace/redhat-operators-btkxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-btkxt\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:22 crc kubenswrapper[4898]: I0313 14:01:22.903878 4898 status_manager.go:851] "Failed to get status for pod" podUID="183d86e9-cd5c-45ed-a460-bb6169e07c72" pod="openshift-marketplace/redhat-operators-974qp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-974qp\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:22 crc kubenswrapper[4898]: E0313 14:01:22.936572 4898 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.201:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189c6b7034563f31 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 14:01:22.935709489 +0000 UTC m=+317.937297728,LastTimestamp:2026-03-13 14:01:22.935709489 +0000 UTC m=+317.937297728,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 14:01:23 crc kubenswrapper[4898]: I0313 14:01:23.698837 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"ec7277e9dfa9beb5985dfe9d6e8c02eb3980ab021d8a008b49c5b8550fbbbb3d"} Mar 13 14:01:23 crc kubenswrapper[4898]: I0313 14:01:23.698893 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"b1b09d1dcd3f368eefe1887da4bd6eca3d4544ca16ac1188de99a9ea197675f2"} Mar 13 14:01:23 crc kubenswrapper[4898]: E0313 14:01:23.699566 4898 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.201:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 14:01:23 crc kubenswrapper[4898]: I0313 14:01:23.699555 4898 status_manager.go:851] "Failed to get status for pod" podUID="7794a943-5fec-485e-86bf-f104ed6ae070" pod="openshift-marketplace/redhat-operators-btkxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-btkxt\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:23 crc kubenswrapper[4898]: I0313 14:01:23.700082 4898 status_manager.go:851] "Failed to 
get status for pod" podUID="183d86e9-cd5c-45ed-a460-bb6169e07c72" pod="openshift-marketplace/redhat-operators-974qp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-974qp\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:23 crc kubenswrapper[4898]: I0313 14:01:23.700408 4898 status_manager.go:851] "Failed to get status for pod" podUID="77480be5-9488-434e-8105-0fc9237cae46" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:23 crc kubenswrapper[4898]: I0313 14:01:23.700816 4898 status_manager.go:851] "Failed to get status for pod" podUID="a990881e-0caf-4096-a372-4cdad69006c1" pod="openshift-marketplace/community-operators-ppq6v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ppq6v\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:23 crc kubenswrapper[4898]: I0313 14:01:23.846766 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" podUID="b26a4d77-f170-467e-ad96-4741cc5a8f23" containerName="oauth-openshift" containerID="cri-o://e2d4415c83b3a1cbbccfa0bc4f10a48d58d464da90f0fa04517b4115ecd57886" gracePeriod=15 Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.412374 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.413017 4898 status_manager.go:851] "Failed to get status for pod" podUID="a990881e-0caf-4096-a372-4cdad69006c1" pod="openshift-marketplace/community-operators-ppq6v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ppq6v\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.413626 4898 status_manager.go:851] "Failed to get status for pod" podUID="7794a943-5fec-485e-86bf-f104ed6ae070" pod="openshift-marketplace/redhat-operators-btkxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-btkxt\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.414381 4898 status_manager.go:851] "Failed to get status for pod" podUID="183d86e9-cd5c-45ed-a460-bb6169e07c72" pod="openshift-marketplace/redhat-operators-974qp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-974qp\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.414820 4898 status_manager.go:851] "Failed to get status for pod" podUID="b26a4d77-f170-467e-ad96-4741cc5a8f23" pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-djn5q\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.415192 4898 status_manager.go:851] "Failed to get status for pod" podUID="77480be5-9488-434e-8105-0fc9237cae46" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.557355 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-cliconfig\") pod \"b26a4d77-f170-467e-ad96-4741cc5a8f23\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.557422 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-template-provider-selection\") pod \"b26a4d77-f170-467e-ad96-4741cc5a8f23\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.557445 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-audit-policies\") pod \"b26a4d77-f170-467e-ad96-4741cc5a8f23\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.557488 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-router-certs\") pod \"b26a4d77-f170-467e-ad96-4741cc5a8f23\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.557534 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-session\") pod 
\"b26a4d77-f170-467e-ad96-4741cc5a8f23\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.557571 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-serving-cert\") pod \"b26a4d77-f170-467e-ad96-4741cc5a8f23\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.557610 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzgzk\" (UniqueName: \"kubernetes.io/projected/b26a4d77-f170-467e-ad96-4741cc5a8f23-kube-api-access-vzgzk\") pod \"b26a4d77-f170-467e-ad96-4741cc5a8f23\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.557642 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-template-error\") pod \"b26a4d77-f170-467e-ad96-4741cc5a8f23\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.557669 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-idp-0-file-data\") pod \"b26a4d77-f170-467e-ad96-4741cc5a8f23\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.557698 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-template-login\") pod \"b26a4d77-f170-467e-ad96-4741cc5a8f23\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") 
" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.557747 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-service-ca\") pod \"b26a4d77-f170-467e-ad96-4741cc5a8f23\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.557810 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-trusted-ca-bundle\") pod \"b26a4d77-f170-467e-ad96-4741cc5a8f23\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.557845 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-ocp-branding-template\") pod \"b26a4d77-f170-467e-ad96-4741cc5a8f23\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.557881 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b26a4d77-f170-467e-ad96-4741cc5a8f23-audit-dir\") pod \"b26a4d77-f170-467e-ad96-4741cc5a8f23\" (UID: \"b26a4d77-f170-467e-ad96-4741cc5a8f23\") " Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.558232 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b26a4d77-f170-467e-ad96-4741cc5a8f23-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "b26a4d77-f170-467e-ad96-4741cc5a8f23" (UID: "b26a4d77-f170-467e-ad96-4741cc5a8f23"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.558430 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "b26a4d77-f170-467e-ad96-4741cc5a8f23" (UID: "b26a4d77-f170-467e-ad96-4741cc5a8f23"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.558638 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "b26a4d77-f170-467e-ad96-4741cc5a8f23" (UID: "b26a4d77-f170-467e-ad96-4741cc5a8f23"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.559333 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "b26a4d77-f170-467e-ad96-4741cc5a8f23" (UID: "b26a4d77-f170-467e-ad96-4741cc5a8f23"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.559432 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "b26a4d77-f170-467e-ad96-4741cc5a8f23" (UID: "b26a4d77-f170-467e-ad96-4741cc5a8f23"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.565527 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "b26a4d77-f170-467e-ad96-4741cc5a8f23" (UID: "b26a4d77-f170-467e-ad96-4741cc5a8f23"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.566178 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "b26a4d77-f170-467e-ad96-4741cc5a8f23" (UID: "b26a4d77-f170-467e-ad96-4741cc5a8f23"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.566159 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "b26a4d77-f170-467e-ad96-4741cc5a8f23" (UID: "b26a4d77-f170-467e-ad96-4741cc5a8f23"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.566737 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "b26a4d77-f170-467e-ad96-4741cc5a8f23" (UID: "b26a4d77-f170-467e-ad96-4741cc5a8f23"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.567126 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "b26a4d77-f170-467e-ad96-4741cc5a8f23" (UID: "b26a4d77-f170-467e-ad96-4741cc5a8f23"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.567234 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b26a4d77-f170-467e-ad96-4741cc5a8f23-kube-api-access-vzgzk" (OuterVolumeSpecName: "kube-api-access-vzgzk") pod "b26a4d77-f170-467e-ad96-4741cc5a8f23" (UID: "b26a4d77-f170-467e-ad96-4741cc5a8f23"). InnerVolumeSpecName "kube-api-access-vzgzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.567533 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "b26a4d77-f170-467e-ad96-4741cc5a8f23" (UID: "b26a4d77-f170-467e-ad96-4741cc5a8f23"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.567738 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "b26a4d77-f170-467e-ad96-4741cc5a8f23" (UID: "b26a4d77-f170-467e-ad96-4741cc5a8f23"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.568244 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "b26a4d77-f170-467e-ad96-4741cc5a8f23" (UID: "b26a4d77-f170-467e-ad96-4741cc5a8f23"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.659834 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzgzk\" (UniqueName: \"kubernetes.io/projected/b26a4d77-f170-467e-ad96-4741cc5a8f23-kube-api-access-vzgzk\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.659921 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.659939 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.659953 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.659968 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.659982 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.659997 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.660012 4898 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b26a4d77-f170-467e-ad96-4741cc5a8f23-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.660028 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.660042 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.660056 4898 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b26a4d77-f170-467e-ad96-4741cc5a8f23-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.660067 4898 reconciler_common.go:293] 
"Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.660080 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.660093 4898 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b26a4d77-f170-467e-ad96-4741cc5a8f23-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.708235 4898 generic.go:334] "Generic (PLEG): container finished" podID="b26a4d77-f170-467e-ad96-4741cc5a8f23" containerID="e2d4415c83b3a1cbbccfa0bc4f10a48d58d464da90f0fa04517b4115ecd57886" exitCode=0 Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.708297 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" event={"ID":"b26a4d77-f170-467e-ad96-4741cc5a8f23","Type":"ContainerDied","Data":"e2d4415c83b3a1cbbccfa0bc4f10a48d58d464da90f0fa04517b4115ecd57886"} Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.708335 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" event={"ID":"b26a4d77-f170-467e-ad96-4741cc5a8f23","Type":"ContainerDied","Data":"2d5714977afe363a0af3e9631742fe59289a173d68823730a2b889e9e03736c1"} Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.708357 4898 scope.go:117] "RemoveContainer" containerID="e2d4415c83b3a1cbbccfa0bc4f10a48d58d464da90f0fa04517b4115ecd57886" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.708482 4898 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.710229 4898 status_manager.go:851] "Failed to get status for pod" podUID="77480be5-9488-434e-8105-0fc9237cae46" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.711567 4898 status_manager.go:851] "Failed to get status for pod" podUID="a990881e-0caf-4096-a372-4cdad69006c1" pod="openshift-marketplace/community-operators-ppq6v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ppq6v\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.711815 4898 status_manager.go:851] "Failed to get status for pod" podUID="7794a943-5fec-485e-86bf-f104ed6ae070" pod="openshift-marketplace/redhat-operators-btkxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-btkxt\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.712059 4898 status_manager.go:851] "Failed to get status for pod" podUID="183d86e9-cd5c-45ed-a460-bb6169e07c72" pod="openshift-marketplace/redhat-operators-974qp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-974qp\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.712341 4898 status_manager.go:851] "Failed to get status for pod" podUID="b26a4d77-f170-467e-ad96-4741cc5a8f23" pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-djn5q\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.725862 4898 status_manager.go:851] "Failed to get status for pod" podUID="a990881e-0caf-4096-a372-4cdad69006c1" pod="openshift-marketplace/community-operators-ppq6v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ppq6v\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.727448 4898 status_manager.go:851] "Failed to get status for pod" podUID="7794a943-5fec-485e-86bf-f104ed6ae070" pod="openshift-marketplace/redhat-operators-btkxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-btkxt\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.728004 4898 status_manager.go:851] "Failed to get status for pod" podUID="183d86e9-cd5c-45ed-a460-bb6169e07c72" pod="openshift-marketplace/redhat-operators-974qp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-974qp\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.728235 4898 status_manager.go:851] "Failed to get status for pod" podUID="b26a4d77-f170-467e-ad96-4741cc5a8f23" pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-djn5q\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.728452 4898 status_manager.go:851] "Failed to get status for pod" podUID="77480be5-9488-434e-8105-0fc9237cae46" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.743933 4898 scope.go:117] "RemoveContainer" containerID="e2d4415c83b3a1cbbccfa0bc4f10a48d58d464da90f0fa04517b4115ecd57886" Mar 13 14:01:24 crc kubenswrapper[4898]: E0313 14:01:24.745185 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2d4415c83b3a1cbbccfa0bc4f10a48d58d464da90f0fa04517b4115ecd57886\": container with ID starting with e2d4415c83b3a1cbbccfa0bc4f10a48d58d464da90f0fa04517b4115ecd57886 not found: ID does not exist" containerID="e2d4415c83b3a1cbbccfa0bc4f10a48d58d464da90f0fa04517b4115ecd57886" Mar 13 14:01:24 crc kubenswrapper[4898]: I0313 14:01:24.745247 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2d4415c83b3a1cbbccfa0bc4f10a48d58d464da90f0fa04517b4115ecd57886"} err="failed to get container status \"e2d4415c83b3a1cbbccfa0bc4f10a48d58d464da90f0fa04517b4115ecd57886\": rpc error: code = NotFound desc = could not find container \"e2d4415c83b3a1cbbccfa0bc4f10a48d58d464da90f0fa04517b4115ecd57886\": container with ID starting with e2d4415c83b3a1cbbccfa0bc4f10a48d58d464da90f0fa04517b4115ecd57886 not found: ID does not exist" Mar 13 14:01:25 crc kubenswrapper[4898]: I0313 14:01:25.743183 4898 status_manager.go:851] "Failed to get status for pod" podUID="7794a943-5fec-485e-86bf-f104ed6ae070" pod="openshift-marketplace/redhat-operators-btkxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-btkxt\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:25 crc kubenswrapper[4898]: I0313 14:01:25.744327 4898 status_manager.go:851] "Failed to get status for pod" podUID="183d86e9-cd5c-45ed-a460-bb6169e07c72" 
pod="openshift-marketplace/redhat-operators-974qp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-974qp\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:25 crc kubenswrapper[4898]: I0313 14:01:25.745026 4898 status_manager.go:851] "Failed to get status for pod" podUID="b26a4d77-f170-467e-ad96-4741cc5a8f23" pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-djn5q\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:25 crc kubenswrapper[4898]: I0313 14:01:25.745400 4898 status_manager.go:851] "Failed to get status for pod" podUID="77480be5-9488-434e-8105-0fc9237cae46" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:25 crc kubenswrapper[4898]: I0313 14:01:25.745703 4898 status_manager.go:851] "Failed to get status for pod" podUID="a990881e-0caf-4096-a372-4cdad69006c1" pod="openshift-marketplace/community-operators-ppq6v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ppq6v\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:27 crc kubenswrapper[4898]: E0313 14:01:27.629248 4898 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:27 crc kubenswrapper[4898]: E0313 14:01:27.629813 4898 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: 
connection refused" Mar 13 14:01:27 crc kubenswrapper[4898]: E0313 14:01:27.630556 4898 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:27 crc kubenswrapper[4898]: E0313 14:01:27.631083 4898 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:27 crc kubenswrapper[4898]: E0313 14:01:27.631529 4898 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" Mar 13 14:01:27 crc kubenswrapper[4898]: I0313 14:01:27.631579 4898 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 13 14:01:27 crc kubenswrapper[4898]: E0313 14:01:27.632046 4898 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="200ms" Mar 13 14:01:27 crc kubenswrapper[4898]: E0313 14:01:27.832845 4898 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="400ms" Mar 13 14:01:28 crc kubenswrapper[4898]: E0313 14:01:28.234344 4898 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="800ms"
Mar 13 14:01:28 crc kubenswrapper[4898]: E0313 14:01:28.746519 4898 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.201:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189c6b7034563f31 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 14:01:22.935709489 +0000 UTC m=+317.937297728,LastTimestamp:2026-03-13 14:01:22.935709489 +0000 UTC m=+317.937297728,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 13 14:01:29 crc kubenswrapper[4898]: E0313 14:01:29.035610 4898 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="1.6s"
Mar 13 14:01:29 crc kubenswrapper[4898]: E0313 14:01:29.335506 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T14:01:29Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T14:01:29Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T14:01:29Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T14:01:29Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 13 14:01:29 crc kubenswrapper[4898]: E0313 14:01:29.335833 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 13 14:01:29 crc kubenswrapper[4898]: E0313 14:01:29.336093 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 13 14:01:29 crc kubenswrapper[4898]: E0313 14:01:29.336524 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 13 14:01:29 crc kubenswrapper[4898]: E0313 14:01:29.336773 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 13 14:01:29 crc kubenswrapper[4898]: E0313 14:01:29.336800 4898 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 13 14:01:29 crc kubenswrapper[4898]: I0313 14:01:29.739637 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 14:01:29 crc kubenswrapper[4898]: I0313 14:01:29.740693 4898 status_manager.go:851] "Failed to get status for pod" podUID="77480be5-9488-434e-8105-0fc9237cae46" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 13 14:01:29 crc kubenswrapper[4898]: I0313 14:01:29.741421 4898 status_manager.go:851] "Failed to get status for pod" podUID="a990881e-0caf-4096-a372-4cdad69006c1" pod="openshift-marketplace/community-operators-ppq6v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ppq6v\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 13 14:01:29 crc kubenswrapper[4898]: I0313 14:01:29.742342 4898 status_manager.go:851] "Failed to get status for pod" podUID="7794a943-5fec-485e-86bf-f104ed6ae070" pod="openshift-marketplace/redhat-operators-btkxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-btkxt\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 13 14:01:29 crc kubenswrapper[4898]: I0313 14:01:29.742801 4898 status_manager.go:851] "Failed to get status for pod" podUID="183d86e9-cd5c-45ed-a460-bb6169e07c72" pod="openshift-marketplace/redhat-operators-974qp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-974qp\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 13 14:01:29 crc kubenswrapper[4898]: I0313 14:01:29.743194 4898 status_manager.go:851] "Failed to get status for pod" podUID="b26a4d77-f170-467e-ad96-4741cc5a8f23" pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-djn5q\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 13 14:01:29 crc kubenswrapper[4898]: I0313 14:01:29.760394 4898 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3dea03f2-846c-4fe2-91f3-67269c416048"
Mar 13 14:01:29 crc kubenswrapper[4898]: I0313 14:01:29.760452 4898 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3dea03f2-846c-4fe2-91f3-67269c416048"
Mar 13 14:01:29 crc kubenswrapper[4898]: E0313 14:01:29.761017 4898 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 14:01:29 crc kubenswrapper[4898]: I0313 14:01:29.762147 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 14:01:30 crc kubenswrapper[4898]: E0313 14:01:30.636943 4898 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.201:6443: connect: connection refused" interval="3.2s"
Mar 13 14:01:30 crc kubenswrapper[4898]: I0313 14:01:30.761517 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"fb8d1075677d946e836a8100a5ff549ecdb66cbebccbe047025d2252ca4d1e96"}
Mar 13 14:01:30 crc kubenswrapper[4898]: I0313 14:01:30.761593 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ad7d9bc61eae3398db3cf7bee567629f2bad9723b89d2a1ef836f114247808c6"}
Mar 13 14:01:31 crc kubenswrapper[4898]: I0313 14:01:31.770283 4898 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="fb8d1075677d946e836a8100a5ff549ecdb66cbebccbe047025d2252ca4d1e96" exitCode=0
Mar 13 14:01:31 crc kubenswrapper[4898]: I0313 14:01:31.770353 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"fb8d1075677d946e836a8100a5ff549ecdb66cbebccbe047025d2252ca4d1e96"}
Mar 13 14:01:31 crc kubenswrapper[4898]: I0313 14:01:31.770533 4898 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3dea03f2-846c-4fe2-91f3-67269c416048"
Mar 13 14:01:31 crc kubenswrapper[4898]: I0313 14:01:31.770555 4898 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3dea03f2-846c-4fe2-91f3-67269c416048"
Mar 13 14:01:31 crc kubenswrapper[4898]: E0313 14:01:31.770973 4898 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.201:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 14:01:31 crc kubenswrapper[4898]: I0313 14:01:31.771218 4898 status_manager.go:851] "Failed to get status for pod" podUID="77480be5-9488-434e-8105-0fc9237cae46" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 13 14:01:31 crc kubenswrapper[4898]: I0313 14:01:31.771609 4898 status_manager.go:851] "Failed to get status for pod" podUID="a990881e-0caf-4096-a372-4cdad69006c1" pod="openshift-marketplace/community-operators-ppq6v" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-ppq6v\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 13 14:01:31 crc kubenswrapper[4898]: I0313 14:01:31.771971 4898 status_manager.go:851] "Failed to get status for pod" podUID="7794a943-5fec-485e-86bf-f104ed6ae070" pod="openshift-marketplace/redhat-operators-btkxt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-btkxt\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 13 14:01:31 crc kubenswrapper[4898]: I0313 14:01:31.772309 4898 status_manager.go:851] "Failed to get status for pod" podUID="183d86e9-cd5c-45ed-a460-bb6169e07c72" pod="openshift-marketplace/redhat-operators-974qp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-974qp\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 13 14:01:31 crc kubenswrapper[4898]: I0313 14:01:31.772640 4898 status_manager.go:851] "Failed to get status for pod" podUID="b26a4d77-f170-467e-ad96-4741cc5a8f23" pod="openshift-authentication/oauth-openshift-558db77b4-djn5q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-djn5q\": dial tcp 38.102.83.201:6443: connect: connection refused"
Mar 13 14:01:32 crc kubenswrapper[4898]: I0313 14:01:32.778637 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"05f1cfc7544c6b85c39b0f3a9bc14d3f94f3875fa7d3f313988de4b5619142b3"}
Mar 13 14:01:32 crc kubenswrapper[4898]: I0313 14:01:32.778978 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"80ca95a7aa043adc2ed03e8f0aa22c3a939ff9e8c6eb6bf26e6933e81f9a0f76"}
Mar 13 14:01:32 crc kubenswrapper[4898]: I0313 14:01:32.780798 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Mar 13 14:01:32 crc kubenswrapper[4898]: I0313 14:01:32.782363 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Mar 13 14:01:32 crc kubenswrapper[4898]: I0313 14:01:32.782450 4898 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="172c20ea4299e1b913b2675ac14f7ec3a4c80a0f6f2428af0d030aff64eb89ec" exitCode=1
Mar 13 14:01:32 crc kubenswrapper[4898]: I0313 14:01:32.782505 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"172c20ea4299e1b913b2675ac14f7ec3a4c80a0f6f2428af0d030aff64eb89ec"}
Mar 13 14:01:32 crc kubenswrapper[4898]: I0313 14:01:32.783222 4898 scope.go:117] "RemoveContainer" containerID="172c20ea4299e1b913b2675ac14f7ec3a4c80a0f6f2428af0d030aff64eb89ec"
Mar 13 14:01:33 crc kubenswrapper[4898]: I0313 14:01:33.175242 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 13 14:01:33 crc kubenswrapper[4898]: I0313 14:01:33.790716 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3a20b7cd1a0d1dae57c24d0daf2f93df192226c3651a5dd5b49224cd8f650f2f"}
Mar 13 14:01:33 crc kubenswrapper[4898]: I0313 14:01:33.791027 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"948d353e949566c89a7588e2afad191c300fd7166df23f30754dedf178e39312"}
Mar 13 14:01:33 crc kubenswrapper[4898]: I0313 14:01:33.794765 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Mar 13 14:01:33 crc kubenswrapper[4898]: I0313 14:01:33.797804 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Mar 13 14:01:33 crc kubenswrapper[4898]: I0313 14:01:33.797883 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b65cd02a017de53599582fdc93495c1971ff933ecacdf6af0171bad6070bff66"}
Mar 13 14:01:34 crc kubenswrapper[4898]: I0313 14:01:34.808269 4898 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3dea03f2-846c-4fe2-91f3-67269c416048"
Mar 13 14:01:34 crc kubenswrapper[4898]: I0313 14:01:34.808293 4898 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3dea03f2-846c-4fe2-91f3-67269c416048"
Mar 13 14:01:34 crc kubenswrapper[4898]: I0313 14:01:34.808457 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9d6527d36e7b8f6344efc84197c8496d6674fb0cea7bd5e9fbfb3a398956bd7d"}
Mar 13 14:01:34 crc kubenswrapper[4898]: I0313 14:01:34.808497 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 14:01:38 crc kubenswrapper[4898]: I0313 14:01:38.347553 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 13 14:01:39 crc kubenswrapper[4898]: I0313 14:01:39.231466 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 13 14:01:39 crc kubenswrapper[4898]: I0313 14:01:39.231687 4898 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Mar 13 14:01:39 crc kubenswrapper[4898]: I0313 14:01:39.231744 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Mar 13 14:01:39 crc kubenswrapper[4898]: I0313 14:01:39.762654 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 14:01:39 crc kubenswrapper[4898]: I0313 14:01:39.762938 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 14:01:39 crc kubenswrapper[4898]: I0313 14:01:39.778184 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 14:01:39 crc kubenswrapper[4898]: I0313 14:01:39.821650 4898 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 14:01:40 crc kubenswrapper[4898]: I0313 14:01:40.850616 4898 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3dea03f2-846c-4fe2-91f3-67269c416048"
Mar 13 14:01:40 crc kubenswrapper[4898]: I0313 14:01:40.851470 4898 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3dea03f2-846c-4fe2-91f3-67269c416048"
Mar 13 14:01:40 crc kubenswrapper[4898]: I0313 14:01:40.855073 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 14:01:40 crc kubenswrapper[4898]: I0313 14:01:40.857488 4898 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="09f01934-8579-467a-a76f-1bfaffab04bf"
Mar 13 14:01:41 crc kubenswrapper[4898]: I0313 14:01:41.858031 4898 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3dea03f2-846c-4fe2-91f3-67269c416048"
Mar 13 14:01:41 crc kubenswrapper[4898]: I0313 14:01:41.859175 4898 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3dea03f2-846c-4fe2-91f3-67269c416048"
Mar 13 14:01:45 crc kubenswrapper[4898]: I0313 14:01:45.767859 4898 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="09f01934-8579-467a-a76f-1bfaffab04bf"
Mar 13 14:01:48 crc kubenswrapper[4898]: I0313 14:01:48.694106 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 13 14:01:49 crc kubenswrapper[4898]: I0313 14:01:49.145438 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 13 14:01:49 crc kubenswrapper[4898]: I0313 14:01:49.230706 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 13 14:01:49 crc kubenswrapper[4898]: I0313 14:01:49.232290 4898 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Mar 13 14:01:49 crc kubenswrapper[4898]: I0313 14:01:49.232364 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Mar 13 14:01:49 crc kubenswrapper[4898]: I0313 14:01:49.633662 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 13 14:01:50 crc kubenswrapper[4898]: I0313 14:01:50.148744 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 13 14:01:50 crc kubenswrapper[4898]: I0313 14:01:50.242063 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 13 14:01:50 crc kubenswrapper[4898]: I0313 14:01:50.394287 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 13 14:01:50 crc kubenswrapper[4898]: I0313 14:01:50.504417 4898 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 13 14:01:50 crc kubenswrapper[4898]: I0313 14:01:50.510492 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-djn5q","openshift-marketplace/community-operators-ppq6v","openshift-kube-apiserver/kube-apiserver-crc"]
Mar 13 14:01:50 crc kubenswrapper[4898]: I0313 14:01:50.510567 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 13 14:01:50 crc kubenswrapper[4898]: I0313 14:01:50.519124 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 13 14:01:50 crc kubenswrapper[4898]: I0313 14:01:50.544187 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=11.544165934 podStartE2EDuration="11.544165934s" podCreationTimestamp="2026-03-13 14:01:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:01:50.542572356 +0000 UTC m=+345.544160625" watchObservedRunningTime="2026-03-13 14:01:50.544165934 +0000 UTC m=+345.545754233"
Mar 13 14:01:50 crc kubenswrapper[4898]: I0313 14:01:50.679019 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 13 14:01:50 crc kubenswrapper[4898]: I0313 14:01:50.791446 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 13 14:01:50 crc kubenswrapper[4898]: I0313 14:01:50.844213 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 13 14:01:50 crc kubenswrapper[4898]: I0313 14:01:50.845548 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.137361 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.141366 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.383858 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6757584b5b-nct75"]
Mar 13 14:01:51 crc kubenswrapper[4898]: E0313 14:01:51.384145 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a990881e-0caf-4096-a372-4cdad69006c1" containerName="extract-content"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.384165 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a990881e-0caf-4096-a372-4cdad69006c1" containerName="extract-content"
Mar 13 14:01:51 crc kubenswrapper[4898]: E0313 14:01:51.384201 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a990881e-0caf-4096-a372-4cdad69006c1" containerName="extract-utilities"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.384208 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a990881e-0caf-4096-a372-4cdad69006c1" containerName="extract-utilities"
Mar 13 14:01:51 crc kubenswrapper[4898]: E0313 14:01:51.384222 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77480be5-9488-434e-8105-0fc9237cae46" containerName="installer"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.384230 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="77480be5-9488-434e-8105-0fc9237cae46" containerName="installer"
Mar 13 14:01:51 crc kubenswrapper[4898]: E0313 14:01:51.384247 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b26a4d77-f170-467e-ad96-4741cc5a8f23" containerName="oauth-openshift"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.384254 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b26a4d77-f170-467e-ad96-4741cc5a8f23" containerName="oauth-openshift"
Mar 13 14:01:51 crc kubenswrapper[4898]: E0313 14:01:51.384265 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a990881e-0caf-4096-a372-4cdad69006c1" containerName="registry-server"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.384272 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a990881e-0caf-4096-a372-4cdad69006c1" containerName="registry-server"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.384378 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b26a4d77-f170-467e-ad96-4741cc5a8f23" containerName="oauth-openshift"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.384393 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="a990881e-0caf-4096-a372-4cdad69006c1" containerName="registry-server"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.384405 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="77480be5-9488-434e-8105-0fc9237cae46" containerName="installer"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.384840 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6757584b5b-nct75"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.387503 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.388151 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.388565 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.388976 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.389001 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.389148 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.389085 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.389620 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.390473 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.391812 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.392034 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.392963 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.409378 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.410392 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.417530 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.426106 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.426157 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.426183 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-audit-policies\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.426209 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-system-service-ca\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.426271 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.426297 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-audit-dir\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.426332 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-user-template-login\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.426420 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.426459 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.426510 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csv5s\" (UniqueName: \"kubernetes.io/projected/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-kube-api-access-csv5s\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.426567 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-system-router-certs\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.426638 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.426684 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-system-session\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.426706 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-user-template-error\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.527436 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-audit-dir\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.527484 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-user-template-login\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.527508 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.527526 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.527544 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csv5s\" (UniqueName: \"kubernetes.io/projected/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-kube-api-access-csv5s\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75"
Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.527565 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for
volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-system-router-certs\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.527584 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.527571 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-audit-dir\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.527605 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-system-session\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.527718 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-user-template-error\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " 
pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.527794 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.527841 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.527886 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-audit-policies\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.527997 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-system-service-ca\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.528065 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.529623 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.530082 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-system-service-ca\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.530348 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-audit-policies\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.531526 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 
14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.533177 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.533312 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-user-template-error\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.534025 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-system-session\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.534088 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-user-template-login\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.535620 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.537428 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.540634 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-system-router-certs\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.541502 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.551256 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csv5s\" (UniqueName: \"kubernetes.io/projected/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3-kube-api-access-csv5s\") pod \"oauth-openshift-6757584b5b-nct75\" (UID: \"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\") " pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 
crc kubenswrapper[4898]: I0313 14:01:51.679893 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.740051 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.755850 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a990881e-0caf-4096-a372-4cdad69006c1" path="/var/lib/kubelet/pods/a990881e-0caf-4096-a372-4cdad69006c1/volumes" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.757845 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b26a4d77-f170-467e-ad96-4741cc5a8f23" path="/var/lib/kubelet/pods/b26a4d77-f170-467e-ad96-4741cc5a8f23/volumes" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.841267 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.886669 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.912138 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 13 14:01:51 crc kubenswrapper[4898]: I0313 14:01:51.985317 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 13 14:01:52 crc kubenswrapper[4898]: I0313 14:01:52.178007 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 13 14:01:52 crc kubenswrapper[4898]: I0313 14:01:52.230910 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6757584b5b-nct75"] Mar 13 
14:01:52 crc kubenswrapper[4898]: I0313 14:01:52.311743 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 13 14:01:52 crc kubenswrapper[4898]: I0313 14:01:52.330890 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 13 14:01:52 crc kubenswrapper[4898]: I0313 14:01:52.357405 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 13 14:01:52 crc kubenswrapper[4898]: I0313 14:01:52.393460 4898 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 13 14:01:52 crc kubenswrapper[4898]: I0313 14:01:52.589017 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 13 14:01:52 crc kubenswrapper[4898]: I0313 14:01:52.668379 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 13 14:01:52 crc kubenswrapper[4898]: I0313 14:01:52.676769 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 14:01:52 crc kubenswrapper[4898]: I0313 14:01:52.745383 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 13 14:01:52 crc kubenswrapper[4898]: I0313 14:01:52.767308 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 13 14:01:52 crc kubenswrapper[4898]: I0313 14:01:52.780993 4898 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 13 14:01:52 crc kubenswrapper[4898]: I0313 14:01:52.801353 4898 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"client-ca" Mar 13 14:01:52 crc kubenswrapper[4898]: I0313 14:01:52.964595 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 13 14:01:52 crc kubenswrapper[4898]: I0313 14:01:52.980316 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 14:01:52 crc kubenswrapper[4898]: I0313 14:01:52.986960 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 13 14:01:53 crc kubenswrapper[4898]: I0313 14:01:53.026328 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 13 14:01:53 crc kubenswrapper[4898]: I0313 14:01:53.083327 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 13 14:01:53 crc kubenswrapper[4898]: I0313 14:01:53.123414 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 13 14:01:53 crc kubenswrapper[4898]: I0313 14:01:53.142336 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 13 14:01:53 crc kubenswrapper[4898]: I0313 14:01:53.366124 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 13 14:01:53 crc kubenswrapper[4898]: I0313 14:01:53.444396 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 13 14:01:53 crc kubenswrapper[4898]: I0313 14:01:53.540337 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 14:01:53 crc kubenswrapper[4898]: I0313 
14:01:53.546186 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 13 14:01:53 crc kubenswrapper[4898]: I0313 14:01:53.605941 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 13 14:01:53 crc kubenswrapper[4898]: I0313 14:01:53.606353 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 13 14:01:53 crc kubenswrapper[4898]: I0313 14:01:53.648819 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 13 14:01:53 crc kubenswrapper[4898]: I0313 14:01:53.670514 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 13 14:01:53 crc kubenswrapper[4898]: I0313 14:01:53.671644 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 13 14:01:53 crc kubenswrapper[4898]: I0313 14:01:53.900622 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.023870 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.037091 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.037095 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.044454 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 
14:01:54.070021 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.102720 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.118576 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.127487 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.200404 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.206166 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.211870 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.285716 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.299242 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.334490 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.355359 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 13 
14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.383286 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.457368 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.504311 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.526485 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.553528 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.569212 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.596067 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.690198 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.749689 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.826822 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.856214 4898 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.868338 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 13 14:01:54 crc kubenswrapper[4898]: I0313 14:01:54.997381 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 13 14:01:54 crc kubenswrapper[4898]: E0313 14:01:54.998226 4898 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 13 14:01:54 crc kubenswrapper[4898]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-6757584b5b-nct75_openshift-authentication_3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3_0(297c85adf50243fe2bc11220e35969e23bd77b61111951b7f7ec1477e5ad2fe7): error adding pod openshift-authentication_oauth-openshift-6757584b5b-nct75 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"297c85adf50243fe2bc11220e35969e23bd77b61111951b7f7ec1477e5ad2fe7" Netns:"/var/run/netns/812cd27e-a312-4622-b089-bb187d413335" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-6757584b5b-nct75;K8S_POD_INFRA_CONTAINER_ID=297c85adf50243fe2bc11220e35969e23bd77b61111951b7f7ec1477e5ad2fe7;K8S_POD_UID=3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-6757584b5b-nct75] networking: Multus: [openshift-authentication/oauth-openshift-6757584b5b-nct75/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-6757584b5b-nct75 in out of cluster comm: pod "oauth-openshift-6757584b5b-nct75" not found Mar 13 14:01:54 crc kubenswrapper[4898]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 14:01:54 crc kubenswrapper[4898]: > Mar 13 14:01:54 crc kubenswrapper[4898]: E0313 14:01:54.998296 4898 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 13 14:01:54 crc kubenswrapper[4898]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-6757584b5b-nct75_openshift-authentication_3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3_0(297c85adf50243fe2bc11220e35969e23bd77b61111951b7f7ec1477e5ad2fe7): error adding pod openshift-authentication_oauth-openshift-6757584b5b-nct75 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"297c85adf50243fe2bc11220e35969e23bd77b61111951b7f7ec1477e5ad2fe7" Netns:"/var/run/netns/812cd27e-a312-4622-b089-bb187d413335" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-6757584b5b-nct75;K8S_POD_INFRA_CONTAINER_ID=297c85adf50243fe2bc11220e35969e23bd77b61111951b7f7ec1477e5ad2fe7;K8S_POD_UID=3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-6757584b5b-nct75] networking: Multus: [openshift-authentication/oauth-openshift-6757584b5b-nct75/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-6757584b5b-nct75 in out of cluster comm: pod "oauth-openshift-6757584b5b-nct75" not found Mar 13 14:01:54 crc kubenswrapper[4898]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 14:01:54 crc kubenswrapper[4898]: > pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:54 crc kubenswrapper[4898]: E0313 14:01:54.998321 4898 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 13 14:01:54 crc kubenswrapper[4898]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-6757584b5b-nct75_openshift-authentication_3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3_0(297c85adf50243fe2bc11220e35969e23bd77b61111951b7f7ec1477e5ad2fe7): error adding pod openshift-authentication_oauth-openshift-6757584b5b-nct75 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"297c85adf50243fe2bc11220e35969e23bd77b61111951b7f7ec1477e5ad2fe7" Netns:"/var/run/netns/812cd27e-a312-4622-b089-bb187d413335" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-6757584b5b-nct75;K8S_POD_INFRA_CONTAINER_ID=297c85adf50243fe2bc11220e35969e23bd77b61111951b7f7ec1477e5ad2fe7;K8S_POD_UID=3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-6757584b5b-nct75] networking: Multus: [openshift-authentication/oauth-openshift-6757584b5b-nct75/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-6757584b5b-nct75 in out of cluster comm: pod "oauth-openshift-6757584b5b-nct75" not found Mar 13 14:01:54 crc 
kubenswrapper[4898]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 14:01:54 crc kubenswrapper[4898]: > pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:54 crc kubenswrapper[4898]: E0313 14:01:54.998381 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"oauth-openshift-6757584b5b-nct75_openshift-authentication(3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"oauth-openshift-6757584b5b-nct75_openshift-authentication(3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-6757584b5b-nct75_openshift-authentication_3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3_0(297c85adf50243fe2bc11220e35969e23bd77b61111951b7f7ec1477e5ad2fe7): error adding pod openshift-authentication_oauth-openshift-6757584b5b-nct75 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"297c85adf50243fe2bc11220e35969e23bd77b61111951b7f7ec1477e5ad2fe7\\\" Netns:\\\"/var/run/netns/812cd27e-a312-4622-b089-bb187d413335\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-6757584b5b-nct75;K8S_POD_INFRA_CONTAINER_ID=297c85adf50243fe2bc11220e35969e23bd77b61111951b7f7ec1477e5ad2fe7;K8S_POD_UID=3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-6757584b5b-nct75] networking: Multus: 
[openshift-authentication/oauth-openshift-6757584b5b-nct75/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-6757584b5b-nct75 in out of cluster comm: pod \\\"oauth-openshift-6757584b5b-nct75\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" podUID="3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3" Mar 13 14:01:55 crc kubenswrapper[4898]: I0313 14:01:55.063592 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 13 14:01:55 crc kubenswrapper[4898]: I0313 14:01:55.150130 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 13 14:01:55 crc kubenswrapper[4898]: I0313 14:01:55.185161 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 13 14:01:55 crc kubenswrapper[4898]: I0313 14:01:55.268515 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 13 14:01:55 crc kubenswrapper[4898]: I0313 14:01:55.297494 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 13 14:01:55 crc kubenswrapper[4898]: I0313 14:01:55.321849 4898 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 13 14:01:55 crc kubenswrapper[4898]: I0313 14:01:55.342657 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 13 14:01:55 crc kubenswrapper[4898]: I0313 14:01:55.454113 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 13 14:01:55 crc kubenswrapper[4898]: I0313 14:01:55.500241 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 13 14:01:55 crc kubenswrapper[4898]: I0313 14:01:55.537517 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 13 14:01:55 crc kubenswrapper[4898]: I0313 14:01:55.671927 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 13 14:01:55 crc kubenswrapper[4898]: I0313 14:01:55.731175 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 13 14:01:55 crc kubenswrapper[4898]: I0313 14:01:55.773942 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 13 14:01:55 crc kubenswrapper[4898]: I0313 14:01:55.788156 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 13 14:01:55 crc kubenswrapper[4898]: I0313 14:01:55.805197 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 13 14:01:55 crc kubenswrapper[4898]: I0313 14:01:55.865663 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 13 14:01:55 crc kubenswrapper[4898]: I0313 14:01:55.940504 4898 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:55 crc kubenswrapper[4898]: I0313 14:01:55.941073 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:56 crc kubenswrapper[4898]: I0313 14:01:56.088306 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 13 14:01:56 crc kubenswrapper[4898]: I0313 14:01:56.164133 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 13 14:01:56 crc kubenswrapper[4898]: I0313 14:01:56.195501 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 14:01:56 crc kubenswrapper[4898]: I0313 14:01:56.252855 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 13 14:01:56 crc kubenswrapper[4898]: I0313 14:01:56.300600 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 13 14:01:56 crc kubenswrapper[4898]: I0313 14:01:56.432360 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 13 14:01:56 crc kubenswrapper[4898]: I0313 14:01:56.474698 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 13 14:01:56 crc kubenswrapper[4898]: I0313 14:01:56.510105 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 13 14:01:56 crc kubenswrapper[4898]: I0313 14:01:56.556495 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 13 14:01:56 crc 
kubenswrapper[4898]: I0313 14:01:56.677670 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 13 14:01:56 crc kubenswrapper[4898]: I0313 14:01:56.759199 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 13 14:01:56 crc kubenswrapper[4898]: I0313 14:01:56.807574 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 13 14:01:56 crc kubenswrapper[4898]: I0313 14:01:56.902999 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 14:01:56 crc kubenswrapper[4898]: I0313 14:01:56.912227 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 13 14:01:57 crc kubenswrapper[4898]: I0313 14:01:57.102462 4898 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 13 14:01:57 crc kubenswrapper[4898]: I0313 14:01:57.165708 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 13 14:01:57 crc kubenswrapper[4898]: I0313 14:01:57.327931 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 13 14:01:57 crc kubenswrapper[4898]: I0313 14:01:57.501315 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 13 14:01:57 crc kubenswrapper[4898]: I0313 14:01:57.595126 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 13 14:01:57 crc kubenswrapper[4898]: I0313 14:01:57.620699 4898 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 13 14:01:57 crc kubenswrapper[4898]: I0313 14:01:57.716061 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 13 14:01:57 crc kubenswrapper[4898]: I0313 14:01:57.724082 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 13 14:01:57 crc kubenswrapper[4898]: I0313 14:01:57.743233 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 13 14:01:57 crc kubenswrapper[4898]: I0313 14:01:57.781840 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 13 14:01:57 crc kubenswrapper[4898]: I0313 14:01:57.858287 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 13 14:01:57 crc kubenswrapper[4898]: I0313 14:01:57.878082 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 13 14:01:57 crc kubenswrapper[4898]: I0313 14:01:57.917827 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 13 14:01:57 crc kubenswrapper[4898]: I0313 14:01:57.942682 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 13 14:01:57 crc kubenswrapper[4898]: I0313 14:01:57.956767 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 13 14:01:57 crc kubenswrapper[4898]: I0313 14:01:57.997633 4898 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 13 14:01:58 crc kubenswrapper[4898]: I0313 14:01:58.022179 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 13 14:01:58 crc kubenswrapper[4898]: I0313 14:01:58.037178 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 13 14:01:58 crc kubenswrapper[4898]: I0313 14:01:58.072705 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 13 14:01:58 crc kubenswrapper[4898]: I0313 14:01:58.184206 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 13 14:01:58 crc kubenswrapper[4898]: I0313 14:01:58.193854 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 13 14:01:58 crc kubenswrapper[4898]: I0313 14:01:58.225632 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 13 14:01:58 crc kubenswrapper[4898]: I0313 14:01:58.471167 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 13 14:01:58 crc kubenswrapper[4898]: I0313 14:01:58.560431 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 13 14:01:58 crc kubenswrapper[4898]: I0313 14:01:58.577492 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 13 14:01:58 crc kubenswrapper[4898]: I0313 14:01:58.619849 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" 
Mar 13 14:01:58 crc kubenswrapper[4898]: I0313 14:01:58.674319 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 13 14:01:58 crc kubenswrapper[4898]: I0313 14:01:58.688293 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 13 14:01:58 crc kubenswrapper[4898]: I0313 14:01:58.695959 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 13 14:01:58 crc kubenswrapper[4898]: I0313 14:01:58.803384 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 13 14:01:58 crc kubenswrapper[4898]: I0313 14:01:58.825136 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 13 14:01:58 crc kubenswrapper[4898]: I0313 14:01:58.874838 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 13 14:01:58 crc kubenswrapper[4898]: I0313 14:01:58.896450 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 13 14:01:58 crc kubenswrapper[4898]: I0313 14:01:58.978652 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.051679 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.135103 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 13 14:01:59 crc kubenswrapper[4898]: E0313 14:01:59.156948 4898 log.go:32] 
"RunPodSandbox from runtime service failed" err=< Mar 13 14:01:59 crc kubenswrapper[4898]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-6757584b5b-nct75_openshift-authentication_3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3_0(22080dad93543fe092c690ed0f72da0b48f8ea429177758465538eb0ff5dbdc0): error adding pod openshift-authentication_oauth-openshift-6757584b5b-nct75 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"22080dad93543fe092c690ed0f72da0b48f8ea429177758465538eb0ff5dbdc0" Netns:"/var/run/netns/9761714d-3bdd-4b3c-ad53-09e2640d19f7" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-6757584b5b-nct75;K8S_POD_INFRA_CONTAINER_ID=22080dad93543fe092c690ed0f72da0b48f8ea429177758465538eb0ff5dbdc0;K8S_POD_UID=3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-6757584b5b-nct75] networking: Multus: [openshift-authentication/oauth-openshift-6757584b5b-nct75/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-6757584b5b-nct75 in out of cluster comm: pod "oauth-openshift-6757584b5b-nct75" not found Mar 13 14:01:59 crc kubenswrapper[4898]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 14:01:59 crc kubenswrapper[4898]: > Mar 13 14:01:59 crc kubenswrapper[4898]: E0313 14:01:59.157234 4898 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" 
err=< Mar 13 14:01:59 crc kubenswrapper[4898]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-6757584b5b-nct75_openshift-authentication_3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3_0(22080dad93543fe092c690ed0f72da0b48f8ea429177758465538eb0ff5dbdc0): error adding pod openshift-authentication_oauth-openshift-6757584b5b-nct75 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"22080dad93543fe092c690ed0f72da0b48f8ea429177758465538eb0ff5dbdc0" Netns:"/var/run/netns/9761714d-3bdd-4b3c-ad53-09e2640d19f7" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-6757584b5b-nct75;K8S_POD_INFRA_CONTAINER_ID=22080dad93543fe092c690ed0f72da0b48f8ea429177758465538eb0ff5dbdc0;K8S_POD_UID=3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-6757584b5b-nct75] networking: Multus: [openshift-authentication/oauth-openshift-6757584b5b-nct75/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-6757584b5b-nct75 in out of cluster comm: pod "oauth-openshift-6757584b5b-nct75" not found Mar 13 14:01:59 crc kubenswrapper[4898]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 14:01:59 crc kubenswrapper[4898]: > pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:59 crc kubenswrapper[4898]: E0313 14:01:59.157262 4898 kuberuntime_manager.go:1170] 
"CreatePodSandbox for pod failed" err=< Mar 13 14:01:59 crc kubenswrapper[4898]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-6757584b5b-nct75_openshift-authentication_3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3_0(22080dad93543fe092c690ed0f72da0b48f8ea429177758465538eb0ff5dbdc0): error adding pod openshift-authentication_oauth-openshift-6757584b5b-nct75 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"22080dad93543fe092c690ed0f72da0b48f8ea429177758465538eb0ff5dbdc0" Netns:"/var/run/netns/9761714d-3bdd-4b3c-ad53-09e2640d19f7" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-6757584b5b-nct75;K8S_POD_INFRA_CONTAINER_ID=22080dad93543fe092c690ed0f72da0b48f8ea429177758465538eb0ff5dbdc0;K8S_POD_UID=3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3" Path:"" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-6757584b5b-nct75] networking: Multus: [openshift-authentication/oauth-openshift-6757584b5b-nct75/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-6757584b5b-nct75 in out of cluster comm: pod "oauth-openshift-6757584b5b-nct75" not found Mar 13 14:01:59 crc kubenswrapper[4898]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 13 14:01:59 crc kubenswrapper[4898]: > pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:01:59 crc kubenswrapper[4898]: E0313 14:01:59.157321 4898 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"oauth-openshift-6757584b5b-nct75_openshift-authentication(3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"oauth-openshift-6757584b5b-nct75_openshift-authentication(3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_oauth-openshift-6757584b5b-nct75_openshift-authentication_3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3_0(22080dad93543fe092c690ed0f72da0b48f8ea429177758465538eb0ff5dbdc0): error adding pod openshift-authentication_oauth-openshift-6757584b5b-nct75 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"22080dad93543fe092c690ed0f72da0b48f8ea429177758465538eb0ff5dbdc0\\\" Netns:\\\"/var/run/netns/9761714d-3bdd-4b3c-ad53-09e2640d19f7\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-authentication;K8S_POD_NAME=oauth-openshift-6757584b5b-nct75;K8S_POD_INFRA_CONTAINER_ID=22080dad93543fe092c690ed0f72da0b48f8ea429177758465538eb0ff5dbdc0;K8S_POD_UID=3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-authentication/oauth-openshift-6757584b5b-nct75] networking: Multus: [openshift-authentication/oauth-openshift-6757584b5b-nct75/3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod oauth-openshift-6757584b5b-nct75 in out of cluster comm: pod \\\"oauth-openshift-6757584b5b-nct75\\\" not found\\n': StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" podUID="3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.182693 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.231473 4898 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.231529 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.231570 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.232121 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" 
containerStatusID={"Type":"cri-o","ID":"b65cd02a017de53599582fdc93495c1971ff933ecacdf6af0171bad6070bff66"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.232252 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://b65cd02a017de53599582fdc93495c1971ff933ecacdf6af0171bad6070bff66" gracePeriod=30 Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.272908 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.322067 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.341517 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.343775 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.391759 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.418045 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.431754 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.494566 
4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.523185 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.539325 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.683571 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.731316 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.762936 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.791947 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.879656 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.905194 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 13 14:01:59 crc kubenswrapper[4898]: I0313 14:01:59.912294 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.051186 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 13 
14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.054236 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.128795 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.132063 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.194009 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.223130 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.250638 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.260865 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.283168 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.305933 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.385224 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.486114 4898 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.543946 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.696616 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.740479 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.789993 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.815758 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.820097 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.830963 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.845512 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.903664 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.922716 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.924979 4898 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 13 14:02:00 crc kubenswrapper[4898]: I0313 14:02:00.954287 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.000439 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.016230 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.105960 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.116165 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.127966 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.222559 4898 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.252291 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.284482 4898 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.310779 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 
14:02:01.318503 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.326091 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.341393 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.383989 4898 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.384343 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://ec7277e9dfa9beb5985dfe9d6e8c02eb3980ab021d8a008b49c5b8550fbbbb3d" gracePeriod=5 Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.390368 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.540461 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.559643 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.569311 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.640802 4898 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.657443 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.710055 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.811964 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 13 14:02:01 crc kubenswrapper[4898]: I0313 14:02:01.878759 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 13 14:02:02 crc kubenswrapper[4898]: I0313 14:02:02.103282 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 14:02:02 crc kubenswrapper[4898]: I0313 14:02:02.171877 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 13 14:02:02 crc kubenswrapper[4898]: I0313 14:02:02.205693 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 13 14:02:02 crc kubenswrapper[4898]: I0313 14:02:02.208609 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 13 14:02:02 crc kubenswrapper[4898]: I0313 14:02:02.238471 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 13 14:02:02 crc kubenswrapper[4898]: I0313 14:02:02.251588 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" 
Mar 13 14:02:02 crc kubenswrapper[4898]: I0313 14:02:02.291526 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 13 14:02:02 crc kubenswrapper[4898]: I0313 14:02:02.379841 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 13 14:02:02 crc kubenswrapper[4898]: I0313 14:02:02.451285 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 13 14:02:02 crc kubenswrapper[4898]: I0313 14:02:02.543318 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 13 14:02:02 crc kubenswrapper[4898]: I0313 14:02:02.638400 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 13 14:02:02 crc kubenswrapper[4898]: I0313 14:02:02.689229 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 13 14:02:02 crc kubenswrapper[4898]: I0313 14:02:02.740251 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 13 14:02:02 crc kubenswrapper[4898]: I0313 14:02:02.758155 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 13 14:02:02 crc kubenswrapper[4898]: I0313 14:02:02.925416 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 13 14:02:02 crc kubenswrapper[4898]: I0313 14:02:02.974037 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 13 14:02:03 crc kubenswrapper[4898]: I0313 14:02:03.096848 4898 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-console"/"networking-console-plugin" Mar 13 14:02:03 crc kubenswrapper[4898]: I0313 14:02:03.163354 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 13 14:02:03 crc kubenswrapper[4898]: I0313 14:02:03.282214 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 13 14:02:03 crc kubenswrapper[4898]: I0313 14:02:03.482452 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 13 14:02:03 crc kubenswrapper[4898]: I0313 14:02:03.486850 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 13 14:02:03 crc kubenswrapper[4898]: I0313 14:02:03.749253 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 13 14:02:04 crc kubenswrapper[4898]: I0313 14:02:04.177120 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 13 14:02:04 crc kubenswrapper[4898]: I0313 14:02:04.278221 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 13 14:02:04 crc kubenswrapper[4898]: I0313 14:02:04.700727 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 13 14:02:04 crc kubenswrapper[4898]: I0313 14:02:04.819675 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 13 14:02:05 crc kubenswrapper[4898]: I0313 14:02:05.544393 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 13 14:02:06 crc kubenswrapper[4898]: I0313 
14:02:06.179960 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 13 14:02:06 crc kubenswrapper[4898]: I0313 14:02:06.509286 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 13 14:02:06 crc kubenswrapper[4898]: I0313 14:02:06.683288 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 13 14:02:06 crc kubenswrapper[4898]: I0313 14:02:06.970578 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 13 14:02:06 crc kubenswrapper[4898]: I0313 14:02:06.970665 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.000244 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.000561 4898 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="ec7277e9dfa9beb5985dfe9d6e8c02eb3980ab021d8a008b49c5b8550fbbbb3d" exitCode=137 Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.000613 4898 scope.go:117] "RemoveContainer" containerID="ec7277e9dfa9beb5985dfe9d6e8c02eb3980ab021d8a008b49c5b8550fbbbb3d" Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.000619 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.014515 4898 scope.go:117] "RemoveContainer" containerID="ec7277e9dfa9beb5985dfe9d6e8c02eb3980ab021d8a008b49c5b8550fbbbb3d" Mar 13 14:02:07 crc kubenswrapper[4898]: E0313 14:02:07.014889 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec7277e9dfa9beb5985dfe9d6e8c02eb3980ab021d8a008b49c5b8550fbbbb3d\": container with ID starting with ec7277e9dfa9beb5985dfe9d6e8c02eb3980ab021d8a008b49c5b8550fbbbb3d not found: ID does not exist" containerID="ec7277e9dfa9beb5985dfe9d6e8c02eb3980ab021d8a008b49c5b8550fbbbb3d" Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.014931 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec7277e9dfa9beb5985dfe9d6e8c02eb3980ab021d8a008b49c5b8550fbbbb3d"} err="failed to get container status \"ec7277e9dfa9beb5985dfe9d6e8c02eb3980ab021d8a008b49c5b8550fbbbb3d\": rpc error: code = NotFound desc = could not find container \"ec7277e9dfa9beb5985dfe9d6e8c02eb3980ab021d8a008b49c5b8550fbbbb3d\": container with ID starting with ec7277e9dfa9beb5985dfe9d6e8c02eb3980ab021d8a008b49c5b8550fbbbb3d not found: ID does not exist" Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.157646 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.157730 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.157797 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.157831 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.157854 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.157941 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.158004 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.158077 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.158084 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.158328 4898 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.158357 4898 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.158384 4898 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.158407 4898 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.168050 4898 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.259594 4898 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.262563 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.303640 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 13 14:02:07 crc kubenswrapper[4898]: I0313 14:02:07.746912 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 13 14:02:12 crc kubenswrapper[4898]: I0313 14:02:12.739439 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:02:12 crc kubenswrapper[4898]: I0313 14:02:12.740476 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:02:13 crc kubenswrapper[4898]: I0313 14:02:13.015787 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6757584b5b-nct75"] Mar 13 14:02:13 crc kubenswrapper[4898]: I0313 14:02:13.037060 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" event={"ID":"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3","Type":"ContainerStarted","Data":"d0641b0799632094c4d4cd21e5e97b5b85d25b38c9ca90698da38dd1d008c2ad"} Mar 13 14:02:14 crc kubenswrapper[4898]: I0313 14:02:14.046781 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" event={"ID":"3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3","Type":"ContainerStarted","Data":"2d8bff9d69ce0d19fe0744cd41bbe73c255e4ee58f95dce997a9b7c3ed1203e1"} Mar 13 14:02:14 crc kubenswrapper[4898]: I0313 14:02:14.047257 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:02:14 crc kubenswrapper[4898]: I0313 14:02:14.055420 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" Mar 13 14:02:14 crc kubenswrapper[4898]: I0313 14:02:14.079448 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" podStartSLOduration=76.079429502 podStartE2EDuration="1m16.079429502s" podCreationTimestamp="2026-03-13 14:00:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:02:14.07517648 +0000 UTC m=+369.076764819" watchObservedRunningTime="2026-03-13 14:02:14.079429502 +0000 UTC m=+369.081017761" Mar 13 14:02:27 crc kubenswrapper[4898]: I0313 14:02:27.145966 4898 
generic.go:334] "Generic (PLEG): container finished" podID="0a78868f-1786-430d-8df8-18bb1c2019b3" containerID="66a8b7ab3a08de395e71354315a06c65dae8eb185d93ccdee2c35ad093ab2e67" exitCode=0 Mar 13 14:02:27 crc kubenswrapper[4898]: I0313 14:02:27.146089 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" event={"ID":"0a78868f-1786-430d-8df8-18bb1c2019b3","Type":"ContainerDied","Data":"66a8b7ab3a08de395e71354315a06c65dae8eb185d93ccdee2c35ad093ab2e67"} Mar 13 14:02:27 crc kubenswrapper[4898]: I0313 14:02:27.147061 4898 scope.go:117] "RemoveContainer" containerID="66a8b7ab3a08de395e71354315a06c65dae8eb185d93ccdee2c35ad093ab2e67" Mar 13 14:02:28 crc kubenswrapper[4898]: I0313 14:02:28.153330 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" event={"ID":"0a78868f-1786-430d-8df8-18bb1c2019b3","Type":"ContainerStarted","Data":"fee68a52f02b5d009748f7d22e4efe8b71e5cbe2b4ec6b512eae62294cde6d24"} Mar 13 14:02:28 crc kubenswrapper[4898]: I0313 14:02:28.153936 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" Mar 13 14:02:28 crc kubenswrapper[4898]: I0313 14:02:28.161011 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" Mar 13 14:02:30 crc kubenswrapper[4898]: I0313 14:02:30.173623 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 13 14:02:30 crc kubenswrapper[4898]: I0313 14:02:30.175751 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 13 14:02:30 crc kubenswrapper[4898]: I0313 
14:02:30.177249 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 13 14:02:30 crc kubenswrapper[4898]: I0313 14:02:30.177329 4898 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="b65cd02a017de53599582fdc93495c1971ff933ecacdf6af0171bad6070bff66" exitCode=137 Mar 13 14:02:30 crc kubenswrapper[4898]: I0313 14:02:30.177387 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"b65cd02a017de53599582fdc93495c1971ff933ecacdf6af0171bad6070bff66"} Mar 13 14:02:30 crc kubenswrapper[4898]: I0313 14:02:30.177445 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f3f7856b47908c691e419ef35da06276497b17de1d07f34af58226c123a50471"} Mar 13 14:02:30 crc kubenswrapper[4898]: I0313 14:02:30.177475 4898 scope.go:117] "RemoveContainer" containerID="172c20ea4299e1b913b2675ac14f7ec3a4c80a0f6f2428af0d030aff64eb89ec" Mar 13 14:02:31 crc kubenswrapper[4898]: I0313 14:02:31.185117 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 13 14:02:31 crc kubenswrapper[4898]: I0313 14:02:31.185953 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 13 14:02:38 crc kubenswrapper[4898]: I0313 14:02:38.347546 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 14:02:39 crc kubenswrapper[4898]: I0313 14:02:39.232946 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 14:02:39 crc kubenswrapper[4898]: I0313 14:02:39.243146 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 14:02:40 crc kubenswrapper[4898]: I0313 14:02:40.256761 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 14:02:47 crc kubenswrapper[4898]: I0313 14:02:47.578402 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556842-7h9s5"] Mar 13 14:02:47 crc kubenswrapper[4898]: E0313 14:02:47.578984 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 13 14:02:47 crc kubenswrapper[4898]: I0313 14:02:47.578995 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 13 14:02:47 crc kubenswrapper[4898]: I0313 14:02:47.579083 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 13 14:02:47 crc kubenswrapper[4898]: I0313 14:02:47.579417 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556842-7h9s5" Mar 13 14:02:47 crc kubenswrapper[4898]: I0313 14:02:47.581891 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:02:47 crc kubenswrapper[4898]: I0313 14:02:47.582315 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:02:47 crc kubenswrapper[4898]: I0313 14:02:47.582352 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:02:47 crc kubenswrapper[4898]: I0313 14:02:47.605605 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556842-7h9s5"] Mar 13 14:02:47 crc kubenswrapper[4898]: I0313 14:02:47.640588 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w"] Mar 13 14:02:47 crc kubenswrapper[4898]: I0313 14:02:47.640877 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w" podUID="0561a31b-c67c-4410-8845-d47e4533be0a" containerName="route-controller-manager" containerID="cri-o://11185c5dd36acbad20bedb23b2365654ca410b78e2e18b38a6ef92209812ef97" gracePeriod=30 Mar 13 14:02:47 crc kubenswrapper[4898]: I0313 14:02:47.643548 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf"] Mar 13 14:02:47 crc kubenswrapper[4898]: I0313 14:02:47.643751 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf" podUID="7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf" containerName="controller-manager" containerID="cri-o://1c8b4c66a938ec22ac3bf1cf597a602d6bd12a58cd3600512849d5a7fae14989" gracePeriod=30 Mar 13 14:02:47 crc 
kubenswrapper[4898]: I0313 14:02:47.756701 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkkd2\" (UniqueName: \"kubernetes.io/projected/8a9b9a59-64ad-4602-88da-91583ec126dc-kube-api-access-hkkd2\") pod \"auto-csr-approver-29556842-7h9s5\" (UID: \"8a9b9a59-64ad-4602-88da-91583ec126dc\") " pod="openshift-infra/auto-csr-approver-29556842-7h9s5" Mar 13 14:02:47 crc kubenswrapper[4898]: I0313 14:02:47.857604 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkkd2\" (UniqueName: \"kubernetes.io/projected/8a9b9a59-64ad-4602-88da-91583ec126dc-kube-api-access-hkkd2\") pod \"auto-csr-approver-29556842-7h9s5\" (UID: \"8a9b9a59-64ad-4602-88da-91583ec126dc\") " pod="openshift-infra/auto-csr-approver-29556842-7h9s5" Mar 13 14:02:47 crc kubenswrapper[4898]: I0313 14:02:47.879634 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkkd2\" (UniqueName: \"kubernetes.io/projected/8a9b9a59-64ad-4602-88da-91583ec126dc-kube-api-access-hkkd2\") pod \"auto-csr-approver-29556842-7h9s5\" (UID: \"8a9b9a59-64ad-4602-88da-91583ec126dc\") " pod="openshift-infra/auto-csr-approver-29556842-7h9s5" Mar 13 14:02:47 crc kubenswrapper[4898]: I0313 14:02:47.895880 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556842-7h9s5" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.109082 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.171834 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.261058 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0561a31b-c67c-4410-8845-d47e4533be0a-config\") pod \"0561a31b-c67c-4410-8845-d47e4533be0a\" (UID: \"0561a31b-c67c-4410-8845-d47e4533be0a\") " Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.261105 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0561a31b-c67c-4410-8845-d47e4533be0a-client-ca\") pod \"0561a31b-c67c-4410-8845-d47e4533be0a\" (UID: \"0561a31b-c67c-4410-8845-d47e4533be0a\") " Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.261148 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0561a31b-c67c-4410-8845-d47e4533be0a-serving-cert\") pod \"0561a31b-c67c-4410-8845-d47e4533be0a\" (UID: \"0561a31b-c67c-4410-8845-d47e4533be0a\") " Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.261191 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6mvk\" (UniqueName: \"kubernetes.io/projected/0561a31b-c67c-4410-8845-d47e4533be0a-kube-api-access-h6mvk\") pod \"0561a31b-c67c-4410-8845-d47e4533be0a\" (UID: \"0561a31b-c67c-4410-8845-d47e4533be0a\") " Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.261835 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0561a31b-c67c-4410-8845-d47e4533be0a-client-ca" (OuterVolumeSpecName: "client-ca") pod "0561a31b-c67c-4410-8845-d47e4533be0a" (UID: "0561a31b-c67c-4410-8845-d47e4533be0a"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.262008 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0561a31b-c67c-4410-8845-d47e4533be0a-config" (OuterVolumeSpecName: "config") pod "0561a31b-c67c-4410-8845-d47e4533be0a" (UID: "0561a31b-c67c-4410-8845-d47e4533be0a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.265566 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0561a31b-c67c-4410-8845-d47e4533be0a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0561a31b-c67c-4410-8845-d47e4533be0a" (UID: "0561a31b-c67c-4410-8845-d47e4533be0a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.265586 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0561a31b-c67c-4410-8845-d47e4533be0a-kube-api-access-h6mvk" (OuterVolumeSpecName: "kube-api-access-h6mvk") pod "0561a31b-c67c-4410-8845-d47e4533be0a" (UID: "0561a31b-c67c-4410-8845-d47e4533be0a"). InnerVolumeSpecName "kube-api-access-h6mvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.297008 4898 generic.go:334] "Generic (PLEG): container finished" podID="0561a31b-c67c-4410-8845-d47e4533be0a" containerID="11185c5dd36acbad20bedb23b2365654ca410b78e2e18b38a6ef92209812ef97" exitCode=0 Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.297090 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.297091 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w" event={"ID":"0561a31b-c67c-4410-8845-d47e4533be0a","Type":"ContainerDied","Data":"11185c5dd36acbad20bedb23b2365654ca410b78e2e18b38a6ef92209812ef97"} Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.297209 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w" event={"ID":"0561a31b-c67c-4410-8845-d47e4533be0a","Type":"ContainerDied","Data":"3f35c6846af9efc90831b47db53412c7613bf81bc4a1bb77055f1c9c4645cef8"} Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.297233 4898 scope.go:117] "RemoveContainer" containerID="11185c5dd36acbad20bedb23b2365654ca410b78e2e18b38a6ef92209812ef97" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.298356 4898 generic.go:334] "Generic (PLEG): container finished" podID="7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf" containerID="1c8b4c66a938ec22ac3bf1cf597a602d6bd12a58cd3600512849d5a7fae14989" exitCode=0 Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.298376 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf" event={"ID":"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf","Type":"ContainerDied","Data":"1c8b4c66a938ec22ac3bf1cf597a602d6bd12a58cd3600512849d5a7fae14989"} Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.298432 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf" event={"ID":"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf","Type":"ContainerDied","Data":"ba22e3087795c8be037e21c8d47195aba889300c63f1adc91b0790f8732ad1d9"} Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.298472 4898 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.314620 4898 scope.go:117] "RemoveContainer" containerID="11185c5dd36acbad20bedb23b2365654ca410b78e2e18b38a6ef92209812ef97" Mar 13 14:02:48 crc kubenswrapper[4898]: E0313 14:02:48.314979 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11185c5dd36acbad20bedb23b2365654ca410b78e2e18b38a6ef92209812ef97\": container with ID starting with 11185c5dd36acbad20bedb23b2365654ca410b78e2e18b38a6ef92209812ef97 not found: ID does not exist" containerID="11185c5dd36acbad20bedb23b2365654ca410b78e2e18b38a6ef92209812ef97" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.315009 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11185c5dd36acbad20bedb23b2365654ca410b78e2e18b38a6ef92209812ef97"} err="failed to get container status \"11185c5dd36acbad20bedb23b2365654ca410b78e2e18b38a6ef92209812ef97\": rpc error: code = NotFound desc = could not find container \"11185c5dd36acbad20bedb23b2365654ca410b78e2e18b38a6ef92209812ef97\": container with ID starting with 11185c5dd36acbad20bedb23b2365654ca410b78e2e18b38a6ef92209812ef97 not found: ID does not exist" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.315029 4898 scope.go:117] "RemoveContainer" containerID="1c8b4c66a938ec22ac3bf1cf597a602d6bd12a58cd3600512849d5a7fae14989" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.322139 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w"] Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.334173 4898 scope.go:117] "RemoveContainer" containerID="1c8b4c66a938ec22ac3bf1cf597a602d6bd12a58cd3600512849d5a7fae14989" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.334479 4898 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76556fcf77-lhw2w"] Mar 13 14:02:48 crc kubenswrapper[4898]: E0313 14:02:48.334805 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c8b4c66a938ec22ac3bf1cf597a602d6bd12a58cd3600512849d5a7fae14989\": container with ID starting with 1c8b4c66a938ec22ac3bf1cf597a602d6bd12a58cd3600512849d5a7fae14989 not found: ID does not exist" containerID="1c8b4c66a938ec22ac3bf1cf597a602d6bd12a58cd3600512849d5a7fae14989" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.334849 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c8b4c66a938ec22ac3bf1cf597a602d6bd12a58cd3600512849d5a7fae14989"} err="failed to get container status \"1c8b4c66a938ec22ac3bf1cf597a602d6bd12a58cd3600512849d5a7fae14989\": rpc error: code = NotFound desc = could not find container \"1c8b4c66a938ec22ac3bf1cf597a602d6bd12a58cd3600512849d5a7fae14989\": container with ID starting with 1c8b4c66a938ec22ac3bf1cf597a602d6bd12a58cd3600512849d5a7fae14989 not found: ID does not exist" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.348675 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556842-7h9s5"] Mar 13 14:02:48 crc kubenswrapper[4898]: W0313 14:02:48.354091 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a9b9a59_64ad_4602_88da_91583ec126dc.slice/crio-4d2585911a8b2947eab5790240e88ed8015c03581cb3080f1739046921122426 WatchSource:0}: Error finding container 4d2585911a8b2947eab5790240e88ed8015c03581cb3080f1739046921122426: Status 404 returned error can't find the container with id 4d2585911a8b2947eab5790240e88ed8015c03581cb3080f1739046921122426 Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.363524 4898 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-client-ca\") pod \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\" (UID: \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\") " Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.364458 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-client-ca" (OuterVolumeSpecName: "client-ca") pod "7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf" (UID: "7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.364470 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-proxy-ca-bundles\") pod \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\" (UID: \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\") " Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.364574 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-serving-cert\") pod \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\" (UID: \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\") " Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.364634 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9h6km\" (UniqueName: \"kubernetes.io/projected/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-kube-api-access-9h6km\") pod \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\" (UID: \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\") " Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.364682 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-config\") pod \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\" (UID: \"7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf\") " Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.365289 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0561a31b-c67c-4410-8845-d47e4533be0a-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.365317 4898 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0561a31b-c67c-4410-8845-d47e4533be0a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.365328 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0561a31b-c67c-4410-8845-d47e4533be0a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.365340 4898 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.365351 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6mvk\" (UniqueName: \"kubernetes.io/projected/0561a31b-c67c-4410-8845-d47e4533be0a-kube-api-access-h6mvk\") on node \"crc\" DevicePath \"\"" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.365396 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf" (UID: "7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.365465 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-config" (OuterVolumeSpecName: "config") pod "7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf" (UID: "7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.368216 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-kube-api-access-9h6km" (OuterVolumeSpecName: "kube-api-access-9h6km") pod "7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf" (UID: "7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf"). InnerVolumeSpecName "kube-api-access-9h6km". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.368565 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf" (UID: "7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.466546 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9h6km\" (UniqueName: \"kubernetes.io/projected/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-kube-api-access-9h6km\") on node \"crc\" DevicePath \"\"" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.466572 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.466581 4898 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.466590 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.625049 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf"] Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.630849 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-db7fb4ff9-qlmhf"] Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.900003 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm"] Mar 13 14:02:48 crc kubenswrapper[4898]: E0313 14:02:48.900589 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0561a31b-c67c-4410-8845-d47e4533be0a" containerName="route-controller-manager" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.900666 4898 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="0561a31b-c67c-4410-8845-d47e4533be0a" containerName="route-controller-manager" Mar 13 14:02:48 crc kubenswrapper[4898]: E0313 14:02:48.900747 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf" containerName="controller-manager" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.900803 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf" containerName="controller-manager" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.900997 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0561a31b-c67c-4410-8845-d47e4533be0a" containerName="route-controller-manager" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.901130 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf" containerName="controller-manager" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.901671 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.903252 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-d6f97d578-d974n"] Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.903470 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.903687 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.904001 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.904153 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.908578 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.908782 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.908805 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.908632 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.909501 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.910034 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.910301 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.910522 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.911116 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 14:02:48 crc 
kubenswrapper[4898]: I0313 14:02:48.913595 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-d6f97d578-d974n"] Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.918422 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm"] Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.948448 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.971358 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/278b669f-19b0-49e8-9a35-d583ac818d86-serving-cert\") pod \"route-controller-manager-76946b564d-4m5gm\" (UID: \"278b669f-19b0-49e8-9a35-d583ac818d86\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.971409 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/278b669f-19b0-49e8-9a35-d583ac818d86-config\") pod \"route-controller-manager-76946b564d-4m5gm\" (UID: \"278b669f-19b0-49e8-9a35-d583ac818d86\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.971501 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f9de415-59e6-40a1-8392-827c617e2ce8-client-ca\") pod \"controller-manager-d6f97d578-d974n\" (UID: \"3f9de415-59e6-40a1-8392-827c617e2ce8\") " pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.971570 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24rjh\" (UniqueName: \"kubernetes.io/projected/278b669f-19b0-49e8-9a35-d583ac818d86-kube-api-access-24rjh\") pod \"route-controller-manager-76946b564d-4m5gm\" (UID: \"278b669f-19b0-49e8-9a35-d583ac818d86\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.971600 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f9de415-59e6-40a1-8392-827c617e2ce8-serving-cert\") pod \"controller-manager-d6f97d578-d974n\" (UID: \"3f9de415-59e6-40a1-8392-827c617e2ce8\") " pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.971628 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f9de415-59e6-40a1-8392-827c617e2ce8-config\") pod \"controller-manager-d6f97d578-d974n\" (UID: \"3f9de415-59e6-40a1-8392-827c617e2ce8\") " pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.971728 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkj5d\" (UniqueName: \"kubernetes.io/projected/3f9de415-59e6-40a1-8392-827c617e2ce8-kube-api-access-zkj5d\") pod \"controller-manager-d6f97d578-d974n\" (UID: \"3f9de415-59e6-40a1-8392-827c617e2ce8\") " pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.971785 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/278b669f-19b0-49e8-9a35-d583ac818d86-client-ca\") pod 
\"route-controller-manager-76946b564d-4m5gm\" (UID: \"278b669f-19b0-49e8-9a35-d583ac818d86\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" Mar 13 14:02:48 crc kubenswrapper[4898]: I0313 14:02:48.971815 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3f9de415-59e6-40a1-8392-827c617e2ce8-proxy-ca-bundles\") pod \"controller-manager-d6f97d578-d974n\" (UID: \"3f9de415-59e6-40a1-8392-827c617e2ce8\") " pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.072809 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24rjh\" (UniqueName: \"kubernetes.io/projected/278b669f-19b0-49e8-9a35-d583ac818d86-kube-api-access-24rjh\") pod \"route-controller-manager-76946b564d-4m5gm\" (UID: \"278b669f-19b0-49e8-9a35-d583ac818d86\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.072860 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f9de415-59e6-40a1-8392-827c617e2ce8-serving-cert\") pod \"controller-manager-d6f97d578-d974n\" (UID: \"3f9de415-59e6-40a1-8392-827c617e2ce8\") " pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.072888 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f9de415-59e6-40a1-8392-827c617e2ce8-config\") pod \"controller-manager-d6f97d578-d974n\" (UID: \"3f9de415-59e6-40a1-8392-827c617e2ce8\") " pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.072930 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zkj5d\" (UniqueName: \"kubernetes.io/projected/3f9de415-59e6-40a1-8392-827c617e2ce8-kube-api-access-zkj5d\") pod \"controller-manager-d6f97d578-d974n\" (UID: \"3f9de415-59e6-40a1-8392-827c617e2ce8\") " pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.072954 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/278b669f-19b0-49e8-9a35-d583ac818d86-client-ca\") pod \"route-controller-manager-76946b564d-4m5gm\" (UID: \"278b669f-19b0-49e8-9a35-d583ac818d86\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.072995 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3f9de415-59e6-40a1-8392-827c617e2ce8-proxy-ca-bundles\") pod \"controller-manager-d6f97d578-d974n\" (UID: \"3f9de415-59e6-40a1-8392-827c617e2ce8\") " pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.073026 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/278b669f-19b0-49e8-9a35-d583ac818d86-serving-cert\") pod \"route-controller-manager-76946b564d-4m5gm\" (UID: \"278b669f-19b0-49e8-9a35-d583ac818d86\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.073043 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/278b669f-19b0-49e8-9a35-d583ac818d86-config\") pod \"route-controller-manager-76946b564d-4m5gm\" (UID: \"278b669f-19b0-49e8-9a35-d583ac818d86\") " 
pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.073070 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f9de415-59e6-40a1-8392-827c617e2ce8-client-ca\") pod \"controller-manager-d6f97d578-d974n\" (UID: \"3f9de415-59e6-40a1-8392-827c617e2ce8\") " pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.074397 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/278b669f-19b0-49e8-9a35-d583ac818d86-client-ca\") pod \"route-controller-manager-76946b564d-4m5gm\" (UID: \"278b669f-19b0-49e8-9a35-d583ac818d86\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.074883 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/278b669f-19b0-49e8-9a35-d583ac818d86-config\") pod \"route-controller-manager-76946b564d-4m5gm\" (UID: \"278b669f-19b0-49e8-9a35-d583ac818d86\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.075250 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f9de415-59e6-40a1-8392-827c617e2ce8-config\") pod \"controller-manager-d6f97d578-d974n\" (UID: \"3f9de415-59e6-40a1-8392-827c617e2ce8\") " pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.075986 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f9de415-59e6-40a1-8392-827c617e2ce8-client-ca\") pod 
\"controller-manager-d6f97d578-d974n\" (UID: \"3f9de415-59e6-40a1-8392-827c617e2ce8\") " pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.076009 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3f9de415-59e6-40a1-8392-827c617e2ce8-proxy-ca-bundles\") pod \"controller-manager-d6f97d578-d974n\" (UID: \"3f9de415-59e6-40a1-8392-827c617e2ce8\") " pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.078529 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/278b669f-19b0-49e8-9a35-d583ac818d86-serving-cert\") pod \"route-controller-manager-76946b564d-4m5gm\" (UID: \"278b669f-19b0-49e8-9a35-d583ac818d86\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.078725 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f9de415-59e6-40a1-8392-827c617e2ce8-serving-cert\") pod \"controller-manager-d6f97d578-d974n\" (UID: \"3f9de415-59e6-40a1-8392-827c617e2ce8\") " pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.088137 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24rjh\" (UniqueName: \"kubernetes.io/projected/278b669f-19b0-49e8-9a35-d583ac818d86-kube-api-access-24rjh\") pod \"route-controller-manager-76946b564d-4m5gm\" (UID: \"278b669f-19b0-49e8-9a35-d583ac818d86\") " pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.103856 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-zkj5d\" (UniqueName: \"kubernetes.io/projected/3f9de415-59e6-40a1-8392-827c617e2ce8-kube-api-access-zkj5d\") pod \"controller-manager-d6f97d578-d974n\" (UID: \"3f9de415-59e6-40a1-8392-827c617e2ce8\") " pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.261589 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.271001 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.309139 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556842-7h9s5" event={"ID":"8a9b9a59-64ad-4602-88da-91583ec126dc","Type":"ContainerStarted","Data":"4d2585911a8b2947eab5790240e88ed8015c03581cb3080f1739046921122426"} Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.683019 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-d6f97d578-d974n"] Mar 13 14:02:49 crc kubenswrapper[4898]: W0313 14:02:49.684866 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f9de415_59e6_40a1_8392_827c617e2ce8.slice/crio-48528b351d9c8fb7ddf3057320b84878a31c1815533952151ea60cfed48b3a84 WatchSource:0}: Error finding container 48528b351d9c8fb7ddf3057320b84878a31c1815533952151ea60cfed48b3a84: Status 404 returned error can't find the container with id 48528b351d9c8fb7ddf3057320b84878a31c1815533952151ea60cfed48b3a84 Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.747124 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0561a31b-c67c-4410-8845-d47e4533be0a" 
path="/var/lib/kubelet/pods/0561a31b-c67c-4410-8845-d47e4533be0a/volumes" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.747997 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf" path="/var/lib/kubelet/pods/7a72588b-5ed5-4ab7-bbe0-c0b6e08eedbf/volumes" Mar 13 14:02:49 crc kubenswrapper[4898]: I0313 14:02:49.748528 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm"] Mar 13 14:02:50 crc kubenswrapper[4898]: I0313 14:02:50.315378 4898 generic.go:334] "Generic (PLEG): container finished" podID="8a9b9a59-64ad-4602-88da-91583ec126dc" containerID="529194ce4ac0e19d00515e6fc6f6984803e8a03afdee3263ba4e434f1a13a57b" exitCode=0 Mar 13 14:02:50 crc kubenswrapper[4898]: I0313 14:02:50.315711 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556842-7h9s5" event={"ID":"8a9b9a59-64ad-4602-88da-91583ec126dc","Type":"ContainerDied","Data":"529194ce4ac0e19d00515e6fc6f6984803e8a03afdee3263ba4e434f1a13a57b"} Mar 13 14:02:50 crc kubenswrapper[4898]: I0313 14:02:50.317058 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" event={"ID":"3f9de415-59e6-40a1-8392-827c617e2ce8","Type":"ContainerStarted","Data":"1fd3951d1aff7bf70501c814cdd5160a62e5847ba31d11cf12e32e1f23b242e1"} Mar 13 14:02:50 crc kubenswrapper[4898]: I0313 14:02:50.317093 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" event={"ID":"3f9de415-59e6-40a1-8392-827c617e2ce8","Type":"ContainerStarted","Data":"48528b351d9c8fb7ddf3057320b84878a31c1815533952151ea60cfed48b3a84"} Mar 13 14:02:50 crc kubenswrapper[4898]: I0313 14:02:50.317797 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" Mar 13 
14:02:50 crc kubenswrapper[4898]: I0313 14:02:50.318856 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" event={"ID":"278b669f-19b0-49e8-9a35-d583ac818d86","Type":"ContainerStarted","Data":"4088ee8afc7274cc475dfb2367bd50ba012954a19435c83d57f81159d69220a9"} Mar 13 14:02:50 crc kubenswrapper[4898]: I0313 14:02:50.318876 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" event={"ID":"278b669f-19b0-49e8-9a35-d583ac818d86","Type":"ContainerStarted","Data":"92accbb2ec5b0ce2cc04a20381f279e22f42b2968b782d019c7ef8631795f69d"} Mar 13 14:02:50 crc kubenswrapper[4898]: I0313 14:02:50.319299 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" Mar 13 14:02:50 crc kubenswrapper[4898]: I0313 14:02:50.322033 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" Mar 13 14:02:50 crc kubenswrapper[4898]: I0313 14:02:50.347327 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" podStartSLOduration=3.347307555 podStartE2EDuration="3.347307555s" podCreationTimestamp="2026-03-13 14:02:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:02:50.343141141 +0000 UTC m=+405.344729400" watchObservedRunningTime="2026-03-13 14:02:50.347307555 +0000 UTC m=+405.348895794" Mar 13 14:02:50 crc kubenswrapper[4898]: I0313 14:02:50.370311 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" podStartSLOduration=3.370294993 podStartE2EDuration="3.370294993s" 
podCreationTimestamp="2026-03-13 14:02:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:02:50.368667762 +0000 UTC m=+405.370256011" watchObservedRunningTime="2026-03-13 14:02:50.370294993 +0000 UTC m=+405.371883232" Mar 13 14:02:50 crc kubenswrapper[4898]: I0313 14:02:50.561796 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" Mar 13 14:02:51 crc kubenswrapper[4898]: I0313 14:02:51.635140 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556842-7h9s5" Mar 13 14:02:51 crc kubenswrapper[4898]: I0313 14:02:51.806169 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkkd2\" (UniqueName: \"kubernetes.io/projected/8a9b9a59-64ad-4602-88da-91583ec126dc-kube-api-access-hkkd2\") pod \"8a9b9a59-64ad-4602-88da-91583ec126dc\" (UID: \"8a9b9a59-64ad-4602-88da-91583ec126dc\") " Mar 13 14:02:51 crc kubenswrapper[4898]: I0313 14:02:51.813655 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a9b9a59-64ad-4602-88da-91583ec126dc-kube-api-access-hkkd2" (OuterVolumeSpecName: "kube-api-access-hkkd2") pod "8a9b9a59-64ad-4602-88da-91583ec126dc" (UID: "8a9b9a59-64ad-4602-88da-91583ec126dc"). InnerVolumeSpecName "kube-api-access-hkkd2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:02:51 crc kubenswrapper[4898]: I0313 14:02:51.907764 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkkd2\" (UniqueName: \"kubernetes.io/projected/8a9b9a59-64ad-4602-88da-91583ec126dc-kube-api-access-hkkd2\") on node \"crc\" DevicePath \"\"" Mar 13 14:02:52 crc kubenswrapper[4898]: I0313 14:02:52.334959 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556842-7h9s5" event={"ID":"8a9b9a59-64ad-4602-88da-91583ec126dc","Type":"ContainerDied","Data":"4d2585911a8b2947eab5790240e88ed8015c03581cb3080f1739046921122426"} Mar 13 14:02:52 crc kubenswrapper[4898]: I0313 14:02:52.335045 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556842-7h9s5" Mar 13 14:02:52 crc kubenswrapper[4898]: I0313 14:02:52.335060 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d2585911a8b2947eab5790240e88ed8015c03581cb3080f1739046921122426" Mar 13 14:03:12 crc kubenswrapper[4898]: I0313 14:03:12.913215 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-d6f97d578-d974n"] Mar 13 14:03:12 crc kubenswrapper[4898]: I0313 14:03:12.914027 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" podUID="3f9de415-59e6-40a1-8392-827c617e2ce8" containerName="controller-manager" containerID="cri-o://1fd3951d1aff7bf70501c814cdd5160a62e5847ba31d11cf12e32e1f23b242e1" gracePeriod=30 Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.415112 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.458688 4898 generic.go:334] "Generic (PLEG): container finished" podID="3f9de415-59e6-40a1-8392-827c617e2ce8" containerID="1fd3951d1aff7bf70501c814cdd5160a62e5847ba31d11cf12e32e1f23b242e1" exitCode=0 Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.458740 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" event={"ID":"3f9de415-59e6-40a1-8392-827c617e2ce8","Type":"ContainerDied","Data":"1fd3951d1aff7bf70501c814cdd5160a62e5847ba31d11cf12e32e1f23b242e1"} Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.458776 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" event={"ID":"3f9de415-59e6-40a1-8392-827c617e2ce8","Type":"ContainerDied","Data":"48528b351d9c8fb7ddf3057320b84878a31c1815533952151ea60cfed48b3a84"} Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.458786 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-d6f97d578-d974n" Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.458800 4898 scope.go:117] "RemoveContainer" containerID="1fd3951d1aff7bf70501c814cdd5160a62e5847ba31d11cf12e32e1f23b242e1" Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.476675 4898 scope.go:117] "RemoveContainer" containerID="1fd3951d1aff7bf70501c814cdd5160a62e5847ba31d11cf12e32e1f23b242e1" Mar 13 14:03:13 crc kubenswrapper[4898]: E0313 14:03:13.477332 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fd3951d1aff7bf70501c814cdd5160a62e5847ba31d11cf12e32e1f23b242e1\": container with ID starting with 1fd3951d1aff7bf70501c814cdd5160a62e5847ba31d11cf12e32e1f23b242e1 not found: ID does not exist" containerID="1fd3951d1aff7bf70501c814cdd5160a62e5847ba31d11cf12e32e1f23b242e1" Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.477379 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fd3951d1aff7bf70501c814cdd5160a62e5847ba31d11cf12e32e1f23b242e1"} err="failed to get container status \"1fd3951d1aff7bf70501c814cdd5160a62e5847ba31d11cf12e32e1f23b242e1\": rpc error: code = NotFound desc = could not find container \"1fd3951d1aff7bf70501c814cdd5160a62e5847ba31d11cf12e32e1f23b242e1\": container with ID starting with 1fd3951d1aff7bf70501c814cdd5160a62e5847ba31d11cf12e32e1f23b242e1 not found: ID does not exist" Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.586008 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f9de415-59e6-40a1-8392-827c617e2ce8-client-ca\") pod \"3f9de415-59e6-40a1-8392-827c617e2ce8\" (UID: \"3f9de415-59e6-40a1-8392-827c617e2ce8\") " Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.586093 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-zkj5d\" (UniqueName: \"kubernetes.io/projected/3f9de415-59e6-40a1-8392-827c617e2ce8-kube-api-access-zkj5d\") pod \"3f9de415-59e6-40a1-8392-827c617e2ce8\" (UID: \"3f9de415-59e6-40a1-8392-827c617e2ce8\") " Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.586152 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f9de415-59e6-40a1-8392-827c617e2ce8-config\") pod \"3f9de415-59e6-40a1-8392-827c617e2ce8\" (UID: \"3f9de415-59e6-40a1-8392-827c617e2ce8\") " Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.586215 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3f9de415-59e6-40a1-8392-827c617e2ce8-proxy-ca-bundles\") pod \"3f9de415-59e6-40a1-8392-827c617e2ce8\" (UID: \"3f9de415-59e6-40a1-8392-827c617e2ce8\") " Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.586261 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f9de415-59e6-40a1-8392-827c617e2ce8-serving-cert\") pod \"3f9de415-59e6-40a1-8392-827c617e2ce8\" (UID: \"3f9de415-59e6-40a1-8392-827c617e2ce8\") " Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.587015 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f9de415-59e6-40a1-8392-827c617e2ce8-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "3f9de415-59e6-40a1-8392-827c617e2ce8" (UID: "3f9de415-59e6-40a1-8392-827c617e2ce8"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.587045 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f9de415-59e6-40a1-8392-827c617e2ce8-config" (OuterVolumeSpecName: "config") pod "3f9de415-59e6-40a1-8392-827c617e2ce8" (UID: "3f9de415-59e6-40a1-8392-827c617e2ce8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.587436 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f9de415-59e6-40a1-8392-827c617e2ce8-client-ca" (OuterVolumeSpecName: "client-ca") pod "3f9de415-59e6-40a1-8392-827c617e2ce8" (UID: "3f9de415-59e6-40a1-8392-827c617e2ce8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.593541 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f9de415-59e6-40a1-8392-827c617e2ce8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3f9de415-59e6-40a1-8392-827c617e2ce8" (UID: "3f9de415-59e6-40a1-8392-827c617e2ce8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.594112 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f9de415-59e6-40a1-8392-827c617e2ce8-kube-api-access-zkj5d" (OuterVolumeSpecName: "kube-api-access-zkj5d") pod "3f9de415-59e6-40a1-8392-827c617e2ce8" (UID: "3f9de415-59e6-40a1-8392-827c617e2ce8"). InnerVolumeSpecName "kube-api-access-zkj5d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.687262 4898 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3f9de415-59e6-40a1-8392-827c617e2ce8-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.687298 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f9de415-59e6-40a1-8392-827c617e2ce8-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.687310 4898 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3f9de415-59e6-40a1-8392-827c617e2ce8-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.687325 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkj5d\" (UniqueName: \"kubernetes.io/projected/3f9de415-59e6-40a1-8392-827c617e2ce8-kube-api-access-zkj5d\") on node \"crc\" DevicePath \"\"" Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.687339 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f9de415-59e6-40a1-8392-827c617e2ce8-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.787157 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-d6f97d578-d974n"] Mar 13 14:03:13 crc kubenswrapper[4898]: I0313 14:03:13.792057 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-d6f97d578-d974n"] Mar 13 14:03:14 crc kubenswrapper[4898]: I0313 14:03:14.927600 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-797d9c85c-m5jdj"] Mar 13 14:03:14 crc kubenswrapper[4898]: E0313 
14:03:14.927875 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f9de415-59e6-40a1-8392-827c617e2ce8" containerName="controller-manager" Mar 13 14:03:14 crc kubenswrapper[4898]: I0313 14:03:14.927915 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f9de415-59e6-40a1-8392-827c617e2ce8" containerName="controller-manager" Mar 13 14:03:14 crc kubenswrapper[4898]: E0313 14:03:14.927938 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a9b9a59-64ad-4602-88da-91583ec126dc" containerName="oc" Mar 13 14:03:14 crc kubenswrapper[4898]: I0313 14:03:14.927949 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a9b9a59-64ad-4602-88da-91583ec126dc" containerName="oc" Mar 13 14:03:14 crc kubenswrapper[4898]: I0313 14:03:14.928095 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a9b9a59-64ad-4602-88da-91583ec126dc" containerName="oc" Mar 13 14:03:14 crc kubenswrapper[4898]: I0313 14:03:14.928112 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f9de415-59e6-40a1-8392-827c617e2ce8" containerName="controller-manager" Mar 13 14:03:14 crc kubenswrapper[4898]: I0313 14:03:14.928581 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj" Mar 13 14:03:14 crc kubenswrapper[4898]: I0313 14:03:14.931022 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 14:03:14 crc kubenswrapper[4898]: I0313 14:03:14.931236 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 14:03:14 crc kubenswrapper[4898]: I0313 14:03:14.931459 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 14:03:14 crc kubenswrapper[4898]: I0313 14:03:14.934603 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 13 14:03:14 crc kubenswrapper[4898]: I0313 14:03:14.934844 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 14:03:14 crc kubenswrapper[4898]: I0313 14:03:14.934844 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 14:03:14 crc kubenswrapper[4898]: I0313 14:03:14.940886 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 14:03:14 crc kubenswrapper[4898]: I0313 14:03:14.963923 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-797d9c85c-m5jdj"] Mar 13 14:03:15 crc kubenswrapper[4898]: I0313 14:03:15.004617 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ec8c09e-475e-4c4b-86ec-38388754240f-client-ca\") pod \"controller-manager-797d9c85c-m5jdj\" (UID: \"2ec8c09e-475e-4c4b-86ec-38388754240f\") " 
pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj" Mar 13 14:03:15 crc kubenswrapper[4898]: I0313 14:03:15.004679 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s75w9\" (UniqueName: \"kubernetes.io/projected/2ec8c09e-475e-4c4b-86ec-38388754240f-kube-api-access-s75w9\") pod \"controller-manager-797d9c85c-m5jdj\" (UID: \"2ec8c09e-475e-4c4b-86ec-38388754240f\") " pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj" Mar 13 14:03:15 crc kubenswrapper[4898]: I0313 14:03:15.004944 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ec8c09e-475e-4c4b-86ec-38388754240f-config\") pod \"controller-manager-797d9c85c-m5jdj\" (UID: \"2ec8c09e-475e-4c4b-86ec-38388754240f\") " pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj" Mar 13 14:03:15 crc kubenswrapper[4898]: I0313 14:03:15.005117 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ec8c09e-475e-4c4b-86ec-38388754240f-serving-cert\") pod \"controller-manager-797d9c85c-m5jdj\" (UID: \"2ec8c09e-475e-4c4b-86ec-38388754240f\") " pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj" Mar 13 14:03:15 crc kubenswrapper[4898]: I0313 14:03:15.005213 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2ec8c09e-475e-4c4b-86ec-38388754240f-proxy-ca-bundles\") pod \"controller-manager-797d9c85c-m5jdj\" (UID: \"2ec8c09e-475e-4c4b-86ec-38388754240f\") " pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj" Mar 13 14:03:15 crc kubenswrapper[4898]: I0313 14:03:15.105594 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2ec8c09e-475e-4c4b-86ec-38388754240f-serving-cert\") pod \"controller-manager-797d9c85c-m5jdj\" (UID: \"2ec8c09e-475e-4c4b-86ec-38388754240f\") " pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj" Mar 13 14:03:15 crc kubenswrapper[4898]: I0313 14:03:15.105663 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2ec8c09e-475e-4c4b-86ec-38388754240f-proxy-ca-bundles\") pod \"controller-manager-797d9c85c-m5jdj\" (UID: \"2ec8c09e-475e-4c4b-86ec-38388754240f\") " pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj" Mar 13 14:03:15 crc kubenswrapper[4898]: I0313 14:03:15.105710 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ec8c09e-475e-4c4b-86ec-38388754240f-client-ca\") pod \"controller-manager-797d9c85c-m5jdj\" (UID: \"2ec8c09e-475e-4c4b-86ec-38388754240f\") " pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj" Mar 13 14:03:15 crc kubenswrapper[4898]: I0313 14:03:15.106986 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s75w9\" (UniqueName: \"kubernetes.io/projected/2ec8c09e-475e-4c4b-86ec-38388754240f-kube-api-access-s75w9\") pod \"controller-manager-797d9c85c-m5jdj\" (UID: \"2ec8c09e-475e-4c4b-86ec-38388754240f\") " pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj" Mar 13 14:03:15 crc kubenswrapper[4898]: I0313 14:03:15.107100 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ec8c09e-475e-4c4b-86ec-38388754240f-client-ca\") pod \"controller-manager-797d9c85c-m5jdj\" (UID: \"2ec8c09e-475e-4c4b-86ec-38388754240f\") " pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj" Mar 13 14:03:15 crc kubenswrapper[4898]: I0313 14:03:15.107211 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ec8c09e-475e-4c4b-86ec-38388754240f-config\") pod \"controller-manager-797d9c85c-m5jdj\" (UID: \"2ec8c09e-475e-4c4b-86ec-38388754240f\") " pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj" Mar 13 14:03:15 crc kubenswrapper[4898]: I0313 14:03:15.107275 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2ec8c09e-475e-4c4b-86ec-38388754240f-proxy-ca-bundles\") pod \"controller-manager-797d9c85c-m5jdj\" (UID: \"2ec8c09e-475e-4c4b-86ec-38388754240f\") " pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj" Mar 13 14:03:15 crc kubenswrapper[4898]: I0313 14:03:15.108644 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ec8c09e-475e-4c4b-86ec-38388754240f-config\") pod \"controller-manager-797d9c85c-m5jdj\" (UID: \"2ec8c09e-475e-4c4b-86ec-38388754240f\") " pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj" Mar 13 14:03:15 crc kubenswrapper[4898]: I0313 14:03:15.119054 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ec8c09e-475e-4c4b-86ec-38388754240f-serving-cert\") pod \"controller-manager-797d9c85c-m5jdj\" (UID: \"2ec8c09e-475e-4c4b-86ec-38388754240f\") " pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj" Mar 13 14:03:15 crc kubenswrapper[4898]: I0313 14:03:15.121268 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s75w9\" (UniqueName: \"kubernetes.io/projected/2ec8c09e-475e-4c4b-86ec-38388754240f-kube-api-access-s75w9\") pod \"controller-manager-797d9c85c-m5jdj\" (UID: \"2ec8c09e-475e-4c4b-86ec-38388754240f\") " pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj" Mar 13 14:03:15 crc 
kubenswrapper[4898]: I0313 14:03:15.257075 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj" Mar 13 14:03:15 crc kubenswrapper[4898]: I0313 14:03:15.723319 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-797d9c85c-m5jdj"] Mar 13 14:03:15 crc kubenswrapper[4898]: I0313 14:03:15.750609 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f9de415-59e6-40a1-8392-827c617e2ce8" path="/var/lib/kubelet/pods/3f9de415-59e6-40a1-8392-827c617e2ce8/volumes" Mar 13 14:03:16 crc kubenswrapper[4898]: I0313 14:03:16.508777 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj" event={"ID":"2ec8c09e-475e-4c4b-86ec-38388754240f","Type":"ContainerStarted","Data":"ea168832255bbc86d5c5ce93816cc05501f0eb11f90b0102b5e3bb1a0fe8d967"} Mar 13 14:03:16 crc kubenswrapper[4898]: I0313 14:03:16.509175 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj" event={"ID":"2ec8c09e-475e-4c4b-86ec-38388754240f","Type":"ContainerStarted","Data":"a5d66ae112a6b6960f8bede4d5c8a63d427f14d3ef98e541dc2b7652217eac67"} Mar 13 14:03:16 crc kubenswrapper[4898]: I0313 14:03:16.509206 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj" Mar 13 14:03:16 crc kubenswrapper[4898]: I0313 14:03:16.513763 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj" Mar 13 14:03:16 crc kubenswrapper[4898]: I0313 14:03:16.534134 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj" podStartSLOduration=4.534104613 podStartE2EDuration="4.534104613s" 
podCreationTimestamp="2026-03-13 14:03:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:03:16.528268427 +0000 UTC m=+431.529856676" watchObservedRunningTime="2026-03-13 14:03:16.534104613 +0000 UTC m=+431.535692862" Mar 13 14:03:19 crc kubenswrapper[4898]: I0313 14:03:19.134303 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:03:19 crc kubenswrapper[4898]: I0313 14:03:19.134991 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.121507 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7zcdz"] Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.123204 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz" Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.136509 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7zcdz"] Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.180285 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ba480ebb-f079-4888-857b-d917e4a9b13b-registry-tls\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz" Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.180353 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ba480ebb-f079-4888-857b-d917e4a9b13b-bound-sa-token\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz" Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.180418 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ba480ebb-f079-4888-857b-d917e4a9b13b-registry-certificates\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz" Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.180456 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ba480ebb-f079-4888-857b-d917e4a9b13b-trusted-ca\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz" Mar 13 
14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.180486 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-655ml\" (UniqueName: \"kubernetes.io/projected/ba480ebb-f079-4888-857b-d917e4a9b13b-kube-api-access-655ml\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz" Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.180515 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ba480ebb-f079-4888-857b-d917e4a9b13b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz" Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.180549 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ba480ebb-f079-4888-857b-d917e4a9b13b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz" Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.180762 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz" Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.212989 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz" Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.281675 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ba480ebb-f079-4888-857b-d917e4a9b13b-registry-tls\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz" Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.281735 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ba480ebb-f079-4888-857b-d917e4a9b13b-bound-sa-token\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz" Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.281787 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ba480ebb-f079-4888-857b-d917e4a9b13b-registry-certificates\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz" Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.281825 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ba480ebb-f079-4888-857b-d917e4a9b13b-trusted-ca\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz" Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.281856 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-655ml\" (UniqueName: \"kubernetes.io/projected/ba480ebb-f079-4888-857b-d917e4a9b13b-kube-api-access-655ml\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz" Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.281884 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ba480ebb-f079-4888-857b-d917e4a9b13b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz" Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.281949 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ba480ebb-f079-4888-857b-d917e4a9b13b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz" Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.282673 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ba480ebb-f079-4888-857b-d917e4a9b13b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz" Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.284123 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ba480ebb-f079-4888-857b-d917e4a9b13b-trusted-ca\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz" Mar 13 14:03:20 
crc kubenswrapper[4898]: I0313 14:03:20.284226 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ba480ebb-f079-4888-857b-d917e4a9b13b-registry-certificates\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz" Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.287546 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ba480ebb-f079-4888-857b-d917e4a9b13b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz" Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.290527 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ba480ebb-f079-4888-857b-d917e4a9b13b-registry-tls\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz" Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.301350 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-655ml\" (UniqueName: \"kubernetes.io/projected/ba480ebb-f079-4888-857b-d917e4a9b13b-kube-api-access-655ml\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz" Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.305650 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ba480ebb-f079-4888-857b-d917e4a9b13b-bound-sa-token\") pod \"image-registry-66df7c8f76-7zcdz\" (UID: \"ba480ebb-f079-4888-857b-d917e4a9b13b\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz" Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.460057 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz" Mar 13 14:03:20 crc kubenswrapper[4898]: I0313 14:03:20.880765 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7zcdz"] Mar 13 14:03:21 crc kubenswrapper[4898]: I0313 14:03:21.542251 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz" event={"ID":"ba480ebb-f079-4888-857b-d917e4a9b13b","Type":"ContainerStarted","Data":"aa888781f304aed2b9c49d6608ce2ba009cb134b7b2da2e3ed2d684daedda363"} Mar 13 14:03:21 crc kubenswrapper[4898]: I0313 14:03:21.542302 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz" event={"ID":"ba480ebb-f079-4888-857b-d917e4a9b13b","Type":"ContainerStarted","Data":"7ba354dc972ae3e18545d089f815f3751645634c4a62217a7c6f39c3f9c1fc0c"} Mar 13 14:03:21 crc kubenswrapper[4898]: I0313 14:03:21.542426 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz" Mar 13 14:03:21 crc kubenswrapper[4898]: I0313 14:03:21.560698 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz" podStartSLOduration=1.560674404 podStartE2EDuration="1.560674404s" podCreationTimestamp="2026-03-13 14:03:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:03:21.558020698 +0000 UTC m=+436.559608937" watchObservedRunningTime="2026-03-13 14:03:21.560674404 +0000 UTC m=+436.562262663" Mar 13 14:03:32 crc kubenswrapper[4898]: I0313 14:03:32.327636 4898 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/redhat-operators-btkxt"] Mar 13 14:03:32 crc kubenswrapper[4898]: I0313 14:03:32.328804 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-btkxt" podUID="7794a943-5fec-485e-86bf-f104ed6ae070" containerName="registry-server" containerID="cri-o://72013575671e67bc50654bdc19c7f86358a2ef9f58c688c055736ec45a15a182" gracePeriod=2 Mar 13 14:03:32 crc kubenswrapper[4898]: I0313 14:03:32.621532 4898 generic.go:334] "Generic (PLEG): container finished" podID="7794a943-5fec-485e-86bf-f104ed6ae070" containerID="72013575671e67bc50654bdc19c7f86358a2ef9f58c688c055736ec45a15a182" exitCode=0 Mar 13 14:03:32 crc kubenswrapper[4898]: I0313 14:03:32.621632 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-btkxt" event={"ID":"7794a943-5fec-485e-86bf-f104ed6ae070","Type":"ContainerDied","Data":"72013575671e67bc50654bdc19c7f86358a2ef9f58c688c055736ec45a15a182"} Mar 13 14:03:32 crc kubenswrapper[4898]: I0313 14:03:32.846420 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-btkxt" Mar 13 14:03:32 crc kubenswrapper[4898]: I0313 14:03:32.933589 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm"] Mar 13 14:03:32 crc kubenswrapper[4898]: I0313 14:03:32.934144 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" podUID="278b669f-19b0-49e8-9a35-d583ac818d86" containerName="route-controller-manager" containerID="cri-o://4088ee8afc7274cc475dfb2367bd50ba012954a19435c83d57f81159d69220a9" gracePeriod=30 Mar 13 14:03:32 crc kubenswrapper[4898]: I0313 14:03:32.973485 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66sn5\" (UniqueName: \"kubernetes.io/projected/7794a943-5fec-485e-86bf-f104ed6ae070-kube-api-access-66sn5\") pod \"7794a943-5fec-485e-86bf-f104ed6ae070\" (UID: \"7794a943-5fec-485e-86bf-f104ed6ae070\") " Mar 13 14:03:32 crc kubenswrapper[4898]: I0313 14:03:32.973538 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7794a943-5fec-485e-86bf-f104ed6ae070-catalog-content\") pod \"7794a943-5fec-485e-86bf-f104ed6ae070\" (UID: \"7794a943-5fec-485e-86bf-f104ed6ae070\") " Mar 13 14:03:32 crc kubenswrapper[4898]: I0313 14:03:32.973592 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7794a943-5fec-485e-86bf-f104ed6ae070-utilities\") pod \"7794a943-5fec-485e-86bf-f104ed6ae070\" (UID: \"7794a943-5fec-485e-86bf-f104ed6ae070\") " Mar 13 14:03:32 crc kubenswrapper[4898]: I0313 14:03:32.974817 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7794a943-5fec-485e-86bf-f104ed6ae070-utilities" (OuterVolumeSpecName: 
"utilities") pod "7794a943-5fec-485e-86bf-f104ed6ae070" (UID: "7794a943-5fec-485e-86bf-f104ed6ae070"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:03:32 crc kubenswrapper[4898]: I0313 14:03:32.982061 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7794a943-5fec-485e-86bf-f104ed6ae070-kube-api-access-66sn5" (OuterVolumeSpecName: "kube-api-access-66sn5") pod "7794a943-5fec-485e-86bf-f104ed6ae070" (UID: "7794a943-5fec-485e-86bf-f104ed6ae070"). InnerVolumeSpecName "kube-api-access-66sn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.074638 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66sn5\" (UniqueName: \"kubernetes.io/projected/7794a943-5fec-485e-86bf-f104ed6ae070-kube-api-access-66sn5\") on node \"crc\" DevicePath \"\"" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.074925 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7794a943-5fec-485e-86bf-f104ed6ae070-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.120163 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7794a943-5fec-485e-86bf-f104ed6ae070-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7794a943-5fec-485e-86bf-f104ed6ae070" (UID: "7794a943-5fec-485e-86bf-f104ed6ae070"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.176049 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7794a943-5fec-485e-86bf-f104ed6ae070-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.537156 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.628844 4898 generic.go:334] "Generic (PLEG): container finished" podID="278b669f-19b0-49e8-9a35-d583ac818d86" containerID="4088ee8afc7274cc475dfb2367bd50ba012954a19435c83d57f81159d69220a9" exitCode=0 Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.628934 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.628953 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" event={"ID":"278b669f-19b0-49e8-9a35-d583ac818d86","Type":"ContainerDied","Data":"4088ee8afc7274cc475dfb2367bd50ba012954a19435c83d57f81159d69220a9"} Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.629115 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm" event={"ID":"278b669f-19b0-49e8-9a35-d583ac818d86","Type":"ContainerDied","Data":"92accbb2ec5b0ce2cc04a20381f279e22f42b2968b782d019c7ef8631795f69d"} Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.629146 4898 scope.go:117] "RemoveContainer" containerID="4088ee8afc7274cc475dfb2367bd50ba012954a19435c83d57f81159d69220a9" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.632141 4898 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-btkxt" event={"ID":"7794a943-5fec-485e-86bf-f104ed6ae070","Type":"ContainerDied","Data":"9107b24c316c7c4b1a47858e8c7bbd33ee4b48e192b020a209e38129a6fd6f89"} Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.632248 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-btkxt" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.650532 4898 scope.go:117] "RemoveContainer" containerID="4088ee8afc7274cc475dfb2367bd50ba012954a19435c83d57f81159d69220a9" Mar 13 14:03:33 crc kubenswrapper[4898]: E0313 14:03:33.652468 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4088ee8afc7274cc475dfb2367bd50ba012954a19435c83d57f81159d69220a9\": container with ID starting with 4088ee8afc7274cc475dfb2367bd50ba012954a19435c83d57f81159d69220a9 not found: ID does not exist" containerID="4088ee8afc7274cc475dfb2367bd50ba012954a19435c83d57f81159d69220a9" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.652522 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4088ee8afc7274cc475dfb2367bd50ba012954a19435c83d57f81159d69220a9"} err="failed to get container status \"4088ee8afc7274cc475dfb2367bd50ba012954a19435c83d57f81159d69220a9\": rpc error: code = NotFound desc = could not find container \"4088ee8afc7274cc475dfb2367bd50ba012954a19435c83d57f81159d69220a9\": container with ID starting with 4088ee8afc7274cc475dfb2367bd50ba012954a19435c83d57f81159d69220a9 not found: ID does not exist" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.652556 4898 scope.go:117] "RemoveContainer" containerID="72013575671e67bc50654bdc19c7f86358a2ef9f58c688c055736ec45a15a182" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.670813 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-btkxt"] Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.675283 4898 scope.go:117] "RemoveContainer" containerID="67b3d784e0ee63e0bd8e175cf0b8537e1ca35f7833ddbd4c0468016c4030500b" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.676163 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-btkxt"] Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.681359 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/278b669f-19b0-49e8-9a35-d583ac818d86-config\") pod \"278b669f-19b0-49e8-9a35-d583ac818d86\" (UID: \"278b669f-19b0-49e8-9a35-d583ac818d86\") " Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.681413 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/278b669f-19b0-49e8-9a35-d583ac818d86-client-ca\") pod \"278b669f-19b0-49e8-9a35-d583ac818d86\" (UID: \"278b669f-19b0-49e8-9a35-d583ac818d86\") " Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.681470 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24rjh\" (UniqueName: \"kubernetes.io/projected/278b669f-19b0-49e8-9a35-d583ac818d86-kube-api-access-24rjh\") pod \"278b669f-19b0-49e8-9a35-d583ac818d86\" (UID: \"278b669f-19b0-49e8-9a35-d583ac818d86\") " Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.681518 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/278b669f-19b0-49e8-9a35-d583ac818d86-serving-cert\") pod \"278b669f-19b0-49e8-9a35-d583ac818d86\" (UID: \"278b669f-19b0-49e8-9a35-d583ac818d86\") " Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.682752 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/278b669f-19b0-49e8-9a35-d583ac818d86-client-ca" (OuterVolumeSpecName: "client-ca") pod "278b669f-19b0-49e8-9a35-d583ac818d86" (UID: "278b669f-19b0-49e8-9a35-d583ac818d86"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.682874 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/278b669f-19b0-49e8-9a35-d583ac818d86-config" (OuterVolumeSpecName: "config") pod "278b669f-19b0-49e8-9a35-d583ac818d86" (UID: "278b669f-19b0-49e8-9a35-d583ac818d86"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.689730 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/278b669f-19b0-49e8-9a35-d583ac818d86-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "278b669f-19b0-49e8-9a35-d583ac818d86" (UID: "278b669f-19b0-49e8-9a35-d583ac818d86"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.689770 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/278b669f-19b0-49e8-9a35-d583ac818d86-kube-api-access-24rjh" (OuterVolumeSpecName: "kube-api-access-24rjh") pod "278b669f-19b0-49e8-9a35-d583ac818d86" (UID: "278b669f-19b0-49e8-9a35-d583ac818d86"). InnerVolumeSpecName "kube-api-access-24rjh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.701692 4898 scope.go:117] "RemoveContainer" containerID="b05ab9f2e4156ffc544ae5bf8d297fc15f45604caf9f75b4b1a59e033d78a2fc" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.753112 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7794a943-5fec-485e-86bf-f104ed6ae070" path="/var/lib/kubelet/pods/7794a943-5fec-485e-86bf-f104ed6ae070/volumes" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.783998 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/278b669f-19b0-49e8-9a35-d583ac818d86-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.784066 4898 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/278b669f-19b0-49e8-9a35-d583ac818d86-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.784093 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24rjh\" (UniqueName: \"kubernetes.io/projected/278b669f-19b0-49e8-9a35-d583ac818d86-kube-api-access-24rjh\") on node \"crc\" DevicePath \"\"" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.784120 4898 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/278b669f-19b0-49e8-9a35-d583ac818d86-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.954985 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm"] Mar 13 14:03:33 crc kubenswrapper[4898]: I0313 14:03:33.968055 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76946b564d-4m5gm"] Mar 13 14:03:34 crc kubenswrapper[4898]: I0313 
14:03:34.937649 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2"] Mar 13 14:03:34 crc kubenswrapper[4898]: E0313 14:03:34.937960 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="278b669f-19b0-49e8-9a35-d583ac818d86" containerName="route-controller-manager" Mar 13 14:03:34 crc kubenswrapper[4898]: I0313 14:03:34.937980 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="278b669f-19b0-49e8-9a35-d583ac818d86" containerName="route-controller-manager" Mar 13 14:03:34 crc kubenswrapper[4898]: E0313 14:03:34.938000 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7794a943-5fec-485e-86bf-f104ed6ae070" containerName="extract-utilities" Mar 13 14:03:34 crc kubenswrapper[4898]: I0313 14:03:34.938010 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="7794a943-5fec-485e-86bf-f104ed6ae070" containerName="extract-utilities" Mar 13 14:03:34 crc kubenswrapper[4898]: E0313 14:03:34.938028 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7794a943-5fec-485e-86bf-f104ed6ae070" containerName="extract-content" Mar 13 14:03:34 crc kubenswrapper[4898]: I0313 14:03:34.938040 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="7794a943-5fec-485e-86bf-f104ed6ae070" containerName="extract-content" Mar 13 14:03:34 crc kubenswrapper[4898]: E0313 14:03:34.938065 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7794a943-5fec-485e-86bf-f104ed6ae070" containerName="registry-server" Mar 13 14:03:34 crc kubenswrapper[4898]: I0313 14:03:34.938075 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="7794a943-5fec-485e-86bf-f104ed6ae070" containerName="registry-server" Mar 13 14:03:34 crc kubenswrapper[4898]: I0313 14:03:34.938220 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="278b669f-19b0-49e8-9a35-d583ac818d86" containerName="route-controller-manager" Mar 13 14:03:34 crc 
kubenswrapper[4898]: I0313 14:03:34.938245 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="7794a943-5fec-485e-86bf-f104ed6ae070" containerName="registry-server" Mar 13 14:03:34 crc kubenswrapper[4898]: I0313 14:03:34.938729 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" Mar 13 14:03:34 crc kubenswrapper[4898]: I0313 14:03:34.940693 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 14:03:34 crc kubenswrapper[4898]: I0313 14:03:34.940763 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 14:03:34 crc kubenswrapper[4898]: I0313 14:03:34.941622 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 14:03:34 crc kubenswrapper[4898]: I0313 14:03:34.942938 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 14:03:34 crc kubenswrapper[4898]: I0313 14:03:34.943128 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 14:03:34 crc kubenswrapper[4898]: I0313 14:03:34.943443 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 14:03:34 crc kubenswrapper[4898]: I0313 14:03:34.958095 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2"] Mar 13 14:03:35 crc kubenswrapper[4898]: I0313 14:03:35.095203 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/0376a3d3-f3a2-4674-a7f9-b06a9e62836e-client-ca\") pod \"route-controller-manager-7b756f97f-wjsf2\" (UID: \"0376a3d3-f3a2-4674-a7f9-b06a9e62836e\") " pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" Mar 13 14:03:35 crc kubenswrapper[4898]: I0313 14:03:35.095319 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0376a3d3-f3a2-4674-a7f9-b06a9e62836e-config\") pod \"route-controller-manager-7b756f97f-wjsf2\" (UID: \"0376a3d3-f3a2-4674-a7f9-b06a9e62836e\") " pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" Mar 13 14:03:35 crc kubenswrapper[4898]: I0313 14:03:35.095406 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0376a3d3-f3a2-4674-a7f9-b06a9e62836e-serving-cert\") pod \"route-controller-manager-7b756f97f-wjsf2\" (UID: \"0376a3d3-f3a2-4674-a7f9-b06a9e62836e\") " pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" Mar 13 14:03:35 crc kubenswrapper[4898]: I0313 14:03:35.095447 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9m7b\" (UniqueName: \"kubernetes.io/projected/0376a3d3-f3a2-4674-a7f9-b06a9e62836e-kube-api-access-t9m7b\") pod \"route-controller-manager-7b756f97f-wjsf2\" (UID: \"0376a3d3-f3a2-4674-a7f9-b06a9e62836e\") " pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" Mar 13 14:03:35 crc kubenswrapper[4898]: I0313 14:03:35.196117 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0376a3d3-f3a2-4674-a7f9-b06a9e62836e-serving-cert\") pod \"route-controller-manager-7b756f97f-wjsf2\" (UID: \"0376a3d3-f3a2-4674-a7f9-b06a9e62836e\") " 
pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" Mar 13 14:03:35 crc kubenswrapper[4898]: I0313 14:03:35.196157 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9m7b\" (UniqueName: \"kubernetes.io/projected/0376a3d3-f3a2-4674-a7f9-b06a9e62836e-kube-api-access-t9m7b\") pod \"route-controller-manager-7b756f97f-wjsf2\" (UID: \"0376a3d3-f3a2-4674-a7f9-b06a9e62836e\") " pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" Mar 13 14:03:35 crc kubenswrapper[4898]: I0313 14:03:35.196189 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0376a3d3-f3a2-4674-a7f9-b06a9e62836e-client-ca\") pod \"route-controller-manager-7b756f97f-wjsf2\" (UID: \"0376a3d3-f3a2-4674-a7f9-b06a9e62836e\") " pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" Mar 13 14:03:35 crc kubenswrapper[4898]: I0313 14:03:35.196221 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0376a3d3-f3a2-4674-a7f9-b06a9e62836e-config\") pod \"route-controller-manager-7b756f97f-wjsf2\" (UID: \"0376a3d3-f3a2-4674-a7f9-b06a9e62836e\") " pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" Mar 13 14:03:35 crc kubenswrapper[4898]: I0313 14:03:35.197421 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0376a3d3-f3a2-4674-a7f9-b06a9e62836e-client-ca\") pod \"route-controller-manager-7b756f97f-wjsf2\" (UID: \"0376a3d3-f3a2-4674-a7f9-b06a9e62836e\") " pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" Mar 13 14:03:35 crc kubenswrapper[4898]: I0313 14:03:35.198302 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0376a3d3-f3a2-4674-a7f9-b06a9e62836e-config\") pod \"route-controller-manager-7b756f97f-wjsf2\" (UID: \"0376a3d3-f3a2-4674-a7f9-b06a9e62836e\") " pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" Mar 13 14:03:35 crc kubenswrapper[4898]: I0313 14:03:35.201685 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0376a3d3-f3a2-4674-a7f9-b06a9e62836e-serving-cert\") pod \"route-controller-manager-7b756f97f-wjsf2\" (UID: \"0376a3d3-f3a2-4674-a7f9-b06a9e62836e\") " pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" Mar 13 14:03:35 crc kubenswrapper[4898]: I0313 14:03:35.212242 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9m7b\" (UniqueName: \"kubernetes.io/projected/0376a3d3-f3a2-4674-a7f9-b06a9e62836e-kube-api-access-t9m7b\") pod \"route-controller-manager-7b756f97f-wjsf2\" (UID: \"0376a3d3-f3a2-4674-a7f9-b06a9e62836e\") " pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" Mar 13 14:03:35 crc kubenswrapper[4898]: I0313 14:03:35.402476 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" Mar 13 14:03:35 crc kubenswrapper[4898]: I0313 14:03:35.747278 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="278b669f-19b0-49e8-9a35-d583ac818d86" path="/var/lib/kubelet/pods/278b669f-19b0-49e8-9a35-d583ac818d86/volumes" Mar 13 14:03:35 crc kubenswrapper[4898]: I0313 14:03:35.826768 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2"] Mar 13 14:03:35 crc kubenswrapper[4898]: W0313 14:03:35.837577 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0376a3d3_f3a2_4674_a7f9_b06a9e62836e.slice/crio-c87f84f9f27a2d6cb574908c3031e9c0f2914a96f5af157028432a387025a4a0 WatchSource:0}: Error finding container c87f84f9f27a2d6cb574908c3031e9c0f2914a96f5af157028432a387025a4a0: Status 404 returned error can't find the container with id c87f84f9f27a2d6cb574908c3031e9c0f2914a96f5af157028432a387025a4a0 Mar 13 14:03:36 crc kubenswrapper[4898]: I0313 14:03:36.655001 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" event={"ID":"0376a3d3-f3a2-4674-a7f9-b06a9e62836e","Type":"ContainerStarted","Data":"0292f4db3076b00a4ff8a067c3c4184374170598a35c90418442c7305e929d8e"} Mar 13 14:03:36 crc kubenswrapper[4898]: I0313 14:03:36.655050 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" event={"ID":"0376a3d3-f3a2-4674-a7f9-b06a9e62836e","Type":"ContainerStarted","Data":"c87f84f9f27a2d6cb574908c3031e9c0f2914a96f5af157028432a387025a4a0"} Mar 13 14:03:36 crc kubenswrapper[4898]: I0313 14:03:36.655546 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" Mar 13 14:03:36 crc kubenswrapper[4898]: I0313 14:03:36.663452 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" Mar 13 14:03:36 crc kubenswrapper[4898]: I0313 14:03:36.681600 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" podStartSLOduration=4.68156389 podStartE2EDuration="4.68156389s" podCreationTimestamp="2026-03-13 14:03:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:03:36.676129764 +0000 UTC m=+451.677718073" watchObservedRunningTime="2026-03-13 14:03:36.68156389 +0000 UTC m=+451.683152169" Mar 13 14:03:40 crc kubenswrapper[4898]: I0313 14:03:40.466013 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz" Mar 13 14:03:40 crc kubenswrapper[4898]: I0313 14:03:40.531394 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6n228"] Mar 13 14:03:49 crc kubenswrapper[4898]: I0313 14:03:49.134779 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:03:49 crc kubenswrapper[4898]: I0313 14:03:49.135188 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Mar 13 14:03:57 crc kubenswrapper[4898]: I0313 14:03:57.920634 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dvvz2"] Mar 13 14:03:57 crc kubenswrapper[4898]: I0313 14:03:57.921740 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dvvz2" podUID="43acaee8-efc8-4156-b28c-b493f241ac53" containerName="registry-server" containerID="cri-o://8a40593eea81d6a95d388d6b35cd414db22d496cba0a3b511f6c3c4af3e4b8ec" gracePeriod=30 Mar 13 14:03:57 crc kubenswrapper[4898]: I0313 14:03:57.928735 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-twh8h"] Mar 13 14:03:57 crc kubenswrapper[4898]: I0313 14:03:57.929053 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-twh8h" podUID="8f81bcfc-3c35-48e8-a584-961351e8c0e2" containerName="registry-server" containerID="cri-o://c0d126f66fb80fd38ad4cce383bbe14103ead798e0605b7596e1d4e7e5d8dd4c" gracePeriod=30 Mar 13 14:03:57 crc kubenswrapper[4898]: I0313 14:03:57.934683 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p8r99"] Mar 13 14:03:57 crc kubenswrapper[4898]: I0313 14:03:57.935065 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" podUID="0a78868f-1786-430d-8df8-18bb1c2019b3" containerName="marketplace-operator" containerID="cri-o://fee68a52f02b5d009748f7d22e4efe8b71e5cbe2b4ec6b512eae62294cde6d24" gracePeriod=30 Mar 13 14:03:57 crc kubenswrapper[4898]: I0313 14:03:57.950522 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h97c9"] Mar 13 14:03:57 crc kubenswrapper[4898]: I0313 14:03:57.950833 4898 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openshift-marketplace/redhat-marketplace-h97c9" podUID="f85f72a8-3887-4867-8a9c-649992ce23f1" containerName="registry-server" containerID="cri-o://cd39a5b62c38cd0e4291eb452dc66dbfa73d20085df0f04f24d87193b1c4faf8" gracePeriod=30 Mar 13 14:03:57 crc kubenswrapper[4898]: I0313 14:03:57.958673 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-z7ng7"] Mar 13 14:03:57 crc kubenswrapper[4898]: I0313 14:03:57.959946 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-z7ng7" Mar 13 14:03:57 crc kubenswrapper[4898]: I0313 14:03:57.963740 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-974qp"] Mar 13 14:03:57 crc kubenswrapper[4898]: I0313 14:03:57.964088 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-974qp" podUID="183d86e9-cd5c-45ed-a460-bb6169e07c72" containerName="registry-server" containerID="cri-o://b28ca2f5572caf9aa06fca178d1a31d55764b021494704172c96d7af68b09635" gracePeriod=30 Mar 13 14:03:57 crc kubenswrapper[4898]: I0313 14:03:57.983736 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-z7ng7"] Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.034506 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h75b\" (UniqueName: \"kubernetes.io/projected/b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320-kube-api-access-9h75b\") pod \"marketplace-operator-79b997595-z7ng7\" (UID: \"b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320\") " pod="openshift-marketplace/marketplace-operator-79b997595-z7ng7" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.034593 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-z7ng7\" (UID: \"b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320\") " pod="openshift-marketplace/marketplace-operator-79b997595-z7ng7" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.034629 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-z7ng7\" (UID: \"b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320\") " pod="openshift-marketplace/marketplace-operator-79b997595-z7ng7" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.135939 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-z7ng7\" (UID: \"b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320\") " pod="openshift-marketplace/marketplace-operator-79b997595-z7ng7" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.136006 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h75b\" (UniqueName: \"kubernetes.io/projected/b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320-kube-api-access-9h75b\") pod \"marketplace-operator-79b997595-z7ng7\" (UID: \"b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320\") " pod="openshift-marketplace/marketplace-operator-79b997595-z7ng7" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.136078 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-z7ng7\" (UID: \"b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-z7ng7" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.137513 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-z7ng7\" (UID: \"b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320\") " pod="openshift-marketplace/marketplace-operator-79b997595-z7ng7" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.149142 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-z7ng7\" (UID: \"b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320\") " pod="openshift-marketplace/marketplace-operator-79b997595-z7ng7" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.153809 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h75b\" (UniqueName: \"kubernetes.io/projected/b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320-kube-api-access-9h75b\") pod \"marketplace-operator-79b997595-z7ng7\" (UID: \"b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320\") " pod="openshift-marketplace/marketplace-operator-79b997595-z7ng7" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.302496 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-z7ng7" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.397129 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-twh8h" Mar 13 14:03:58 crc kubenswrapper[4898]: E0313 14:03:58.430575 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cd39a5b62c38cd0e4291eb452dc66dbfa73d20085df0f04f24d87193b1c4faf8 is running failed: container process not found" containerID="cd39a5b62c38cd0e4291eb452dc66dbfa73d20085df0f04f24d87193b1c4faf8" cmd=["grpc_health_probe","-addr=:50051"] Mar 13 14:03:58 crc kubenswrapper[4898]: E0313 14:03:58.431022 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cd39a5b62c38cd0e4291eb452dc66dbfa73d20085df0f04f24d87193b1c4faf8 is running failed: container process not found" containerID="cd39a5b62c38cd0e4291eb452dc66dbfa73d20085df0f04f24d87193b1c4faf8" cmd=["grpc_health_probe","-addr=:50051"] Mar 13 14:03:58 crc kubenswrapper[4898]: E0313 14:03:58.431275 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cd39a5b62c38cd0e4291eb452dc66dbfa73d20085df0f04f24d87193b1c4faf8 is running failed: container process not found" containerID="cd39a5b62c38cd0e4291eb452dc66dbfa73d20085df0f04f24d87193b1c4faf8" cmd=["grpc_health_probe","-addr=:50051"] Mar 13 14:03:58 crc kubenswrapper[4898]: E0313 14:03:58.431301 4898 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cd39a5b62c38cd0e4291eb452dc66dbfa73d20085df0f04f24d87193b1c4faf8 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-h97c9" podUID="f85f72a8-3887-4867-8a9c-649992ce23f1" containerName="registry-server" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.441334 4898 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f81bcfc-3c35-48e8-a584-961351e8c0e2-utilities\") pod \"8f81bcfc-3c35-48e8-a584-961351e8c0e2\" (UID: \"8f81bcfc-3c35-48e8-a584-961351e8c0e2\") " Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.441392 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5x728\" (UniqueName: \"kubernetes.io/projected/8f81bcfc-3c35-48e8-a584-961351e8c0e2-kube-api-access-5x728\") pod \"8f81bcfc-3c35-48e8-a584-961351e8c0e2\" (UID: \"8f81bcfc-3c35-48e8-a584-961351e8c0e2\") " Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.441459 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f81bcfc-3c35-48e8-a584-961351e8c0e2-catalog-content\") pod \"8f81bcfc-3c35-48e8-a584-961351e8c0e2\" (UID: \"8f81bcfc-3c35-48e8-a584-961351e8c0e2\") " Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.442391 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f81bcfc-3c35-48e8-a584-961351e8c0e2-utilities" (OuterVolumeSpecName: "utilities") pod "8f81bcfc-3c35-48e8-a584-961351e8c0e2" (UID: "8f81bcfc-3c35-48e8-a584-961351e8c0e2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.455398 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f81bcfc-3c35-48e8-a584-961351e8c0e2-kube-api-access-5x728" (OuterVolumeSpecName: "kube-api-access-5x728") pod "8f81bcfc-3c35-48e8-a584-961351e8c0e2" (UID: "8f81bcfc-3c35-48e8-a584-961351e8c0e2"). InnerVolumeSpecName "kube-api-access-5x728". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.483358 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-974qp" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.485702 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.488544 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h97c9" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.490363 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dvvz2" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.517318 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f81bcfc-3c35-48e8-a584-961351e8c0e2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8f81bcfc-3c35-48e8-a584-961351e8c0e2" (UID: "8f81bcfc-3c35-48e8-a584-961351e8c0e2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.542926 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/183d86e9-cd5c-45ed-a460-bb6169e07c72-catalog-content\") pod \"183d86e9-cd5c-45ed-a460-bb6169e07c72\" (UID: \"183d86e9-cd5c-45ed-a460-bb6169e07c72\") " Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.543363 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f81bcfc-3c35-48e8-a584-961351e8c0e2-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.543380 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5x728\" (UniqueName: \"kubernetes.io/projected/8f81bcfc-3c35-48e8-a584-961351e8c0e2-kube-api-access-5x728\") on node \"crc\" DevicePath \"\"" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.543390 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f81bcfc-3c35-48e8-a584-961351e8c0e2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.644280 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnv8v\" (UniqueName: \"kubernetes.io/projected/183d86e9-cd5c-45ed-a460-bb6169e07c72-kube-api-access-vnv8v\") pod \"183d86e9-cd5c-45ed-a460-bb6169e07c72\" (UID: \"183d86e9-cd5c-45ed-a460-bb6169e07c72\") " Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.644337 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43acaee8-efc8-4156-b28c-b493f241ac53-catalog-content\") pod \"43acaee8-efc8-4156-b28c-b493f241ac53\" (UID: \"43acaee8-efc8-4156-b28c-b493f241ac53\") " Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 
14:03:58.644364 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43acaee8-efc8-4156-b28c-b493f241ac53-utilities\") pod \"43acaee8-efc8-4156-b28c-b493f241ac53\" (UID: \"43acaee8-efc8-4156-b28c-b493f241ac53\") " Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.644391 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhlq2\" (UniqueName: \"kubernetes.io/projected/43acaee8-efc8-4156-b28c-b493f241ac53-kube-api-access-zhlq2\") pod \"43acaee8-efc8-4156-b28c-b493f241ac53\" (UID: \"43acaee8-efc8-4156-b28c-b493f241ac53\") " Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.644473 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/183d86e9-cd5c-45ed-a460-bb6169e07c72-utilities\") pod \"183d86e9-cd5c-45ed-a460-bb6169e07c72\" (UID: \"183d86e9-cd5c-45ed-a460-bb6169e07c72\") " Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.644513 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0a78868f-1786-430d-8df8-18bb1c2019b3-marketplace-operator-metrics\") pod \"0a78868f-1786-430d-8df8-18bb1c2019b3\" (UID: \"0a78868f-1786-430d-8df8-18bb1c2019b3\") " Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.644550 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f85f72a8-3887-4867-8a9c-649992ce23f1-catalog-content\") pod \"f85f72a8-3887-4867-8a9c-649992ce23f1\" (UID: \"f85f72a8-3887-4867-8a9c-649992ce23f1\") " Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.644591 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbnpf\" (UniqueName: 
\"kubernetes.io/projected/0a78868f-1786-430d-8df8-18bb1c2019b3-kube-api-access-rbnpf\") pod \"0a78868f-1786-430d-8df8-18bb1c2019b3\" (UID: \"0a78868f-1786-430d-8df8-18bb1c2019b3\") " Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.644615 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f85f72a8-3887-4867-8a9c-649992ce23f1-utilities\") pod \"f85f72a8-3887-4867-8a9c-649992ce23f1\" (UID: \"f85f72a8-3887-4867-8a9c-649992ce23f1\") " Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.644643 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5s4l\" (UniqueName: \"kubernetes.io/projected/f85f72a8-3887-4867-8a9c-649992ce23f1-kube-api-access-m5s4l\") pod \"f85f72a8-3887-4867-8a9c-649992ce23f1\" (UID: \"f85f72a8-3887-4867-8a9c-649992ce23f1\") " Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.644682 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a78868f-1786-430d-8df8-18bb1c2019b3-marketplace-trusted-ca\") pod \"0a78868f-1786-430d-8df8-18bb1c2019b3\" (UID: \"0a78868f-1786-430d-8df8-18bb1c2019b3\") " Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.645773 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43acaee8-efc8-4156-b28c-b493f241ac53-utilities" (OuterVolumeSpecName: "utilities") pod "43acaee8-efc8-4156-b28c-b493f241ac53" (UID: "43acaee8-efc8-4156-b28c-b493f241ac53"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.645808 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f85f72a8-3887-4867-8a9c-649992ce23f1-utilities" (OuterVolumeSpecName: "utilities") pod "f85f72a8-3887-4867-8a9c-649992ce23f1" (UID: "f85f72a8-3887-4867-8a9c-649992ce23f1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.646131 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/183d86e9-cd5c-45ed-a460-bb6169e07c72-utilities" (OuterVolumeSpecName: "utilities") pod "183d86e9-cd5c-45ed-a460-bb6169e07c72" (UID: "183d86e9-cd5c-45ed-a460-bb6169e07c72"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.648632 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a78868f-1786-430d-8df8-18bb1c2019b3-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "0a78868f-1786-430d-8df8-18bb1c2019b3" (UID: "0a78868f-1786-430d-8df8-18bb1c2019b3"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.649177 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f85f72a8-3887-4867-8a9c-649992ce23f1-kube-api-access-m5s4l" (OuterVolumeSpecName: "kube-api-access-m5s4l") pod "f85f72a8-3887-4867-8a9c-649992ce23f1" (UID: "f85f72a8-3887-4867-8a9c-649992ce23f1"). InnerVolumeSpecName "kube-api-access-m5s4l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.649534 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a78868f-1786-430d-8df8-18bb1c2019b3-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "0a78868f-1786-430d-8df8-18bb1c2019b3" (UID: "0a78868f-1786-430d-8df8-18bb1c2019b3"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.649199 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43acaee8-efc8-4156-b28c-b493f241ac53-kube-api-access-zhlq2" (OuterVolumeSpecName: "kube-api-access-zhlq2") pod "43acaee8-efc8-4156-b28c-b493f241ac53" (UID: "43acaee8-efc8-4156-b28c-b493f241ac53"). InnerVolumeSpecName "kube-api-access-zhlq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.652078 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/183d86e9-cd5c-45ed-a460-bb6169e07c72-kube-api-access-vnv8v" (OuterVolumeSpecName: "kube-api-access-vnv8v") pod "183d86e9-cd5c-45ed-a460-bb6169e07c72" (UID: "183d86e9-cd5c-45ed-a460-bb6169e07c72"). InnerVolumeSpecName "kube-api-access-vnv8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.652878 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a78868f-1786-430d-8df8-18bb1c2019b3-kube-api-access-rbnpf" (OuterVolumeSpecName: "kube-api-access-rbnpf") pod "0a78868f-1786-430d-8df8-18bb1c2019b3" (UID: "0a78868f-1786-430d-8df8-18bb1c2019b3"). InnerVolumeSpecName "kube-api-access-rbnpf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.682133 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/183d86e9-cd5c-45ed-a460-bb6169e07c72-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "183d86e9-cd5c-45ed-a460-bb6169e07c72" (UID: "183d86e9-cd5c-45ed-a460-bb6169e07c72"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.685249 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f85f72a8-3887-4867-8a9c-649992ce23f1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f85f72a8-3887-4867-8a9c-649992ce23f1" (UID: "f85f72a8-3887-4867-8a9c-649992ce23f1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.699762 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43acaee8-efc8-4156-b28c-b493f241ac53-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "43acaee8-efc8-4156-b28c-b493f241ac53" (UID: "43acaee8-efc8-4156-b28c-b493f241ac53"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.738856 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-z7ng7"] Mar 13 14:03:58 crc kubenswrapper[4898]: W0313 14:03:58.744801 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8942bb7_1cd2_49b9_8d98_5ba4c5f6c320.slice/crio-7e9d8f7e38cb2d318f8401c5b79ee3fe1e97c72f16f88351b75ec47ffe85f807 WatchSource:0}: Error finding container 7e9d8f7e38cb2d318f8401c5b79ee3fe1e97c72f16f88351b75ec47ffe85f807: Status 404 returned error can't find the container with id 7e9d8f7e38cb2d318f8401c5b79ee3fe1e97c72f16f88351b75ec47ffe85f807 Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.745590 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/183d86e9-cd5c-45ed-a460-bb6169e07c72-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.745621 4898 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0a78868f-1786-430d-8df8-18bb1c2019b3-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.745633 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f85f72a8-3887-4867-8a9c-649992ce23f1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.745642 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbnpf\" (UniqueName: \"kubernetes.io/projected/0a78868f-1786-430d-8df8-18bb1c2019b3-kube-api-access-rbnpf\") on node \"crc\" DevicePath \"\"" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.745652 4898 reconciler_common.go:293] "Volume detached for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f85f72a8-3887-4867-8a9c-649992ce23f1-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.745660 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5s4l\" (UniqueName: \"kubernetes.io/projected/f85f72a8-3887-4867-8a9c-649992ce23f1-kube-api-access-m5s4l\") on node \"crc\" DevicePath \"\"" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.745668 4898 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a78868f-1786-430d-8df8-18bb1c2019b3-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.746080 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnv8v\" (UniqueName: \"kubernetes.io/projected/183d86e9-cd5c-45ed-a460-bb6169e07c72-kube-api-access-vnv8v\") on node \"crc\" DevicePath \"\"" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.746352 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43acaee8-efc8-4156-b28c-b493f241ac53-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.746365 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43acaee8-efc8-4156-b28c-b493f241ac53-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.746375 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhlq2\" (UniqueName: \"kubernetes.io/projected/43acaee8-efc8-4156-b28c-b493f241ac53-kube-api-access-zhlq2\") on node \"crc\" DevicePath \"\"" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.746383 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/183d86e9-cd5c-45ed-a460-bb6169e07c72-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.840342 4898 generic.go:334] "Generic (PLEG): container finished" podID="f85f72a8-3887-4867-8a9c-649992ce23f1" containerID="cd39a5b62c38cd0e4291eb452dc66dbfa73d20085df0f04f24d87193b1c4faf8" exitCode=0 Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.840417 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h97c9" event={"ID":"f85f72a8-3887-4867-8a9c-649992ce23f1","Type":"ContainerDied","Data":"cd39a5b62c38cd0e4291eb452dc66dbfa73d20085df0f04f24d87193b1c4faf8"} Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.840447 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h97c9" event={"ID":"f85f72a8-3887-4867-8a9c-649992ce23f1","Type":"ContainerDied","Data":"da11d51940a63fb9fd52ba5896a1fe2ba45d932b66b7a36000029b7816a483fc"} Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.840466 4898 scope.go:117] "RemoveContainer" containerID="cd39a5b62c38cd0e4291eb452dc66dbfa73d20085df0f04f24d87193b1c4faf8" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.840828 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h97c9" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.843708 4898 generic.go:334] "Generic (PLEG): container finished" podID="43acaee8-efc8-4156-b28c-b493f241ac53" containerID="8a40593eea81d6a95d388d6b35cd414db22d496cba0a3b511f6c3c4af3e4b8ec" exitCode=0 Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.843756 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dvvz2" event={"ID":"43acaee8-efc8-4156-b28c-b493f241ac53","Type":"ContainerDied","Data":"8a40593eea81d6a95d388d6b35cd414db22d496cba0a3b511f6c3c4af3e4b8ec"} Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.843775 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dvvz2" event={"ID":"43acaee8-efc8-4156-b28c-b493f241ac53","Type":"ContainerDied","Data":"6d70382f54646dad1c6a01020a09851e8f00eda076ad91d5aba2e586ae668444"} Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.843822 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dvvz2" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.852654 4898 generic.go:334] "Generic (PLEG): container finished" podID="183d86e9-cd5c-45ed-a460-bb6169e07c72" containerID="b28ca2f5572caf9aa06fca178d1a31d55764b021494704172c96d7af68b09635" exitCode=0 Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.852703 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-974qp" event={"ID":"183d86e9-cd5c-45ed-a460-bb6169e07c72","Type":"ContainerDied","Data":"b28ca2f5572caf9aa06fca178d1a31d55764b021494704172c96d7af68b09635"} Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.852725 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-974qp" event={"ID":"183d86e9-cd5c-45ed-a460-bb6169e07c72","Type":"ContainerDied","Data":"3597e8f057d81527e3da3a21c0723e2cabe95896c1c2879fe09fef6825e9aab7"} Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.852780 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-974qp" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.855475 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-z7ng7" event={"ID":"b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320","Type":"ContainerStarted","Data":"7e9d8f7e38cb2d318f8401c5b79ee3fe1e97c72f16f88351b75ec47ffe85f807"} Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.859520 4898 generic.go:334] "Generic (PLEG): container finished" podID="0a78868f-1786-430d-8df8-18bb1c2019b3" containerID="fee68a52f02b5d009748f7d22e4efe8b71e5cbe2b4ec6b512eae62294cde6d24" exitCode=0 Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.859565 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" event={"ID":"0a78868f-1786-430d-8df8-18bb1c2019b3","Type":"ContainerDied","Data":"fee68a52f02b5d009748f7d22e4efe8b71e5cbe2b4ec6b512eae62294cde6d24"} Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.859601 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.859616 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p8r99" event={"ID":"0a78868f-1786-430d-8df8-18bb1c2019b3","Type":"ContainerDied","Data":"5ce4caec01bc9ee8df0b59f3f0251f9037b82e485a55597652071608caca296b"} Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.860952 4898 scope.go:117] "RemoveContainer" containerID="8a033fa272e9b0ae10a8a39302b03fd524ffa35265e29f3d0f8c05e19edc4d0d" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.865426 4898 generic.go:334] "Generic (PLEG): container finished" podID="8f81bcfc-3c35-48e8-a584-961351e8c0e2" containerID="c0d126f66fb80fd38ad4cce383bbe14103ead798e0605b7596e1d4e7e5d8dd4c" exitCode=0 Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.865512 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-twh8h" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.865526 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twh8h" event={"ID":"8f81bcfc-3c35-48e8-a584-961351e8c0e2","Type":"ContainerDied","Data":"c0d126f66fb80fd38ad4cce383bbe14103ead798e0605b7596e1d4e7e5d8dd4c"} Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.865966 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twh8h" event={"ID":"8f81bcfc-3c35-48e8-a584-961351e8c0e2","Type":"ContainerDied","Data":"81fb34feaf2adf00d5d07da217b484c8e9d6cdeb7a039901668613864eddf170"} Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.880713 4898 scope.go:117] "RemoveContainer" containerID="6f2184df8da4b3bf69f4145756b368ff2efd7bf87ea92af146fe995c57cb7485" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.891098 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-dvvz2"] Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.901244 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dvvz2"] Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.914044 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h97c9"] Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.914557 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h97c9"] Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.918114 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-974qp"] Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.925030 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-974qp"] Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.931736 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-twh8h"] Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.939619 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-twh8h"] Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.940874 4898 scope.go:117] "RemoveContainer" containerID="cd39a5b62c38cd0e4291eb452dc66dbfa73d20085df0f04f24d87193b1c4faf8" Mar 13 14:03:58 crc kubenswrapper[4898]: E0313 14:03:58.941461 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd39a5b62c38cd0e4291eb452dc66dbfa73d20085df0f04f24d87193b1c4faf8\": container with ID starting with cd39a5b62c38cd0e4291eb452dc66dbfa73d20085df0f04f24d87193b1c4faf8 not found: ID does not exist" containerID="cd39a5b62c38cd0e4291eb452dc66dbfa73d20085df0f04f24d87193b1c4faf8" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.941494 4898 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd39a5b62c38cd0e4291eb452dc66dbfa73d20085df0f04f24d87193b1c4faf8"} err="failed to get container status \"cd39a5b62c38cd0e4291eb452dc66dbfa73d20085df0f04f24d87193b1c4faf8\": rpc error: code = NotFound desc = could not find container \"cd39a5b62c38cd0e4291eb452dc66dbfa73d20085df0f04f24d87193b1c4faf8\": container with ID starting with cd39a5b62c38cd0e4291eb452dc66dbfa73d20085df0f04f24d87193b1c4faf8 not found: ID does not exist" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.941523 4898 scope.go:117] "RemoveContainer" containerID="8a033fa272e9b0ae10a8a39302b03fd524ffa35265e29f3d0f8c05e19edc4d0d" Mar 13 14:03:58 crc kubenswrapper[4898]: E0313 14:03:58.941759 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a033fa272e9b0ae10a8a39302b03fd524ffa35265e29f3d0f8c05e19edc4d0d\": container with ID starting with 8a033fa272e9b0ae10a8a39302b03fd524ffa35265e29f3d0f8c05e19edc4d0d not found: ID does not exist" containerID="8a033fa272e9b0ae10a8a39302b03fd524ffa35265e29f3d0f8c05e19edc4d0d" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.941782 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a033fa272e9b0ae10a8a39302b03fd524ffa35265e29f3d0f8c05e19edc4d0d"} err="failed to get container status \"8a033fa272e9b0ae10a8a39302b03fd524ffa35265e29f3d0f8c05e19edc4d0d\": rpc error: code = NotFound desc = could not find container \"8a033fa272e9b0ae10a8a39302b03fd524ffa35265e29f3d0f8c05e19edc4d0d\": container with ID starting with 8a033fa272e9b0ae10a8a39302b03fd524ffa35265e29f3d0f8c05e19edc4d0d not found: ID does not exist" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.941799 4898 scope.go:117] "RemoveContainer" containerID="6f2184df8da4b3bf69f4145756b368ff2efd7bf87ea92af146fe995c57cb7485" Mar 13 14:03:58 crc kubenswrapper[4898]: E0313 
14:03:58.942061 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f2184df8da4b3bf69f4145756b368ff2efd7bf87ea92af146fe995c57cb7485\": container with ID starting with 6f2184df8da4b3bf69f4145756b368ff2efd7bf87ea92af146fe995c57cb7485 not found: ID does not exist" containerID="6f2184df8da4b3bf69f4145756b368ff2efd7bf87ea92af146fe995c57cb7485" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.942101 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f2184df8da4b3bf69f4145756b368ff2efd7bf87ea92af146fe995c57cb7485"} err="failed to get container status \"6f2184df8da4b3bf69f4145756b368ff2efd7bf87ea92af146fe995c57cb7485\": rpc error: code = NotFound desc = could not find container \"6f2184df8da4b3bf69f4145756b368ff2efd7bf87ea92af146fe995c57cb7485\": container with ID starting with 6f2184df8da4b3bf69f4145756b368ff2efd7bf87ea92af146fe995c57cb7485 not found: ID does not exist" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.942120 4898 scope.go:117] "RemoveContainer" containerID="8a40593eea81d6a95d388d6b35cd414db22d496cba0a3b511f6c3c4af3e4b8ec" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.951109 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p8r99"] Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.955235 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p8r99"] Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.956290 4898 scope.go:117] "RemoveContainer" containerID="166addc84a00d6cd28f5d4a11eaa406e638fc978d5ab44d8f9525754ee76c77b" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.971514 4898 scope.go:117] "RemoveContainer" containerID="2d6a6fc4d86890be4033989a74b2cd86971250c91cd72a349673fbbc352230cf" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.991415 4898 scope.go:117] 
"RemoveContainer" containerID="8a40593eea81d6a95d388d6b35cd414db22d496cba0a3b511f6c3c4af3e4b8ec" Mar 13 14:03:58 crc kubenswrapper[4898]: E0313 14:03:58.991884 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a40593eea81d6a95d388d6b35cd414db22d496cba0a3b511f6c3c4af3e4b8ec\": container with ID starting with 8a40593eea81d6a95d388d6b35cd414db22d496cba0a3b511f6c3c4af3e4b8ec not found: ID does not exist" containerID="8a40593eea81d6a95d388d6b35cd414db22d496cba0a3b511f6c3c4af3e4b8ec" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.991952 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a40593eea81d6a95d388d6b35cd414db22d496cba0a3b511f6c3c4af3e4b8ec"} err="failed to get container status \"8a40593eea81d6a95d388d6b35cd414db22d496cba0a3b511f6c3c4af3e4b8ec\": rpc error: code = NotFound desc = could not find container \"8a40593eea81d6a95d388d6b35cd414db22d496cba0a3b511f6c3c4af3e4b8ec\": container with ID starting with 8a40593eea81d6a95d388d6b35cd414db22d496cba0a3b511f6c3c4af3e4b8ec not found: ID does not exist" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.991982 4898 scope.go:117] "RemoveContainer" containerID="166addc84a00d6cd28f5d4a11eaa406e638fc978d5ab44d8f9525754ee76c77b" Mar 13 14:03:58 crc kubenswrapper[4898]: E0313 14:03:58.992273 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"166addc84a00d6cd28f5d4a11eaa406e638fc978d5ab44d8f9525754ee76c77b\": container with ID starting with 166addc84a00d6cd28f5d4a11eaa406e638fc978d5ab44d8f9525754ee76c77b not found: ID does not exist" containerID="166addc84a00d6cd28f5d4a11eaa406e638fc978d5ab44d8f9525754ee76c77b" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.992302 4898 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"166addc84a00d6cd28f5d4a11eaa406e638fc978d5ab44d8f9525754ee76c77b"} err="failed to get container status \"166addc84a00d6cd28f5d4a11eaa406e638fc978d5ab44d8f9525754ee76c77b\": rpc error: code = NotFound desc = could not find container \"166addc84a00d6cd28f5d4a11eaa406e638fc978d5ab44d8f9525754ee76c77b\": container with ID starting with 166addc84a00d6cd28f5d4a11eaa406e638fc978d5ab44d8f9525754ee76c77b not found: ID does not exist" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.992322 4898 scope.go:117] "RemoveContainer" containerID="2d6a6fc4d86890be4033989a74b2cd86971250c91cd72a349673fbbc352230cf" Mar 13 14:03:58 crc kubenswrapper[4898]: E0313 14:03:58.992566 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d6a6fc4d86890be4033989a74b2cd86971250c91cd72a349673fbbc352230cf\": container with ID starting with 2d6a6fc4d86890be4033989a74b2cd86971250c91cd72a349673fbbc352230cf not found: ID does not exist" containerID="2d6a6fc4d86890be4033989a74b2cd86971250c91cd72a349673fbbc352230cf" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.992589 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d6a6fc4d86890be4033989a74b2cd86971250c91cd72a349673fbbc352230cf"} err="failed to get container status \"2d6a6fc4d86890be4033989a74b2cd86971250c91cd72a349673fbbc352230cf\": rpc error: code = NotFound desc = could not find container \"2d6a6fc4d86890be4033989a74b2cd86971250c91cd72a349673fbbc352230cf\": container with ID starting with 2d6a6fc4d86890be4033989a74b2cd86971250c91cd72a349673fbbc352230cf not found: ID does not exist" Mar 13 14:03:58 crc kubenswrapper[4898]: I0313 14:03:58.992604 4898 scope.go:117] "RemoveContainer" containerID="b28ca2f5572caf9aa06fca178d1a31d55764b021494704172c96d7af68b09635" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.006448 4898 scope.go:117] "RemoveContainer" 
containerID="4e3c2fc49e38fd08a1405311e1eced3f11241c9df7c680ba46b64e3f946fea47" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.031327 4898 scope.go:117] "RemoveContainer" containerID="7888e10b86e2b6b4b1a521af790a300e700cb557e90ae4215808993511904248" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.044517 4898 scope.go:117] "RemoveContainer" containerID="b28ca2f5572caf9aa06fca178d1a31d55764b021494704172c96d7af68b09635" Mar 13 14:03:59 crc kubenswrapper[4898]: E0313 14:03:59.044882 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b28ca2f5572caf9aa06fca178d1a31d55764b021494704172c96d7af68b09635\": container with ID starting with b28ca2f5572caf9aa06fca178d1a31d55764b021494704172c96d7af68b09635 not found: ID does not exist" containerID="b28ca2f5572caf9aa06fca178d1a31d55764b021494704172c96d7af68b09635" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.044950 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b28ca2f5572caf9aa06fca178d1a31d55764b021494704172c96d7af68b09635"} err="failed to get container status \"b28ca2f5572caf9aa06fca178d1a31d55764b021494704172c96d7af68b09635\": rpc error: code = NotFound desc = could not find container \"b28ca2f5572caf9aa06fca178d1a31d55764b021494704172c96d7af68b09635\": container with ID starting with b28ca2f5572caf9aa06fca178d1a31d55764b021494704172c96d7af68b09635 not found: ID does not exist" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.044979 4898 scope.go:117] "RemoveContainer" containerID="4e3c2fc49e38fd08a1405311e1eced3f11241c9df7c680ba46b64e3f946fea47" Mar 13 14:03:59 crc kubenswrapper[4898]: E0313 14:03:59.045287 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e3c2fc49e38fd08a1405311e1eced3f11241c9df7c680ba46b64e3f946fea47\": container with ID starting with 
4e3c2fc49e38fd08a1405311e1eced3f11241c9df7c680ba46b64e3f946fea47 not found: ID does not exist" containerID="4e3c2fc49e38fd08a1405311e1eced3f11241c9df7c680ba46b64e3f946fea47" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.045320 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e3c2fc49e38fd08a1405311e1eced3f11241c9df7c680ba46b64e3f946fea47"} err="failed to get container status \"4e3c2fc49e38fd08a1405311e1eced3f11241c9df7c680ba46b64e3f946fea47\": rpc error: code = NotFound desc = could not find container \"4e3c2fc49e38fd08a1405311e1eced3f11241c9df7c680ba46b64e3f946fea47\": container with ID starting with 4e3c2fc49e38fd08a1405311e1eced3f11241c9df7c680ba46b64e3f946fea47 not found: ID does not exist" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.045346 4898 scope.go:117] "RemoveContainer" containerID="7888e10b86e2b6b4b1a521af790a300e700cb557e90ae4215808993511904248" Mar 13 14:03:59 crc kubenswrapper[4898]: E0313 14:03:59.045661 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7888e10b86e2b6b4b1a521af790a300e700cb557e90ae4215808993511904248\": container with ID starting with 7888e10b86e2b6b4b1a521af790a300e700cb557e90ae4215808993511904248 not found: ID does not exist" containerID="7888e10b86e2b6b4b1a521af790a300e700cb557e90ae4215808993511904248" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.045683 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7888e10b86e2b6b4b1a521af790a300e700cb557e90ae4215808993511904248"} err="failed to get container status \"7888e10b86e2b6b4b1a521af790a300e700cb557e90ae4215808993511904248\": rpc error: code = NotFound desc = could not find container \"7888e10b86e2b6b4b1a521af790a300e700cb557e90ae4215808993511904248\": container with ID starting with 7888e10b86e2b6b4b1a521af790a300e700cb557e90ae4215808993511904248 not found: ID does not 
exist" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.045701 4898 scope.go:117] "RemoveContainer" containerID="fee68a52f02b5d009748f7d22e4efe8b71e5cbe2b4ec6b512eae62294cde6d24" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.057697 4898 scope.go:117] "RemoveContainer" containerID="66a8b7ab3a08de395e71354315a06c65dae8eb185d93ccdee2c35ad093ab2e67" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.072438 4898 scope.go:117] "RemoveContainer" containerID="fee68a52f02b5d009748f7d22e4efe8b71e5cbe2b4ec6b512eae62294cde6d24" Mar 13 14:03:59 crc kubenswrapper[4898]: E0313 14:03:59.072765 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fee68a52f02b5d009748f7d22e4efe8b71e5cbe2b4ec6b512eae62294cde6d24\": container with ID starting with fee68a52f02b5d009748f7d22e4efe8b71e5cbe2b4ec6b512eae62294cde6d24 not found: ID does not exist" containerID="fee68a52f02b5d009748f7d22e4efe8b71e5cbe2b4ec6b512eae62294cde6d24" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.072797 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fee68a52f02b5d009748f7d22e4efe8b71e5cbe2b4ec6b512eae62294cde6d24"} err="failed to get container status \"fee68a52f02b5d009748f7d22e4efe8b71e5cbe2b4ec6b512eae62294cde6d24\": rpc error: code = NotFound desc = could not find container \"fee68a52f02b5d009748f7d22e4efe8b71e5cbe2b4ec6b512eae62294cde6d24\": container with ID starting with fee68a52f02b5d009748f7d22e4efe8b71e5cbe2b4ec6b512eae62294cde6d24 not found: ID does not exist" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.072825 4898 scope.go:117] "RemoveContainer" containerID="66a8b7ab3a08de395e71354315a06c65dae8eb185d93ccdee2c35ad093ab2e67" Mar 13 14:03:59 crc kubenswrapper[4898]: E0313 14:03:59.073049 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"66a8b7ab3a08de395e71354315a06c65dae8eb185d93ccdee2c35ad093ab2e67\": container with ID starting with 66a8b7ab3a08de395e71354315a06c65dae8eb185d93ccdee2c35ad093ab2e67 not found: ID does not exist" containerID="66a8b7ab3a08de395e71354315a06c65dae8eb185d93ccdee2c35ad093ab2e67" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.073072 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66a8b7ab3a08de395e71354315a06c65dae8eb185d93ccdee2c35ad093ab2e67"} err="failed to get container status \"66a8b7ab3a08de395e71354315a06c65dae8eb185d93ccdee2c35ad093ab2e67\": rpc error: code = NotFound desc = could not find container \"66a8b7ab3a08de395e71354315a06c65dae8eb185d93ccdee2c35ad093ab2e67\": container with ID starting with 66a8b7ab3a08de395e71354315a06c65dae8eb185d93ccdee2c35ad093ab2e67 not found: ID does not exist" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.073088 4898 scope.go:117] "RemoveContainer" containerID="c0d126f66fb80fd38ad4cce383bbe14103ead798e0605b7596e1d4e7e5d8dd4c" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.121927 4898 scope.go:117] "RemoveContainer" containerID="2aacfa448de7533468427cf155ae3ec5563cff1d3313d0b6259a3abd6879e336" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.134187 4898 scope.go:117] "RemoveContainer" containerID="af2b26d62c785f829d6f729b05b9a482b0b3ab930c91a34e55f9b679910cf380" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.150969 4898 scope.go:117] "RemoveContainer" containerID="c0d126f66fb80fd38ad4cce383bbe14103ead798e0605b7596e1d4e7e5d8dd4c" Mar 13 14:03:59 crc kubenswrapper[4898]: E0313 14:03:59.151366 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0d126f66fb80fd38ad4cce383bbe14103ead798e0605b7596e1d4e7e5d8dd4c\": container with ID starting with c0d126f66fb80fd38ad4cce383bbe14103ead798e0605b7596e1d4e7e5d8dd4c not found: ID does not exist" 
containerID="c0d126f66fb80fd38ad4cce383bbe14103ead798e0605b7596e1d4e7e5d8dd4c" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.151403 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0d126f66fb80fd38ad4cce383bbe14103ead798e0605b7596e1d4e7e5d8dd4c"} err="failed to get container status \"c0d126f66fb80fd38ad4cce383bbe14103ead798e0605b7596e1d4e7e5d8dd4c\": rpc error: code = NotFound desc = could not find container \"c0d126f66fb80fd38ad4cce383bbe14103ead798e0605b7596e1d4e7e5d8dd4c\": container with ID starting with c0d126f66fb80fd38ad4cce383bbe14103ead798e0605b7596e1d4e7e5d8dd4c not found: ID does not exist" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.151432 4898 scope.go:117] "RemoveContainer" containerID="2aacfa448de7533468427cf155ae3ec5563cff1d3313d0b6259a3abd6879e336" Mar 13 14:03:59 crc kubenswrapper[4898]: E0313 14:03:59.151723 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2aacfa448de7533468427cf155ae3ec5563cff1d3313d0b6259a3abd6879e336\": container with ID starting with 2aacfa448de7533468427cf155ae3ec5563cff1d3313d0b6259a3abd6879e336 not found: ID does not exist" containerID="2aacfa448de7533468427cf155ae3ec5563cff1d3313d0b6259a3abd6879e336" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.151749 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2aacfa448de7533468427cf155ae3ec5563cff1d3313d0b6259a3abd6879e336"} err="failed to get container status \"2aacfa448de7533468427cf155ae3ec5563cff1d3313d0b6259a3abd6879e336\": rpc error: code = NotFound desc = could not find container \"2aacfa448de7533468427cf155ae3ec5563cff1d3313d0b6259a3abd6879e336\": container with ID starting with 2aacfa448de7533468427cf155ae3ec5563cff1d3313d0b6259a3abd6879e336 not found: ID does not exist" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.151766 4898 scope.go:117] 
"RemoveContainer" containerID="af2b26d62c785f829d6f729b05b9a482b0b3ab930c91a34e55f9b679910cf380" Mar 13 14:03:59 crc kubenswrapper[4898]: E0313 14:03:59.152204 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af2b26d62c785f829d6f729b05b9a482b0b3ab930c91a34e55f9b679910cf380\": container with ID starting with af2b26d62c785f829d6f729b05b9a482b0b3ab930c91a34e55f9b679910cf380 not found: ID does not exist" containerID="af2b26d62c785f829d6f729b05b9a482b0b3ab930c91a34e55f9b679910cf380" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.152230 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af2b26d62c785f829d6f729b05b9a482b0b3ab930c91a34e55f9b679910cf380"} err="failed to get container status \"af2b26d62c785f829d6f729b05b9a482b0b3ab930c91a34e55f9b679910cf380\": rpc error: code = NotFound desc = could not find container \"af2b26d62c785f829d6f729b05b9a482b0b3ab930c91a34e55f9b679910cf380\": container with ID starting with af2b26d62c785f829d6f729b05b9a482b0b3ab930c91a34e55f9b679910cf380 not found: ID does not exist" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.751830 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a78868f-1786-430d-8df8-18bb1c2019b3" path="/var/lib/kubelet/pods/0a78868f-1786-430d-8df8-18bb1c2019b3/volumes" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.753868 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="183d86e9-cd5c-45ed-a460-bb6169e07c72" path="/var/lib/kubelet/pods/183d86e9-cd5c-45ed-a460-bb6169e07c72/volumes" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.755654 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43acaee8-efc8-4156-b28c-b493f241ac53" path="/var/lib/kubelet/pods/43acaee8-efc8-4156-b28c-b493f241ac53/volumes" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.757788 4898 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="8f81bcfc-3c35-48e8-a584-961351e8c0e2" path="/var/lib/kubelet/pods/8f81bcfc-3c35-48e8-a584-961351e8c0e2/volumes" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.759110 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85f72a8-3887-4867-8a9c-649992ce23f1" path="/var/lib/kubelet/pods/f85f72a8-3887-4867-8a9c-649992ce23f1/volumes" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.880320 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-z7ng7" event={"ID":"b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320","Type":"ContainerStarted","Data":"eb2daa7a5834deb74ab00e016de96548278888d4dc2100cb3f45f5181eb8442b"} Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.880712 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-z7ng7" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.885079 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-z7ng7" Mar 13 14:03:59 crc kubenswrapper[4898]: I0313 14:03:59.896579 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-z7ng7" podStartSLOduration=2.896560859 podStartE2EDuration="2.896560859s" podCreationTimestamp="2026-03-13 14:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:03:59.894625776 +0000 UTC m=+474.896214035" watchObservedRunningTime="2026-03-13 14:03:59.896560859 +0000 UTC m=+474.898149098" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.128234 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hkbng"] Mar 13 14:04:00 crc kubenswrapper[4898]: E0313 14:04:00.128466 4898 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="183d86e9-cd5c-45ed-a460-bb6169e07c72" containerName="registry-server" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.128479 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="183d86e9-cd5c-45ed-a460-bb6169e07c72" containerName="registry-server" Mar 13 14:04:00 crc kubenswrapper[4898]: E0313 14:04:00.128491 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85f72a8-3887-4867-8a9c-649992ce23f1" containerName="extract-utilities" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.128499 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85f72a8-3887-4867-8a9c-649992ce23f1" containerName="extract-utilities" Mar 13 14:04:00 crc kubenswrapper[4898]: E0313 14:04:00.128513 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="183d86e9-cd5c-45ed-a460-bb6169e07c72" containerName="extract-utilities" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.128520 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="183d86e9-cd5c-45ed-a460-bb6169e07c72" containerName="extract-utilities" Mar 13 14:04:00 crc kubenswrapper[4898]: E0313 14:04:00.128535 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f81bcfc-3c35-48e8-a584-961351e8c0e2" containerName="registry-server" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.128543 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f81bcfc-3c35-48e8-a584-961351e8c0e2" containerName="registry-server" Mar 13 14:04:00 crc kubenswrapper[4898]: E0313 14:04:00.128553 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a78868f-1786-430d-8df8-18bb1c2019b3" containerName="marketplace-operator" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.128560 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a78868f-1786-430d-8df8-18bb1c2019b3" containerName="marketplace-operator" Mar 13 14:04:00 crc kubenswrapper[4898]: E0313 14:04:00.128569 4898 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f81bcfc-3c35-48e8-a584-961351e8c0e2" containerName="extract-utilities" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.128577 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f81bcfc-3c35-48e8-a584-961351e8c0e2" containerName="extract-utilities" Mar 13 14:04:00 crc kubenswrapper[4898]: E0313 14:04:00.128590 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="183d86e9-cd5c-45ed-a460-bb6169e07c72" containerName="extract-content" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.128597 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="183d86e9-cd5c-45ed-a460-bb6169e07c72" containerName="extract-content" Mar 13 14:04:00 crc kubenswrapper[4898]: E0313 14:04:00.128609 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43acaee8-efc8-4156-b28c-b493f241ac53" containerName="extract-content" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.128616 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="43acaee8-efc8-4156-b28c-b493f241ac53" containerName="extract-content" Mar 13 14:04:00 crc kubenswrapper[4898]: E0313 14:04:00.128624 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f81bcfc-3c35-48e8-a584-961351e8c0e2" containerName="extract-content" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.128633 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f81bcfc-3c35-48e8-a584-961351e8c0e2" containerName="extract-content" Mar 13 14:04:00 crc kubenswrapper[4898]: E0313 14:04:00.128643 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a78868f-1786-430d-8df8-18bb1c2019b3" containerName="marketplace-operator" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.128650 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a78868f-1786-430d-8df8-18bb1c2019b3" containerName="marketplace-operator" Mar 13 14:04:00 crc kubenswrapper[4898]: E0313 14:04:00.128660 4898 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43acaee8-efc8-4156-b28c-b493f241ac53" containerName="extract-utilities" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.128668 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="43acaee8-efc8-4156-b28c-b493f241ac53" containerName="extract-utilities" Mar 13 14:04:00 crc kubenswrapper[4898]: E0313 14:04:00.128677 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43acaee8-efc8-4156-b28c-b493f241ac53" containerName="registry-server" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.128685 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="43acaee8-efc8-4156-b28c-b493f241ac53" containerName="registry-server" Mar 13 14:04:00 crc kubenswrapper[4898]: E0313 14:04:00.128696 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85f72a8-3887-4867-8a9c-649992ce23f1" containerName="extract-content" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.128704 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85f72a8-3887-4867-8a9c-649992ce23f1" containerName="extract-content" Mar 13 14:04:00 crc kubenswrapper[4898]: E0313 14:04:00.128713 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85f72a8-3887-4867-8a9c-649992ce23f1" containerName="registry-server" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.128721 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85f72a8-3887-4867-8a9c-649992ce23f1" containerName="registry-server" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.128821 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a78868f-1786-430d-8df8-18bb1c2019b3" containerName="marketplace-operator" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.128831 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="183d86e9-cd5c-45ed-a460-bb6169e07c72" containerName="registry-server" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.128842 
4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85f72a8-3887-4867-8a9c-649992ce23f1" containerName="registry-server" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.128852 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f81bcfc-3c35-48e8-a584-961351e8c0e2" containerName="registry-server" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.128862 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="43acaee8-efc8-4156-b28c-b493f241ac53" containerName="registry-server" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.128875 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a78868f-1786-430d-8df8-18bb1c2019b3" containerName="marketplace-operator" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.129670 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hkbng" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.131875 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.134380 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556844-q7h28"] Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.135747 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556844-q7h28" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.139344 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.139444 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.140065 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.145999 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hkbng"] Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.153655 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556844-q7h28"] Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.166545 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89abe4ad-dd62-4a70-a1d1-fdf97448ada5-utilities\") pod \"certified-operators-hkbng\" (UID: \"89abe4ad-dd62-4a70-a1d1-fdf97448ada5\") " pod="openshift-marketplace/certified-operators-hkbng" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.166608 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbt6s\" (UniqueName: \"kubernetes.io/projected/89abe4ad-dd62-4a70-a1d1-fdf97448ada5-kube-api-access-jbt6s\") pod \"certified-operators-hkbng\" (UID: \"89abe4ad-dd62-4a70-a1d1-fdf97448ada5\") " pod="openshift-marketplace/certified-operators-hkbng" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.166627 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/89abe4ad-dd62-4a70-a1d1-fdf97448ada5-catalog-content\") pod \"certified-operators-hkbng\" (UID: \"89abe4ad-dd62-4a70-a1d1-fdf97448ada5\") " pod="openshift-marketplace/certified-operators-hkbng" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.166658 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dccvw\" (UniqueName: \"kubernetes.io/projected/cd30282f-65c8-45d8-89f3-c6e2f16662d4-kube-api-access-dccvw\") pod \"auto-csr-approver-29556844-q7h28\" (UID: \"cd30282f-65c8-45d8-89f3-c6e2f16662d4\") " pod="openshift-infra/auto-csr-approver-29556844-q7h28" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.267340 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbt6s\" (UniqueName: \"kubernetes.io/projected/89abe4ad-dd62-4a70-a1d1-fdf97448ada5-kube-api-access-jbt6s\") pod \"certified-operators-hkbng\" (UID: \"89abe4ad-dd62-4a70-a1d1-fdf97448ada5\") " pod="openshift-marketplace/certified-operators-hkbng" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.267391 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89abe4ad-dd62-4a70-a1d1-fdf97448ada5-catalog-content\") pod \"certified-operators-hkbng\" (UID: \"89abe4ad-dd62-4a70-a1d1-fdf97448ada5\") " pod="openshift-marketplace/certified-operators-hkbng" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.267435 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dccvw\" (UniqueName: \"kubernetes.io/projected/cd30282f-65c8-45d8-89f3-c6e2f16662d4-kube-api-access-dccvw\") pod \"auto-csr-approver-29556844-q7h28\" (UID: \"cd30282f-65c8-45d8-89f3-c6e2f16662d4\") " pod="openshift-infra/auto-csr-approver-29556844-q7h28" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.267881 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89abe4ad-dd62-4a70-a1d1-fdf97448ada5-utilities\") pod \"certified-operators-hkbng\" (UID: \"89abe4ad-dd62-4a70-a1d1-fdf97448ada5\") " pod="openshift-marketplace/certified-operators-hkbng" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.268371 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89abe4ad-dd62-4a70-a1d1-fdf97448ada5-catalog-content\") pod \"certified-operators-hkbng\" (UID: \"89abe4ad-dd62-4a70-a1d1-fdf97448ada5\") " pod="openshift-marketplace/certified-operators-hkbng" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.268381 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89abe4ad-dd62-4a70-a1d1-fdf97448ada5-utilities\") pod \"certified-operators-hkbng\" (UID: \"89abe4ad-dd62-4a70-a1d1-fdf97448ada5\") " pod="openshift-marketplace/certified-operators-hkbng" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.287877 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dccvw\" (UniqueName: \"kubernetes.io/projected/cd30282f-65c8-45d8-89f3-c6e2f16662d4-kube-api-access-dccvw\") pod \"auto-csr-approver-29556844-q7h28\" (UID: \"cd30282f-65c8-45d8-89f3-c6e2f16662d4\") " pod="openshift-infra/auto-csr-approver-29556844-q7h28" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.290761 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbt6s\" (UniqueName: \"kubernetes.io/projected/89abe4ad-dd62-4a70-a1d1-fdf97448ada5-kube-api-access-jbt6s\") pod \"certified-operators-hkbng\" (UID: \"89abe4ad-dd62-4a70-a1d1-fdf97448ada5\") " pod="openshift-marketplace/certified-operators-hkbng" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.322551 4898 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-zs42q"] Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.323459 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zs42q" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.326242 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.338465 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zs42q"] Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.468990 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hkbng" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.470444 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0182307e-bc7f-415e-a0f9-0eff9902384c-catalog-content\") pod \"redhat-marketplace-zs42q\" (UID: \"0182307e-bc7f-415e-a0f9-0eff9902384c\") " pod="openshift-marketplace/redhat-marketplace-zs42q" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.470555 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktr7f\" (UniqueName: \"kubernetes.io/projected/0182307e-bc7f-415e-a0f9-0eff9902384c-kube-api-access-ktr7f\") pod \"redhat-marketplace-zs42q\" (UID: \"0182307e-bc7f-415e-a0f9-0eff9902384c\") " pod="openshift-marketplace/redhat-marketplace-zs42q" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.470630 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0182307e-bc7f-415e-a0f9-0eff9902384c-utilities\") pod \"redhat-marketplace-zs42q\" (UID: \"0182307e-bc7f-415e-a0f9-0eff9902384c\") " 
pod="openshift-marketplace/redhat-marketplace-zs42q" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.475241 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556844-q7h28" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.571236 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktr7f\" (UniqueName: \"kubernetes.io/projected/0182307e-bc7f-415e-a0f9-0eff9902384c-kube-api-access-ktr7f\") pod \"redhat-marketplace-zs42q\" (UID: \"0182307e-bc7f-415e-a0f9-0eff9902384c\") " pod="openshift-marketplace/redhat-marketplace-zs42q" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.571288 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0182307e-bc7f-415e-a0f9-0eff9902384c-utilities\") pod \"redhat-marketplace-zs42q\" (UID: \"0182307e-bc7f-415e-a0f9-0eff9902384c\") " pod="openshift-marketplace/redhat-marketplace-zs42q" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.571325 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0182307e-bc7f-415e-a0f9-0eff9902384c-catalog-content\") pod \"redhat-marketplace-zs42q\" (UID: \"0182307e-bc7f-415e-a0f9-0eff9902384c\") " pod="openshift-marketplace/redhat-marketplace-zs42q" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.571979 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0182307e-bc7f-415e-a0f9-0eff9902384c-catalog-content\") pod \"redhat-marketplace-zs42q\" (UID: \"0182307e-bc7f-415e-a0f9-0eff9902384c\") " pod="openshift-marketplace/redhat-marketplace-zs42q" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.572349 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/0182307e-bc7f-415e-a0f9-0eff9902384c-utilities\") pod \"redhat-marketplace-zs42q\" (UID: \"0182307e-bc7f-415e-a0f9-0eff9902384c\") " pod="openshift-marketplace/redhat-marketplace-zs42q" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.597391 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktr7f\" (UniqueName: \"kubernetes.io/projected/0182307e-bc7f-415e-a0f9-0eff9902384c-kube-api-access-ktr7f\") pod \"redhat-marketplace-zs42q\" (UID: \"0182307e-bc7f-415e-a0f9-0eff9902384c\") " pod="openshift-marketplace/redhat-marketplace-zs42q" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.648294 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zs42q" Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.929299 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556844-q7h28"] Mar 13 14:04:00 crc kubenswrapper[4898]: W0313 14:04:00.935879 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd30282f_65c8_45d8_89f3_c6e2f16662d4.slice/crio-752319ba8672dad76676eb802295413482abd90351347d3efb3c54acb0646542 WatchSource:0}: Error finding container 752319ba8672dad76676eb802295413482abd90351347d3efb3c54acb0646542: Status 404 returned error can't find the container with id 752319ba8672dad76676eb802295413482abd90351347d3efb3c54acb0646542 Mar 13 14:04:00 crc kubenswrapper[4898]: I0313 14:04:00.980443 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hkbng"] Mar 13 14:04:00 crc kubenswrapper[4898]: W0313 14:04:00.982828 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89abe4ad_dd62_4a70_a1d1_fdf97448ada5.slice/crio-3c34c6f1ba5d87a1c1ee78207d34fa953dad0e25b7fe710e85f481b7516936ec 
WatchSource:0}: Error finding container 3c34c6f1ba5d87a1c1ee78207d34fa953dad0e25b7fe710e85f481b7516936ec: Status 404 returned error can't find the container with id 3c34c6f1ba5d87a1c1ee78207d34fa953dad0e25b7fe710e85f481b7516936ec Mar 13 14:04:01 crc kubenswrapper[4898]: I0313 14:04:01.067791 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zs42q"] Mar 13 14:04:01 crc kubenswrapper[4898]: W0313 14:04:01.067946 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0182307e_bc7f_415e_a0f9_0eff9902384c.slice/crio-129becc9925ab90f29d2c0f42b27f347625f74a5adb7cf39ed0042115e4b864d WatchSource:0}: Error finding container 129becc9925ab90f29d2c0f42b27f347625f74a5adb7cf39ed0042115e4b864d: Status 404 returned error can't find the container with id 129becc9925ab90f29d2c0f42b27f347625f74a5adb7cf39ed0042115e4b864d Mar 13 14:04:01 crc kubenswrapper[4898]: I0313 14:04:01.896644 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556844-q7h28" event={"ID":"cd30282f-65c8-45d8-89f3-c6e2f16662d4","Type":"ContainerStarted","Data":"752319ba8672dad76676eb802295413482abd90351347d3efb3c54acb0646542"} Mar 13 14:04:01 crc kubenswrapper[4898]: I0313 14:04:01.898509 4898 generic.go:334] "Generic (PLEG): container finished" podID="89abe4ad-dd62-4a70-a1d1-fdf97448ada5" containerID="60bb08d44387c7849286247a4451083d86802422c22c69c6d73ce6c5a8459355" exitCode=0 Mar 13 14:04:01 crc kubenswrapper[4898]: I0313 14:04:01.898598 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hkbng" event={"ID":"89abe4ad-dd62-4a70-a1d1-fdf97448ada5","Type":"ContainerDied","Data":"60bb08d44387c7849286247a4451083d86802422c22c69c6d73ce6c5a8459355"} Mar 13 14:04:01 crc kubenswrapper[4898]: I0313 14:04:01.898629 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-hkbng" event={"ID":"89abe4ad-dd62-4a70-a1d1-fdf97448ada5","Type":"ContainerStarted","Data":"3c34c6f1ba5d87a1c1ee78207d34fa953dad0e25b7fe710e85f481b7516936ec"} Mar 13 14:04:01 crc kubenswrapper[4898]: I0313 14:04:01.903978 4898 generic.go:334] "Generic (PLEG): container finished" podID="0182307e-bc7f-415e-a0f9-0eff9902384c" containerID="69485b46bd03239510d094d6d7c5c20008e9439faab5356d9bceb41ec96e8a78" exitCode=0 Mar 13 14:04:01 crc kubenswrapper[4898]: I0313 14:04:01.904041 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zs42q" event={"ID":"0182307e-bc7f-415e-a0f9-0eff9902384c","Type":"ContainerDied","Data":"69485b46bd03239510d094d6d7c5c20008e9439faab5356d9bceb41ec96e8a78"} Mar 13 14:04:01 crc kubenswrapper[4898]: I0313 14:04:01.904108 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zs42q" event={"ID":"0182307e-bc7f-415e-a0f9-0eff9902384c","Type":"ContainerStarted","Data":"129becc9925ab90f29d2c0f42b27f347625f74a5adb7cf39ed0042115e4b864d"} Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.518581 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zgjzn"] Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.519763 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zgjzn" Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.524724 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.535690 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zgjzn"] Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.603281 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbb51f06-0778-4b18-82b5-c5ce91e0a613-utilities\") pod \"redhat-operators-zgjzn\" (UID: \"cbb51f06-0778-4b18-82b5-c5ce91e0a613\") " pod="openshift-marketplace/redhat-operators-zgjzn" Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.603334 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbb51f06-0778-4b18-82b5-c5ce91e0a613-catalog-content\") pod \"redhat-operators-zgjzn\" (UID: \"cbb51f06-0778-4b18-82b5-c5ce91e0a613\") " pod="openshift-marketplace/redhat-operators-zgjzn" Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.603392 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdvtn\" (UniqueName: \"kubernetes.io/projected/cbb51f06-0778-4b18-82b5-c5ce91e0a613-kube-api-access-gdvtn\") pod \"redhat-operators-zgjzn\" (UID: \"cbb51f06-0778-4b18-82b5-c5ce91e0a613\") " pod="openshift-marketplace/redhat-operators-zgjzn" Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.704129 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbb51f06-0778-4b18-82b5-c5ce91e0a613-utilities\") pod \"redhat-operators-zgjzn\" (UID: \"cbb51f06-0778-4b18-82b5-c5ce91e0a613\") " 
pod="openshift-marketplace/redhat-operators-zgjzn" Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.704191 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbb51f06-0778-4b18-82b5-c5ce91e0a613-catalog-content\") pod \"redhat-operators-zgjzn\" (UID: \"cbb51f06-0778-4b18-82b5-c5ce91e0a613\") " pod="openshift-marketplace/redhat-operators-zgjzn" Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.704245 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdvtn\" (UniqueName: \"kubernetes.io/projected/cbb51f06-0778-4b18-82b5-c5ce91e0a613-kube-api-access-gdvtn\") pod \"redhat-operators-zgjzn\" (UID: \"cbb51f06-0778-4b18-82b5-c5ce91e0a613\") " pod="openshift-marketplace/redhat-operators-zgjzn" Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.706652 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbb51f06-0778-4b18-82b5-c5ce91e0a613-utilities\") pod \"redhat-operators-zgjzn\" (UID: \"cbb51f06-0778-4b18-82b5-c5ce91e0a613\") " pod="openshift-marketplace/redhat-operators-zgjzn" Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.706594 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbb51f06-0778-4b18-82b5-c5ce91e0a613-catalog-content\") pod \"redhat-operators-zgjzn\" (UID: \"cbb51f06-0778-4b18-82b5-c5ce91e0a613\") " pod="openshift-marketplace/redhat-operators-zgjzn" Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.728498 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nf9mj"] Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.730364 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nf9mj" Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.733721 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.737108 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdvtn\" (UniqueName: \"kubernetes.io/projected/cbb51f06-0778-4b18-82b5-c5ce91e0a613-kube-api-access-gdvtn\") pod \"redhat-operators-zgjzn\" (UID: \"cbb51f06-0778-4b18-82b5-c5ce91e0a613\") " pod="openshift-marketplace/redhat-operators-zgjzn" Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.743364 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nf9mj"] Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.806225 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/112ac477-caf1-4778-9161-737e393633b6-catalog-content\") pod \"community-operators-nf9mj\" (UID: \"112ac477-caf1-4778-9161-737e393633b6\") " pod="openshift-marketplace/community-operators-nf9mj" Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.806295 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z75w9\" (UniqueName: \"kubernetes.io/projected/112ac477-caf1-4778-9161-737e393633b6-kube-api-access-z75w9\") pod \"community-operators-nf9mj\" (UID: \"112ac477-caf1-4778-9161-737e393633b6\") " pod="openshift-marketplace/community-operators-nf9mj" Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.806377 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/112ac477-caf1-4778-9161-737e393633b6-utilities\") pod \"community-operators-nf9mj\" (UID: 
\"112ac477-caf1-4778-9161-737e393633b6\") " pod="openshift-marketplace/community-operators-nf9mj" Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.838035 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zgjzn" Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.907402 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/112ac477-caf1-4778-9161-737e393633b6-catalog-content\") pod \"community-operators-nf9mj\" (UID: \"112ac477-caf1-4778-9161-737e393633b6\") " pod="openshift-marketplace/community-operators-nf9mj" Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.907449 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z75w9\" (UniqueName: \"kubernetes.io/projected/112ac477-caf1-4778-9161-737e393633b6-kube-api-access-z75w9\") pod \"community-operators-nf9mj\" (UID: \"112ac477-caf1-4778-9161-737e393633b6\") " pod="openshift-marketplace/community-operators-nf9mj" Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.907483 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/112ac477-caf1-4778-9161-737e393633b6-utilities\") pod \"community-operators-nf9mj\" (UID: \"112ac477-caf1-4778-9161-737e393633b6\") " pod="openshift-marketplace/community-operators-nf9mj" Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.912174 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/112ac477-caf1-4778-9161-737e393633b6-catalog-content\") pod \"community-operators-nf9mj\" (UID: \"112ac477-caf1-4778-9161-737e393633b6\") " pod="openshift-marketplace/community-operators-nf9mj" Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.912196 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/112ac477-caf1-4778-9161-737e393633b6-utilities\") pod \"community-operators-nf9mj\" (UID: \"112ac477-caf1-4778-9161-737e393633b6\") " pod="openshift-marketplace/community-operators-nf9mj" Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.932816 4898 generic.go:334] "Generic (PLEG): container finished" podID="cd30282f-65c8-45d8-89f3-c6e2f16662d4" containerID="59d89d033eeab55992b3ec00208c3b5cec577e8de3d6a36471e4a08df49334b0" exitCode=0 Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.932884 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556844-q7h28" event={"ID":"cd30282f-65c8-45d8-89f3-c6e2f16662d4","Type":"ContainerDied","Data":"59d89d033eeab55992b3ec00208c3b5cec577e8de3d6a36471e4a08df49334b0"} Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.943845 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hkbng" event={"ID":"89abe4ad-dd62-4a70-a1d1-fdf97448ada5","Type":"ContainerStarted","Data":"e7e84b8c62fe1cfd24a053035a9551f9c1c3bd118f328f277a4430d9e847e65e"} Mar 13 14:04:02 crc kubenswrapper[4898]: I0313 14:04:02.950396 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z75w9\" (UniqueName: \"kubernetes.io/projected/112ac477-caf1-4778-9161-737e393633b6-kube-api-access-z75w9\") pod \"community-operators-nf9mj\" (UID: \"112ac477-caf1-4778-9161-737e393633b6\") " pod="openshift-marketplace/community-operators-nf9mj" Mar 13 14:04:03 crc kubenswrapper[4898]: I0313 14:04:03.073830 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nf9mj" Mar 13 14:04:03 crc kubenswrapper[4898]: I0313 14:04:03.299129 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zgjzn"] Mar 13 14:04:03 crc kubenswrapper[4898]: W0313 14:04:03.308948 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbb51f06_0778_4b18_82b5_c5ce91e0a613.slice/crio-baffb80482a04d378c0c02b555c50178f36cf386c59b550dce65769fa88f6740 WatchSource:0}: Error finding container baffb80482a04d378c0c02b555c50178f36cf386c59b550dce65769fa88f6740: Status 404 returned error can't find the container with id baffb80482a04d378c0c02b555c50178f36cf386c59b550dce65769fa88f6740 Mar 13 14:04:03 crc kubenswrapper[4898]: I0313 14:04:03.480640 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nf9mj"] Mar 13 14:04:03 crc kubenswrapper[4898]: W0313 14:04:03.525770 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod112ac477_caf1_4778_9161_737e393633b6.slice/crio-cd6fb5a64e7e71f880fc652eb99feafe0af0462b20f85dff2e92b55a99558f6f WatchSource:0}: Error finding container cd6fb5a64e7e71f880fc652eb99feafe0af0462b20f85dff2e92b55a99558f6f: Status 404 returned error can't find the container with id cd6fb5a64e7e71f880fc652eb99feafe0af0462b20f85dff2e92b55a99558f6f Mar 13 14:04:03 crc kubenswrapper[4898]: I0313 14:04:03.953303 4898 generic.go:334] "Generic (PLEG): container finished" podID="cbb51f06-0778-4b18-82b5-c5ce91e0a613" containerID="f784e80c2b12666e1f35ec79baad6f9df6ad51fb651197a806468000ac6a2641" exitCode=0 Mar 13 14:04:03 crc kubenswrapper[4898]: I0313 14:04:03.953415 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgjzn" 
event={"ID":"cbb51f06-0778-4b18-82b5-c5ce91e0a613","Type":"ContainerDied","Data":"f784e80c2b12666e1f35ec79baad6f9df6ad51fb651197a806468000ac6a2641"} Mar 13 14:04:03 crc kubenswrapper[4898]: I0313 14:04:03.953743 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgjzn" event={"ID":"cbb51f06-0778-4b18-82b5-c5ce91e0a613","Type":"ContainerStarted","Data":"baffb80482a04d378c0c02b555c50178f36cf386c59b550dce65769fa88f6740"} Mar 13 14:04:03 crc kubenswrapper[4898]: I0313 14:04:03.956575 4898 generic.go:334] "Generic (PLEG): container finished" podID="89abe4ad-dd62-4a70-a1d1-fdf97448ada5" containerID="e7e84b8c62fe1cfd24a053035a9551f9c1c3bd118f328f277a4430d9e847e65e" exitCode=0 Mar 13 14:04:03 crc kubenswrapper[4898]: I0313 14:04:03.957020 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hkbng" event={"ID":"89abe4ad-dd62-4a70-a1d1-fdf97448ada5","Type":"ContainerDied","Data":"e7e84b8c62fe1cfd24a053035a9551f9c1c3bd118f328f277a4430d9e847e65e"} Mar 13 14:04:03 crc kubenswrapper[4898]: I0313 14:04:03.966225 4898 generic.go:334] "Generic (PLEG): container finished" podID="0182307e-bc7f-415e-a0f9-0eff9902384c" containerID="3c62fb6c0c0a5d7610c81eb3bac9a0463b266c6fe606fcf0f6eaa3106384f862" exitCode=0 Mar 13 14:04:03 crc kubenswrapper[4898]: I0313 14:04:03.966346 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zs42q" event={"ID":"0182307e-bc7f-415e-a0f9-0eff9902384c","Type":"ContainerDied","Data":"3c62fb6c0c0a5d7610c81eb3bac9a0463b266c6fe606fcf0f6eaa3106384f862"} Mar 13 14:04:03 crc kubenswrapper[4898]: I0313 14:04:03.968844 4898 generic.go:334] "Generic (PLEG): container finished" podID="112ac477-caf1-4778-9161-737e393633b6" containerID="2316a3e4d2fc964fff3bdea961936abc15de57cb9446d0c3ca366fa8840b5460" exitCode=0 Mar 13 14:04:03 crc kubenswrapper[4898]: I0313 14:04:03.968890 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-nf9mj" event={"ID":"112ac477-caf1-4778-9161-737e393633b6","Type":"ContainerDied","Data":"2316a3e4d2fc964fff3bdea961936abc15de57cb9446d0c3ca366fa8840b5460"} Mar 13 14:04:03 crc kubenswrapper[4898]: I0313 14:04:03.968979 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nf9mj" event={"ID":"112ac477-caf1-4778-9161-737e393633b6","Type":"ContainerStarted","Data":"cd6fb5a64e7e71f880fc652eb99feafe0af0462b20f85dff2e92b55a99558f6f"} Mar 13 14:04:04 crc kubenswrapper[4898]: I0313 14:04:04.339576 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556844-q7h28" Mar 13 14:04:04 crc kubenswrapper[4898]: I0313 14:04:04.535006 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dccvw\" (UniqueName: \"kubernetes.io/projected/cd30282f-65c8-45d8-89f3-c6e2f16662d4-kube-api-access-dccvw\") pod \"cd30282f-65c8-45d8-89f3-c6e2f16662d4\" (UID: \"cd30282f-65c8-45d8-89f3-c6e2f16662d4\") " Mar 13 14:04:04 crc kubenswrapper[4898]: I0313 14:04:04.540517 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd30282f-65c8-45d8-89f3-c6e2f16662d4-kube-api-access-dccvw" (OuterVolumeSpecName: "kube-api-access-dccvw") pod "cd30282f-65c8-45d8-89f3-c6e2f16662d4" (UID: "cd30282f-65c8-45d8-89f3-c6e2f16662d4"). InnerVolumeSpecName "kube-api-access-dccvw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:04:04 crc kubenswrapper[4898]: I0313 14:04:04.637435 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dccvw\" (UniqueName: \"kubernetes.io/projected/cd30282f-65c8-45d8-89f3-c6e2f16662d4-kube-api-access-dccvw\") on node \"crc\" DevicePath \"\"" Mar 13 14:04:04 crc kubenswrapper[4898]: I0313 14:04:04.980760 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zs42q" event={"ID":"0182307e-bc7f-415e-a0f9-0eff9902384c","Type":"ContainerStarted","Data":"77d9766c8fecf5c86dafd5750df2ef49509322338b053c35ad79ac796cac5820"} Mar 13 14:04:04 crc kubenswrapper[4898]: I0313 14:04:04.982934 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556844-q7h28" event={"ID":"cd30282f-65c8-45d8-89f3-c6e2f16662d4","Type":"ContainerDied","Data":"752319ba8672dad76676eb802295413482abd90351347d3efb3c54acb0646542"} Mar 13 14:04:04 crc kubenswrapper[4898]: I0313 14:04:04.982984 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="752319ba8672dad76676eb802295413482abd90351347d3efb3c54acb0646542" Mar 13 14:04:04 crc kubenswrapper[4898]: I0313 14:04:04.983065 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556844-q7h28" Mar 13 14:04:04 crc kubenswrapper[4898]: I0313 14:04:04.987410 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nf9mj" event={"ID":"112ac477-caf1-4778-9161-737e393633b6","Type":"ContainerStarted","Data":"2e43cc9675fd8cbd58a911f4205da76807b5b902a7fc3bd0c4cb735298e250c5"} Mar 13 14:04:04 crc kubenswrapper[4898]: I0313 14:04:04.990408 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgjzn" event={"ID":"cbb51f06-0778-4b18-82b5-c5ce91e0a613","Type":"ContainerStarted","Data":"779306da39107e22f5c6e9064f03a6b5cd0fac4a5dac5550d00ae8a28dc13c8f"} Mar 13 14:04:04 crc kubenswrapper[4898]: I0313 14:04:04.992529 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hkbng" event={"ID":"89abe4ad-dd62-4a70-a1d1-fdf97448ada5","Type":"ContainerStarted","Data":"7d332847a2571b0892a4d2d051dd5994998dc75188735a88d9bfbfb47991cdd2"} Mar 13 14:04:05 crc kubenswrapper[4898]: I0313 14:04:05.001535 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zs42q" podStartSLOduration=2.375612707 podStartE2EDuration="5.001514509s" podCreationTimestamp="2026-03-13 14:04:00 +0000 UTC" firstStartedPulling="2026-03-13 14:04:01.906016087 +0000 UTC m=+476.907604336" lastFinishedPulling="2026-03-13 14:04:04.531917899 +0000 UTC m=+479.533506138" observedRunningTime="2026-03-13 14:04:04.9982953 +0000 UTC m=+479.999883579" watchObservedRunningTime="2026-03-13 14:04:05.001514509 +0000 UTC m=+480.003102748" Mar 13 14:04:05 crc kubenswrapper[4898]: I0313 14:04:05.022021 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hkbng" podStartSLOduration=2.347983617 podStartE2EDuration="5.021995613s" podCreationTimestamp="2026-03-13 14:04:00 +0000 UTC" 
firstStartedPulling="2026-03-13 14:04:01.900522246 +0000 UTC m=+476.902110485" lastFinishedPulling="2026-03-13 14:04:04.574534242 +0000 UTC m=+479.576122481" observedRunningTime="2026-03-13 14:04:05.017041626 +0000 UTC m=+480.018629875" watchObservedRunningTime="2026-03-13 14:04:05.021995613 +0000 UTC m=+480.023583852" Mar 13 14:04:05 crc kubenswrapper[4898]: I0313 14:04:05.407248 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556838-h7pkr"] Mar 13 14:04:05 crc kubenswrapper[4898]: I0313 14:04:05.410527 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556838-h7pkr"] Mar 13 14:04:05 crc kubenswrapper[4898]: I0313 14:04:05.579340 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-6n228" podUID="b08c305d-b9fc-4c5c-85c1-8281b9608bcf" containerName="registry" containerID="cri-o://f91957737e0395e724d38a589c69e42d1186dc2a931994cc08b0d0b2fc46b2b2" gracePeriod=30 Mar 13 14:04:05 crc kubenswrapper[4898]: I0313 14:04:05.745909 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa1ed4c8-e4bd-4352-bee3-404f16244ea3" path="/var/lib/kubelet/pods/aa1ed4c8-e4bd-4352-bee3-404f16244ea3/volumes" Mar 13 14:04:05 crc kubenswrapper[4898]: I0313 14:04:05.989150 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.000337 4898 generic.go:334] "Generic (PLEG): container finished" podID="112ac477-caf1-4778-9161-737e393633b6" containerID="2e43cc9675fd8cbd58a911f4205da76807b5b902a7fc3bd0c4cb735298e250c5" exitCode=0 Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.000410 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nf9mj" event={"ID":"112ac477-caf1-4778-9161-737e393633b6","Type":"ContainerDied","Data":"2e43cc9675fd8cbd58a911f4205da76807b5b902a7fc3bd0c4cb735298e250c5"} Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.000461 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nf9mj" event={"ID":"112ac477-caf1-4778-9161-737e393633b6","Type":"ContainerStarted","Data":"fc60b3fabeb85294acc6203495a9c722283813bdd45f914886621b87378fb993"} Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.002018 4898 generic.go:334] "Generic (PLEG): container finished" podID="cbb51f06-0778-4b18-82b5-c5ce91e0a613" containerID="779306da39107e22f5c6e9064f03a6b5cd0fac4a5dac5550d00ae8a28dc13c8f" exitCode=0 Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.002075 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgjzn" event={"ID":"cbb51f06-0778-4b18-82b5-c5ce91e0a613","Type":"ContainerDied","Data":"779306da39107e22f5c6e9064f03a6b5cd0fac4a5dac5550d00ae8a28dc13c8f"} Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.003188 4898 generic.go:334] "Generic (PLEG): container finished" podID="b08c305d-b9fc-4c5c-85c1-8281b9608bcf" containerID="f91957737e0395e724d38a589c69e42d1186dc2a931994cc08b0d0b2fc46b2b2" exitCode=0 Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.003224 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6n228" Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.003244 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6n228" event={"ID":"b08c305d-b9fc-4c5c-85c1-8281b9608bcf","Type":"ContainerDied","Data":"f91957737e0395e724d38a589c69e42d1186dc2a931994cc08b0d0b2fc46b2b2"} Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.003263 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6n228" event={"ID":"b08c305d-b9fc-4c5c-85c1-8281b9608bcf","Type":"ContainerDied","Data":"cb30f09f65c6668eae49d8e2a5f1518ff1c19e2eb8fcc21bf1f743165319e716"} Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.003280 4898 scope.go:117] "RemoveContainer" containerID="f91957737e0395e724d38a589c69e42d1186dc2a931994cc08b0d0b2fc46b2b2" Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.016930 4898 scope.go:117] "RemoveContainer" containerID="f91957737e0395e724d38a589c69e42d1186dc2a931994cc08b0d0b2fc46b2b2" Mar 13 14:04:06 crc kubenswrapper[4898]: E0313 14:04:06.017450 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f91957737e0395e724d38a589c69e42d1186dc2a931994cc08b0d0b2fc46b2b2\": container with ID starting with f91957737e0395e724d38a589c69e42d1186dc2a931994cc08b0d0b2fc46b2b2 not found: ID does not exist" containerID="f91957737e0395e724d38a589c69e42d1186dc2a931994cc08b0d0b2fc46b2b2" Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.017485 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f91957737e0395e724d38a589c69e42d1186dc2a931994cc08b0d0b2fc46b2b2"} err="failed to get container status \"f91957737e0395e724d38a589c69e42d1186dc2a931994cc08b0d0b2fc46b2b2\": rpc error: code = NotFound desc = could not find container 
\"f91957737e0395e724d38a589c69e42d1186dc2a931994cc08b0d0b2fc46b2b2\": container with ID starting with f91957737e0395e724d38a589c69e42d1186dc2a931994cc08b0d0b2fc46b2b2 not found: ID does not exist" Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.034822 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nf9mj" podStartSLOduration=2.560395544 podStartE2EDuration="4.034806359s" podCreationTimestamp="2026-03-13 14:04:02 +0000 UTC" firstStartedPulling="2026-03-13 14:04:03.970938483 +0000 UTC m=+478.972526722" lastFinishedPulling="2026-03-13 14:04:05.445349298 +0000 UTC m=+480.446937537" observedRunningTime="2026-03-13 14:04:06.034230463 +0000 UTC m=+481.035818722" watchObservedRunningTime="2026-03-13 14:04:06.034806359 +0000 UTC m=+481.036394598" Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.154360 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-bound-sa-token\") pod \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.154408 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-ca-trust-extracted\") pod \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.154427 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-registry-tls\") pod \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.154462 4898 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-installation-pull-secrets\") pod \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.154491 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-trusted-ca\") pod \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.154515 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt5s8\" (UniqueName: \"kubernetes.io/projected/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-kube-api-access-xt5s8\") pod \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.154564 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-registry-certificates\") pod \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.155084 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "b08c305d-b9fc-4c5c-85c1-8281b9608bcf" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.155220 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "b08c305d-b9fc-4c5c-85c1-8281b9608bcf" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.155400 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\" (UID: \"b08c305d-b9fc-4c5c-85c1-8281b9608bcf\") " Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.156159 4898 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.156727 4898 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.166763 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "b08c305d-b9fc-4c5c-85c1-8281b9608bcf" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.171097 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "b08c305d-b9fc-4c5c-85c1-8281b9608bcf" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.171730 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-kube-api-access-xt5s8" (OuterVolumeSpecName: "kube-api-access-xt5s8") pod "b08c305d-b9fc-4c5c-85c1-8281b9608bcf" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf"). InnerVolumeSpecName "kube-api-access-xt5s8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.172362 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "b08c305d-b9fc-4c5c-85c1-8281b9608bcf" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.178732 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "b08c305d-b9fc-4c5c-85c1-8281b9608bcf" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.179966 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "b08c305d-b9fc-4c5c-85c1-8281b9608bcf" (UID: "b08c305d-b9fc-4c5c-85c1-8281b9608bcf"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.257703 4898 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.257744 4898 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.257758 4898 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.257770 4898 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.257783 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xt5s8\" (UniqueName: \"kubernetes.io/projected/b08c305d-b9fc-4c5c-85c1-8281b9608bcf-kube-api-access-xt5s8\") on node \"crc\" DevicePath \"\"" Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.352446 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-image-registry/image-registry-697d97f7c8-6n228"] Mar 13 14:04:06 crc kubenswrapper[4898]: I0313 14:04:06.356757 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6n228"] Mar 13 14:04:07 crc kubenswrapper[4898]: I0313 14:04:07.012795 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zgjzn" event={"ID":"cbb51f06-0778-4b18-82b5-c5ce91e0a613","Type":"ContainerStarted","Data":"0bfe2329bd880746bbd5197bafddd64a03425f52194e76f04bc937b6e425402a"} Mar 13 14:04:07 crc kubenswrapper[4898]: I0313 14:04:07.036490 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zgjzn" podStartSLOduration=2.566317916 podStartE2EDuration="5.036471029s" podCreationTimestamp="2026-03-13 14:04:02 +0000 UTC" firstStartedPulling="2026-03-13 14:04:03.955626621 +0000 UTC m=+478.957214880" lastFinishedPulling="2026-03-13 14:04:06.425779754 +0000 UTC m=+481.427367993" observedRunningTime="2026-03-13 14:04:07.035362788 +0000 UTC m=+482.036951047" watchObservedRunningTime="2026-03-13 14:04:07.036471029 +0000 UTC m=+482.038059268" Mar 13 14:04:07 crc kubenswrapper[4898]: I0313 14:04:07.751224 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b08c305d-b9fc-4c5c-85c1-8281b9608bcf" path="/var/lib/kubelet/pods/b08c305d-b9fc-4c5c-85c1-8281b9608bcf/volumes" Mar 13 14:04:10 crc kubenswrapper[4898]: I0313 14:04:10.469303 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hkbng" Mar 13 14:04:10 crc kubenswrapper[4898]: I0313 14:04:10.469760 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hkbng" Mar 13 14:04:10 crc kubenswrapper[4898]: I0313 14:04:10.521310 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-hkbng" Mar 13 14:04:10 crc kubenswrapper[4898]: I0313 14:04:10.649071 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zs42q" Mar 13 14:04:10 crc kubenswrapper[4898]: I0313 14:04:10.649143 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zs42q" Mar 13 14:04:10 crc kubenswrapper[4898]: I0313 14:04:10.685608 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zs42q" Mar 13 14:04:11 crc kubenswrapper[4898]: I0313 14:04:11.072689 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hkbng" Mar 13 14:04:11 crc kubenswrapper[4898]: I0313 14:04:11.082289 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zs42q" Mar 13 14:04:12 crc kubenswrapper[4898]: I0313 14:04:12.839341 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zgjzn" Mar 13 14:04:12 crc kubenswrapper[4898]: I0313 14:04:12.839425 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zgjzn" Mar 13 14:04:13 crc kubenswrapper[4898]: I0313 14:04:13.074174 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nf9mj" Mar 13 14:04:13 crc kubenswrapper[4898]: I0313 14:04:13.074257 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nf9mj" Mar 13 14:04:13 crc kubenswrapper[4898]: I0313 14:04:13.112384 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nf9mj" Mar 13 14:04:13 crc kubenswrapper[4898]: I0313 
14:04:13.878602 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zgjzn" podUID="cbb51f06-0778-4b18-82b5-c5ce91e0a613" containerName="registry-server" probeResult="failure" output=< Mar 13 14:04:13 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 14:04:13 crc kubenswrapper[4898]: > Mar 13 14:04:14 crc kubenswrapper[4898]: I0313 14:04:14.108830 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nf9mj" Mar 13 14:04:19 crc kubenswrapper[4898]: I0313 14:04:19.134412 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:04:19 crc kubenswrapper[4898]: I0313 14:04:19.134733 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:04:19 crc kubenswrapper[4898]: I0313 14:04:19.134780 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 14:04:19 crc kubenswrapper[4898]: I0313 14:04:19.135364 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ef8034867c7dd4fe3e16f610be3edcf45ba0ba5b7440cc5634ef7ce86e520b52"} pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 14:04:19 crc kubenswrapper[4898]: I0313 
14:04:19.135424 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" containerID="cri-o://ef8034867c7dd4fe3e16f610be3edcf45ba0ba5b7440cc5634ef7ce86e520b52" gracePeriod=600 Mar 13 14:04:20 crc kubenswrapper[4898]: I0313 14:04:20.084578 4898 generic.go:334] "Generic (PLEG): container finished" podID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerID="ef8034867c7dd4fe3e16f610be3edcf45ba0ba5b7440cc5634ef7ce86e520b52" exitCode=0 Mar 13 14:04:20 crc kubenswrapper[4898]: I0313 14:04:20.084624 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerDied","Data":"ef8034867c7dd4fe3e16f610be3edcf45ba0ba5b7440cc5634ef7ce86e520b52"} Mar 13 14:04:20 crc kubenswrapper[4898]: I0313 14:04:20.084863 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"87afe240e3b86dba51997a01c599db519fabde9560e41dee3b537bab350f3092"} Mar 13 14:04:20 crc kubenswrapper[4898]: I0313 14:04:20.084890 4898 scope.go:117] "RemoveContainer" containerID="8568b0d4122e606a851ae23a97395b29784dd29138349c86f59191196e7b0f56" Mar 13 14:04:22 crc kubenswrapper[4898]: I0313 14:04:22.907945 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zgjzn" Mar 13 14:04:22 crc kubenswrapper[4898]: I0313 14:04:22.969123 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zgjzn" Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.023388 4898 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-zxddg"] Mar 13 14:04:52 crc kubenswrapper[4898]: E0313 14:04:52.024257 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd30282f-65c8-45d8-89f3-c6e2f16662d4" containerName="oc" Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.024278 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd30282f-65c8-45d8-89f3-c6e2f16662d4" containerName="oc" Mar 13 14:04:52 crc kubenswrapper[4898]: E0313 14:04:52.024293 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b08c305d-b9fc-4c5c-85c1-8281b9608bcf" containerName="registry" Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.024304 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b08c305d-b9fc-4c5c-85c1-8281b9608bcf" containerName="registry" Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.024480 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b08c305d-b9fc-4c5c-85c1-8281b9608bcf" containerName="registry" Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.024502 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd30282f-65c8-45d8-89f3-c6e2f16662d4" containerName="oc" Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.025090 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-zxddg" Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.027992 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.028945 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.030409 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-dockercfg-wwt9l" Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.030798 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.038939 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.052690 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-zxddg"] Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.101958 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2220cab0-84f3-4922-b02c-5d8f12977964-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-zxddg\" (UID: \"2220cab0-84f3-4922-b02c-5d8f12977964\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-zxddg" Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.102009 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/2220cab0-84f3-4922-b02c-5d8f12977964-telemetry-config\") pod 
\"cluster-monitoring-operator-6d5b84845-zxddg\" (UID: \"2220cab0-84f3-4922-b02c-5d8f12977964\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-zxddg" Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.102120 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpgbv\" (UniqueName: \"kubernetes.io/projected/2220cab0-84f3-4922-b02c-5d8f12977964-kube-api-access-mpgbv\") pod \"cluster-monitoring-operator-6d5b84845-zxddg\" (UID: \"2220cab0-84f3-4922-b02c-5d8f12977964\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-zxddg" Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.203092 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpgbv\" (UniqueName: \"kubernetes.io/projected/2220cab0-84f3-4922-b02c-5d8f12977964-kube-api-access-mpgbv\") pod \"cluster-monitoring-operator-6d5b84845-zxddg\" (UID: \"2220cab0-84f3-4922-b02c-5d8f12977964\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-zxddg" Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.203166 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2220cab0-84f3-4922-b02c-5d8f12977964-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-zxddg\" (UID: \"2220cab0-84f3-4922-b02c-5d8f12977964\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-zxddg" Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.203201 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/2220cab0-84f3-4922-b02c-5d8f12977964-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-zxddg\" (UID: \"2220cab0-84f3-4922-b02c-5d8f12977964\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-zxddg" Mar 13 14:04:52 crc 
kubenswrapper[4898]: I0313 14:04:52.205323 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/2220cab0-84f3-4922-b02c-5d8f12977964-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-zxddg\" (UID: \"2220cab0-84f3-4922-b02c-5d8f12977964\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-zxddg" Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.212122 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/2220cab0-84f3-4922-b02c-5d8f12977964-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-zxddg\" (UID: \"2220cab0-84f3-4922-b02c-5d8f12977964\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-zxddg" Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.227044 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpgbv\" (UniqueName: \"kubernetes.io/projected/2220cab0-84f3-4922-b02c-5d8f12977964-kube-api-access-mpgbv\") pod \"cluster-monitoring-operator-6d5b84845-zxddg\" (UID: \"2220cab0-84f3-4922-b02c-5d8f12977964\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-zxddg" Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.364035 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-zxddg" Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.710814 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-zxddg"] Mar 13 14:04:52 crc kubenswrapper[4898]: I0313 14:04:52.721966 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 14:04:53 crc kubenswrapper[4898]: I0313 14:04:53.269215 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-zxddg" event={"ID":"2220cab0-84f3-4922-b02c-5d8f12977964","Type":"ContainerStarted","Data":"b7f230cc972a4c7d51b20e2fd5028954f8c1a42b11ede3bd192ab358d61ebcd3"} Mar 13 14:04:55 crc kubenswrapper[4898]: I0313 14:04:55.002215 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4p7wt"] Mar 13 14:04:55 crc kubenswrapper[4898]: I0313 14:04:55.003086 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4p7wt" Mar 13 14:04:55 crc kubenswrapper[4898]: I0313 14:04:55.005437 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Mar 13 14:04:55 crc kubenswrapper[4898]: I0313 14:04:55.006005 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-dockercfg-r4ldw" Mar 13 14:04:55 crc kubenswrapper[4898]: I0313 14:04:55.016749 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4p7wt"] Mar 13 14:04:55 crc kubenswrapper[4898]: I0313 14:04:55.137986 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/802396a8-633d-4f86-b77b-c25e9c76cc7a-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-4p7wt\" (UID: \"802396a8-633d-4f86-b77b-c25e9c76cc7a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4p7wt" Mar 13 14:04:55 crc kubenswrapper[4898]: I0313 14:04:55.239365 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/802396a8-633d-4f86-b77b-c25e9c76cc7a-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-4p7wt\" (UID: \"802396a8-633d-4f86-b77b-c25e9c76cc7a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4p7wt" Mar 13 14:04:55 crc kubenswrapper[4898]: I0313 14:04:55.244168 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/802396a8-633d-4f86-b77b-c25e9c76cc7a-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-4p7wt\" (UID: \"802396a8-633d-4f86-b77b-c25e9c76cc7a\") " 
pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4p7wt" Mar 13 14:04:55 crc kubenswrapper[4898]: I0313 14:04:55.280150 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-zxddg" event={"ID":"2220cab0-84f3-4922-b02c-5d8f12977964","Type":"ContainerStarted","Data":"b188a8d6bb2352db628138d15008095ba61d97259fa5815b5f6bd57c86bea0b0"} Mar 13 14:04:55 crc kubenswrapper[4898]: I0313 14:04:55.296265 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-zxddg" podStartSLOduration=2.609727623 podStartE2EDuration="4.296251043s" podCreationTimestamp="2026-03-13 14:04:51 +0000 UTC" firstStartedPulling="2026-03-13 14:04:52.721671384 +0000 UTC m=+527.723259643" lastFinishedPulling="2026-03-13 14:04:54.408194814 +0000 UTC m=+529.409783063" observedRunningTime="2026-03-13 14:04:55.294368304 +0000 UTC m=+530.295956553" watchObservedRunningTime="2026-03-13 14:04:55.296251043 +0000 UTC m=+530.297839282" Mar 13 14:04:55 crc kubenswrapper[4898]: I0313 14:04:55.316136 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4p7wt" Mar 13 14:04:55 crc kubenswrapper[4898]: I0313 14:04:55.521683 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4p7wt"] Mar 13 14:04:56 crc kubenswrapper[4898]: I0313 14:04:56.294838 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4p7wt" event={"ID":"802396a8-633d-4f86-b77b-c25e9c76cc7a","Type":"ContainerStarted","Data":"9c6bbceaced98e6e3b8200c68801f3fceccd91684cb9fbe4d7870b1d45ee089b"} Mar 13 14:04:57 crc kubenswrapper[4898]: I0313 14:04:57.303180 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4p7wt" event={"ID":"802396a8-633d-4f86-b77b-c25e9c76cc7a","Type":"ContainerStarted","Data":"67fef4a54132e1a45429506eabb64bc2f0135c568ba56a79d6372049a23edbc8"} Mar 13 14:04:57 crc kubenswrapper[4898]: I0313 14:04:57.303645 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4p7wt" Mar 13 14:04:57 crc kubenswrapper[4898]: I0313 14:04:57.314710 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4p7wt" Mar 13 14:04:57 crc kubenswrapper[4898]: I0313 14:04:57.324425 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4p7wt" podStartSLOduration=2.013533207 podStartE2EDuration="3.324401316s" podCreationTimestamp="2026-03-13 14:04:54 +0000 UTC" firstStartedPulling="2026-03-13 14:04:55.531158458 +0000 UTC m=+530.532746697" lastFinishedPulling="2026-03-13 14:04:56.842026567 +0000 UTC m=+531.843614806" observedRunningTime="2026-03-13 14:04:57.323957954 +0000 UTC 
m=+532.325546223" watchObservedRunningTime="2026-03-13 14:04:57.324401316 +0000 UTC m=+532.325989595" Mar 13 14:04:58 crc kubenswrapper[4898]: I0313 14:04:58.083779 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-tqhcl"] Mar 13 14:04:58 crc kubenswrapper[4898]: I0313 14:04:58.085854 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-tqhcl" Mar 13 14:04:58 crc kubenswrapper[4898]: I0313 14:04:58.090355 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-q4z7r" Mar 13 14:04:58 crc kubenswrapper[4898]: I0313 14:04:58.090655 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Mar 13 14:04:58 crc kubenswrapper[4898]: I0313 14:04:58.090808 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Mar 13 14:04:58 crc kubenswrapper[4898]: I0313 14:04:58.091549 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Mar 13 14:04:58 crc kubenswrapper[4898]: I0313 14:04:58.092666 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-tqhcl"] Mar 13 14:04:58 crc kubenswrapper[4898]: I0313 14:04:58.183630 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7b49\" (UniqueName: \"kubernetes.io/projected/64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b-kube-api-access-h7b49\") pod \"prometheus-operator-db54df47d-tqhcl\" (UID: \"64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-tqhcl" Mar 13 14:04:58 crc kubenswrapper[4898]: I0313 14:04:58.183725 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-tqhcl\" (UID: \"64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-tqhcl" Mar 13 14:04:58 crc kubenswrapper[4898]: I0313 14:04:58.183754 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-tqhcl\" (UID: \"64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-tqhcl" Mar 13 14:04:58 crc kubenswrapper[4898]: I0313 14:04:58.183787 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b-metrics-client-ca\") pod \"prometheus-operator-db54df47d-tqhcl\" (UID: \"64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-tqhcl" Mar 13 14:04:58 crc kubenswrapper[4898]: I0313 14:04:58.285072 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7b49\" (UniqueName: \"kubernetes.io/projected/64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b-kube-api-access-h7b49\") pod \"prometheus-operator-db54df47d-tqhcl\" (UID: \"64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-tqhcl" Mar 13 14:04:58 crc kubenswrapper[4898]: I0313 14:04:58.285547 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-tqhcl\" (UID: 
\"64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-tqhcl" Mar 13 14:04:58 crc kubenswrapper[4898]: I0313 14:04:58.285798 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-tqhcl\" (UID: \"64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-tqhcl" Mar 13 14:04:58 crc kubenswrapper[4898]: I0313 14:04:58.286065 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b-metrics-client-ca\") pod \"prometheus-operator-db54df47d-tqhcl\" (UID: \"64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-tqhcl" Mar 13 14:04:58 crc kubenswrapper[4898]: I0313 14:04:58.286955 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b-metrics-client-ca\") pod \"prometheus-operator-db54df47d-tqhcl\" (UID: \"64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-tqhcl" Mar 13 14:04:58 crc kubenswrapper[4898]: I0313 14:04:58.292238 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-tqhcl\" (UID: \"64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-tqhcl" Mar 13 14:04:58 crc kubenswrapper[4898]: I0313 14:04:58.296653 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-tqhcl\" (UID: \"64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-tqhcl" Mar 13 14:04:58 crc kubenswrapper[4898]: I0313 14:04:58.307655 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7b49\" (UniqueName: \"kubernetes.io/projected/64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b-kube-api-access-h7b49\") pod \"prometheus-operator-db54df47d-tqhcl\" (UID: \"64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b\") " pod="openshift-monitoring/prometheus-operator-db54df47d-tqhcl" Mar 13 14:04:58 crc kubenswrapper[4898]: I0313 14:04:58.402060 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-tqhcl" Mar 13 14:04:58 crc kubenswrapper[4898]: I0313 14:04:58.842074 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-tqhcl"] Mar 13 14:04:58 crc kubenswrapper[4898]: W0313 14:04:58.848415 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64542ec8_7d20_45ec_8e4f_8f5adcfb2c2b.slice/crio-f08a4b189e50f165c2ac13cee27546aac8fe8a1d5a6f40a3aec2567da98e4d3c WatchSource:0}: Error finding container f08a4b189e50f165c2ac13cee27546aac8fe8a1d5a6f40a3aec2567da98e4d3c: Status 404 returned error can't find the container with id f08a4b189e50f165c2ac13cee27546aac8fe8a1d5a6f40a3aec2567da98e4d3c Mar 13 14:04:59 crc kubenswrapper[4898]: I0313 14:04:59.318754 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-tqhcl" event={"ID":"64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b","Type":"ContainerStarted","Data":"f08a4b189e50f165c2ac13cee27546aac8fe8a1d5a6f40a3aec2567da98e4d3c"} Mar 13 14:05:01 crc 
kubenswrapper[4898]: I0313 14:05:01.335416 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-tqhcl" event={"ID":"64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b","Type":"ContainerStarted","Data":"db2a83a4c10efc2d1a302b2a613742214dea9a2f0f4a6b9987aca613fdbaad98"} Mar 13 14:05:01 crc kubenswrapper[4898]: I0313 14:05:01.338201 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-tqhcl" event={"ID":"64542ec8-7d20-45ec-8e4f-8f5adcfb2c2b","Type":"ContainerStarted","Data":"aeba8c5df189e23876e9ca8668ce79efbe2d30b2a56230f3ccaf934da7a40cba"} Mar 13 14:05:01 crc kubenswrapper[4898]: I0313 14:05:01.355953 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-db54df47d-tqhcl" podStartSLOduration=1.8778979 podStartE2EDuration="3.35592583s" podCreationTimestamp="2026-03-13 14:04:58 +0000 UTC" firstStartedPulling="2026-03-13 14:04:58.851113039 +0000 UTC m=+533.852701288" lastFinishedPulling="2026-03-13 14:05:00.329140969 +0000 UTC m=+535.330729218" observedRunningTime="2026-03-13 14:05:01.354832982 +0000 UTC m=+536.356421241" watchObservedRunningTime="2026-03-13 14:05:01.35592583 +0000 UTC m=+536.357514109" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.446510 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd"] Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.448106 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.450168 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.451083 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.451429 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-xnn9s" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.465960 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd"] Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.468982 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f"] Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.469920 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.472776 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.473497 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.473628 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-zk4th" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.474345 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.478181 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/063cd9dd-e128-4dd5-af7b-a3b79b93c61a-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-qnh2f\" (UID: \"063cd9dd-e128-4dd5-af7b-a3b79b93c61a\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.478227 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/063cd9dd-e128-4dd5-af7b-a3b79b93c61a-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-qnh2f\" (UID: \"063cd9dd-e128-4dd5-af7b-a3b79b93c61a\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.478264 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/05679ba1-ef84-46c5-803d-22379bb824dd-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-q6xsd\" (UID: \"05679ba1-ef84-46c5-803d-22379bb824dd\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.478289 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/063cd9dd-e128-4dd5-af7b-a3b79b93c61a-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-qnh2f\" (UID: \"063cd9dd-e128-4dd5-af7b-a3b79b93c61a\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.478312 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkj4b\" (UniqueName: \"kubernetes.io/projected/063cd9dd-e128-4dd5-af7b-a3b79b93c61a-kube-api-access-wkj4b\") pod \"kube-state-metrics-777cb5bd5d-qnh2f\" (UID: \"063cd9dd-e128-4dd5-af7b-a3b79b93c61a\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.478330 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/063cd9dd-e128-4dd5-af7b-a3b79b93c61a-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-qnh2f\" (UID: \"063cd9dd-e128-4dd5-af7b-a3b79b93c61a\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.478349 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/05679ba1-ef84-46c5-803d-22379bb824dd-metrics-client-ca\") pod 
\"openshift-state-metrics-566fddb674-q6xsd\" (UID: \"05679ba1-ef84-46c5-803d-22379bb824dd\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.478373 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/05679ba1-ef84-46c5-803d-22379bb824dd-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-q6xsd\" (UID: \"05679ba1-ef84-46c5-803d-22379bb824dd\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.478396 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/063cd9dd-e128-4dd5-af7b-a3b79b93c61a-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-qnh2f\" (UID: \"063cd9dd-e128-4dd5-af7b-a3b79b93c61a\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.478425 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcd22\" (UniqueName: \"kubernetes.io/projected/05679ba1-ef84-46c5-803d-22379bb824dd-kube-api-access-tcd22\") pod \"openshift-state-metrics-566fddb674-q6xsd\" (UID: \"05679ba1-ef84-46c5-803d-22379bb824dd\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.490891 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-h4spr"] Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.491823 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-h4spr" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.496212 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.496608 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.500077 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-xx4nn" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.508550 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f"] Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.579394 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/063cd9dd-e128-4dd5-af7b-a3b79b93c61a-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-qnh2f\" (UID: \"063cd9dd-e128-4dd5-af7b-a3b79b93c61a\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.579461 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/05679ba1-ef84-46c5-803d-22379bb824dd-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-q6xsd\" (UID: \"05679ba1-ef84-46c5-803d-22379bb824dd\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.579508 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/063cd9dd-e128-4dd5-af7b-a3b79b93c61a-kube-state-metrics-tls\") pod 
\"kube-state-metrics-777cb5bd5d-qnh2f\" (UID: \"063cd9dd-e128-4dd5-af7b-a3b79b93c61a\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.579547 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d3220aa6-97e3-4ea7-8959-fd0d11002f32-metrics-client-ca\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.579580 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkj4b\" (UniqueName: \"kubernetes.io/projected/063cd9dd-e128-4dd5-af7b-a3b79b93c61a-kube-api-access-wkj4b\") pod \"kube-state-metrics-777cb5bd5d-qnh2f\" (UID: \"063cd9dd-e128-4dd5-af7b-a3b79b93c61a\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.579605 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/063cd9dd-e128-4dd5-af7b-a3b79b93c61a-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-qnh2f\" (UID: \"063cd9dd-e128-4dd5-af7b-a3b79b93c61a\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.579626 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/05679ba1-ef84-46c5-803d-22379bb824dd-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-q6xsd\" (UID: \"05679ba1-ef84-46c5-803d-22379bb824dd\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.579652 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d3220aa6-97e3-4ea7-8959-fd0d11002f32-root\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.579686 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/05679ba1-ef84-46c5-803d-22379bb824dd-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-q6xsd\" (UID: \"05679ba1-ef84-46c5-803d-22379bb824dd\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.579714 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d3220aa6-97e3-4ea7-8959-fd0d11002f32-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.579735 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/063cd9dd-e128-4dd5-af7b-a3b79b93c61a-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-qnh2f\" (UID: \"063cd9dd-e128-4dd5-af7b-a3b79b93c61a\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.579757 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klp29\" (UniqueName: \"kubernetes.io/projected/d3220aa6-97e3-4ea7-8959-fd0d11002f32-kube-api-access-klp29\") pod \"node-exporter-h4spr\" (UID: 
\"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.579782 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d3220aa6-97e3-4ea7-8959-fd0d11002f32-sys\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.579803 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcd22\" (UniqueName: \"kubernetes.io/projected/05679ba1-ef84-46c5-803d-22379bb824dd-kube-api-access-tcd22\") pod \"openshift-state-metrics-566fddb674-q6xsd\" (UID: \"05679ba1-ef84-46c5-803d-22379bb824dd\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.579831 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d3220aa6-97e3-4ea7-8959-fd0d11002f32-node-exporter-textfile\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.579862 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d3220aa6-97e3-4ea7-8959-fd0d11002f32-node-exporter-tls\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.579884 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/063cd9dd-e128-4dd5-af7b-a3b79b93c61a-metrics-client-ca\") pod 
\"kube-state-metrics-777cb5bd5d-qnh2f\" (UID: \"063cd9dd-e128-4dd5-af7b-a3b79b93c61a\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.579929 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d3220aa6-97e3-4ea7-8959-fd0d11002f32-node-exporter-wtmp\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.580046 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/063cd9dd-e128-4dd5-af7b-a3b79b93c61a-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-qnh2f\" (UID: \"063cd9dd-e128-4dd5-af7b-a3b79b93c61a\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" Mar 13 14:05:03 crc kubenswrapper[4898]: E0313 14:05:03.580140 4898 secret.go:188] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Mar 13 14:05:03 crc kubenswrapper[4898]: E0313 14:05:03.580210 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/063cd9dd-e128-4dd5-af7b-a3b79b93c61a-kube-state-metrics-tls podName:063cd9dd-e128-4dd5-af7b-a3b79b93c61a nodeName:}" failed. No retries permitted until 2026-03-13 14:05:04.080187546 +0000 UTC m=+539.081775785 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/063cd9dd-e128-4dd5-af7b-a3b79b93c61a-kube-state-metrics-tls") pod "kube-state-metrics-777cb5bd5d-qnh2f" (UID: "063cd9dd-e128-4dd5-af7b-a3b79b93c61a") : secret "kube-state-metrics-tls" not found Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.581013 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/063cd9dd-e128-4dd5-af7b-a3b79b93c61a-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-qnh2f\" (UID: \"063cd9dd-e128-4dd5-af7b-a3b79b93c61a\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.581019 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/063cd9dd-e128-4dd5-af7b-a3b79b93c61a-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-qnh2f\" (UID: \"063cd9dd-e128-4dd5-af7b-a3b79b93c61a\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.581027 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/05679ba1-ef84-46c5-803d-22379bb824dd-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-q6xsd\" (UID: \"05679ba1-ef84-46c5-803d-22379bb824dd\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.587808 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/05679ba1-ef84-46c5-803d-22379bb824dd-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-q6xsd\" (UID: 
\"05679ba1-ef84-46c5-803d-22379bb824dd\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.591546 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/05679ba1-ef84-46c5-803d-22379bb824dd-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-q6xsd\" (UID: \"05679ba1-ef84-46c5-803d-22379bb824dd\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.594355 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/063cd9dd-e128-4dd5-af7b-a3b79b93c61a-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-qnh2f\" (UID: \"063cd9dd-e128-4dd5-af7b-a3b79b93c61a\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.601219 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkj4b\" (UniqueName: \"kubernetes.io/projected/063cd9dd-e128-4dd5-af7b-a3b79b93c61a-kube-api-access-wkj4b\") pod \"kube-state-metrics-777cb5bd5d-qnh2f\" (UID: \"063cd9dd-e128-4dd5-af7b-a3b79b93c61a\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.603495 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcd22\" (UniqueName: \"kubernetes.io/projected/05679ba1-ef84-46c5-803d-22379bb824dd-kube-api-access-tcd22\") pod \"openshift-state-metrics-566fddb674-q6xsd\" (UID: \"05679ba1-ef84-46c5-803d-22379bb824dd\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.681364 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d3220aa6-97e3-4ea7-8959-fd0d11002f32-metrics-client-ca\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.681425 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d3220aa6-97e3-4ea7-8959-fd0d11002f32-root\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.681452 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d3220aa6-97e3-4ea7-8959-fd0d11002f32-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.681474 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klp29\" (UniqueName: \"kubernetes.io/projected/d3220aa6-97e3-4ea7-8959-fd0d11002f32-kube-api-access-klp29\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.681492 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d3220aa6-97e3-4ea7-8959-fd0d11002f32-sys\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.681511 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/d3220aa6-97e3-4ea7-8959-fd0d11002f32-node-exporter-textfile\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.681536 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d3220aa6-97e3-4ea7-8959-fd0d11002f32-node-exporter-tls\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.681554 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d3220aa6-97e3-4ea7-8959-fd0d11002f32-node-exporter-wtmp\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.681680 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d3220aa6-97e3-4ea7-8959-fd0d11002f32-root\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.681760 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d3220aa6-97e3-4ea7-8959-fd0d11002f32-node-exporter-wtmp\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.681813 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d3220aa6-97e3-4ea7-8959-fd0d11002f32-sys\") pod \"node-exporter-h4spr\" (UID: 
\"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.682034 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d3220aa6-97e3-4ea7-8959-fd0d11002f32-node-exporter-textfile\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.682095 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d3220aa6-97e3-4ea7-8959-fd0d11002f32-metrics-client-ca\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.684337 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d3220aa6-97e3-4ea7-8959-fd0d11002f32-node-exporter-tls\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.698258 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klp29\" (UniqueName: \"kubernetes.io/projected/d3220aa6-97e3-4ea7-8959-fd0d11002f32-kube-api-access-klp29\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " pod="openshift-monitoring/node-exporter-h4spr" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.699584 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d3220aa6-97e3-4ea7-8959-fd0d11002f32-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-h4spr\" (UID: \"d3220aa6-97e3-4ea7-8959-fd0d11002f32\") " 
pod="openshift-monitoring/node-exporter-h4spr" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.769051 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd" Mar 13 14:05:03 crc kubenswrapper[4898]: I0313 14:05:03.805410 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-h4spr" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.085259 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/063cd9dd-e128-4dd5-af7b-a3b79b93c61a-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-qnh2f\" (UID: \"063cd9dd-e128-4dd5-af7b-a3b79b93c61a\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.090472 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/063cd9dd-e128-4dd5-af7b-a3b79b93c61a-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-qnh2f\" (UID: \"063cd9dd-e128-4dd5-af7b-a3b79b93c61a\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.151348 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd"] Mar 13 14:05:04 crc kubenswrapper[4898]: W0313 14:05:04.157408 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05679ba1_ef84_46c5_803d_22379bb824dd.slice/crio-2f52a456328bec51af8db2165a61503468dfc52b489c55e0b7e01be76c95eb2e WatchSource:0}: Error finding container 2f52a456328bec51af8db2165a61503468dfc52b489c55e0b7e01be76c95eb2e: Status 404 returned error can't find the container with id 
2f52a456328bec51af8db2165a61503468dfc52b489c55e0b7e01be76c95eb2e Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.351460 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h4spr" event={"ID":"d3220aa6-97e3-4ea7-8959-fd0d11002f32","Type":"ContainerStarted","Data":"21c8b6e4b700e648f1fdd42150db9b3759c42657a6c27a2795f3cd6529da1c09"} Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.352921 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd" event={"ID":"05679ba1-ef84-46c5-803d-22379bb824dd","Type":"ContainerStarted","Data":"2586619c7d0dd6fbc1322feadf7055f3590cb5b1d46371a9eb057bf0a29befc8"} Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.352963 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd" event={"ID":"05679ba1-ef84-46c5-803d-22379bb824dd","Type":"ContainerStarted","Data":"2f52a456328bec51af8db2165a61503468dfc52b489c55e0b7e01be76c95eb2e"} Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.383456 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.561089 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.562815 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.565815 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.566910 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.567101 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.567669 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.593030 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-config-volume\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.593065 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.593090 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: 
\"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.593108 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-config-out\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.593127 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.593141 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.593161 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.593180 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.593205 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tknc\" (UniqueName: \"kubernetes.io/projected/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-kube-api-access-7tknc\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.593221 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.593239 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-web-config\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.593256 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.600383 4898 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.601316 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.602911 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-gkpzt" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.603061 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.620001 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.623295 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.694991 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tknc\" (UniqueName: \"kubernetes.io/projected/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-kube-api-access-7tknc\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.695045 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.695078 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-web-config\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.695109 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.695152 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-config-volume\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.695169 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.695190 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.695210 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-config-out\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.695236 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.695259 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.695291 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.695321 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.696573 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" 
(UniqueName: \"kubernetes.io/empty-dir/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.697060 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.697275 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.701570 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-web-config\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.701624 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-config-volume\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.701726 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-tls-assets\") pod 
\"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.702114 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.702959 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.703674 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-config-out\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.721875 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.731458 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.732130 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tknc\" (UniqueName: \"kubernetes.io/projected/7b661f3a-62af-4aba-b8b3-e73b32d3da2d-kube-api-access-7tknc\") pod \"alertmanager-main-0\" (UID: \"7b661f3a-62af-4aba-b8b3-e73b32d3da2d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.824176 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f"] Mar 13 14:05:04 crc kubenswrapper[4898]: I0313 14:05:04.929389 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.358083 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd" event={"ID":"05679ba1-ef84-46c5-803d-22379bb824dd","Type":"ContainerStarted","Data":"75ab1ac97a281c9bd3ebcc8290291a51649968261363be4992a9a961f9786e53"} Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.360015 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h4spr" event={"ID":"d3220aa6-97e3-4ea7-8959-fd0d11002f32","Type":"ContainerStarted","Data":"25b29f01adfa51a37e38274268b70502cdbef39be471bd5cb6aa12fb2ed45f71"} Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.361875 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" event={"ID":"063cd9dd-e128-4dd5-af7b-a3b79b93c61a","Type":"ContainerStarted","Data":"59238ce32ef31532c8d7e5cf21f8a8d4e41be0cc85ab7ad230443bcd6792c483"} Mar 13 14:05:05 crc 
kubenswrapper[4898]: I0313 14:05:05.469055 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.514019 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp"] Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.516164 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.521143 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.521631 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.522943 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-6939h072j9qpn" Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.522952 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-hpxqn" Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.526375 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.526604 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.531533 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.531989 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp"] Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.609000 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.609049 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-secret-thanos-querier-tls\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.609072 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.609097 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-metrics-client-ca\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.609135 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-secret-grpc-tls\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.609205 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stmhr\" (UniqueName: \"kubernetes.io/projected/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-kube-api-access-stmhr\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.609233 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.609265 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.710499 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stmhr\" (UniqueName: \"kubernetes.io/projected/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-kube-api-access-stmhr\") pod 
\"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.710553 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.710617 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.710635 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.711122 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-secret-thanos-querier-tls\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 
14:05:05.711267 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.711294 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-metrics-client-ca\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.711325 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-secret-grpc-tls\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.712346 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.712408 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-metrics-client-ca\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.712549 4898 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.712740 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.713141 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.713342 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-6939h072j9qpn" Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.713376 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.726961 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-secret-thanos-querier-tls\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.727347 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.734100 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-secret-grpc-tls\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: 
\"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.734460 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.734586 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.735510 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.741781 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stmhr\" (UniqueName: \"kubernetes.io/projected/05b901e7-b9fc-4403-bcc2-8eeb2731c66f-kube-api-access-stmhr\") pod \"thanos-querier-7467c7fcf7-hsxhp\" (UID: \"05b901e7-b9fc-4403-bcc2-8eeb2731c66f\") " pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.848190 4898 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-monitoring"/"thanos-querier-dockercfg-hpxqn" Mar 13 14:05:05 crc kubenswrapper[4898]: I0313 14:05:05.856886 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" Mar 13 14:05:06 crc kubenswrapper[4898]: I0313 14:05:06.370472 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd" event={"ID":"05679ba1-ef84-46c5-803d-22379bb824dd","Type":"ContainerStarted","Data":"1abd3ecd125e08e504ab669fb6513e625bd9e4c7c236caa9ce8c1e0c8c7c46f3"} Mar 13 14:05:06 crc kubenswrapper[4898]: I0313 14:05:06.373157 4898 generic.go:334] "Generic (PLEG): container finished" podID="d3220aa6-97e3-4ea7-8959-fd0d11002f32" containerID="25b29f01adfa51a37e38274268b70502cdbef39be471bd5cb6aa12fb2ed45f71" exitCode=0 Mar 13 14:05:06 crc kubenswrapper[4898]: I0313 14:05:06.373231 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h4spr" event={"ID":"d3220aa6-97e3-4ea7-8959-fd0d11002f32","Type":"ContainerDied","Data":"25b29f01adfa51a37e38274268b70502cdbef39be471bd5cb6aa12fb2ed45f71"} Mar 13 14:05:06 crc kubenswrapper[4898]: I0313 14:05:06.375344 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7b661f3a-62af-4aba-b8b3-e73b32d3da2d","Type":"ContainerStarted","Data":"44449a25c83109f4bd64c058609de98b6c854b7d955da87087986a77139e839c"} Mar 13 14:05:06 crc kubenswrapper[4898]: I0313 14:05:06.401326 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-566fddb674-q6xsd" podStartSLOduration=2.023513292 podStartE2EDuration="3.401307515s" podCreationTimestamp="2026-03-13 14:05:03 +0000 UTC" firstStartedPulling="2026-03-13 14:05:04.38487198 +0000 UTC m=+539.386460219" lastFinishedPulling="2026-03-13 14:05:05.762666203 +0000 UTC m=+540.764254442" observedRunningTime="2026-03-13 
14:05:06.393784677 +0000 UTC m=+541.395372926" watchObservedRunningTime="2026-03-13 14:05:06.401307515 +0000 UTC m=+541.402895754" Mar 13 14:05:06 crc kubenswrapper[4898]: I0313 14:05:06.821519 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp"] Mar 13 14:05:07 crc kubenswrapper[4898]: I0313 14:05:07.383892 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h4spr" event={"ID":"d3220aa6-97e3-4ea7-8959-fd0d11002f32","Type":"ContainerStarted","Data":"05f0ba9a6501a12eb8926db452c6f724fbe7fef7b0c2645a67533c1d1046786a"} Mar 13 14:05:07 crc kubenswrapper[4898]: W0313 14:05:07.509421 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05b901e7_b9fc_4403_bcc2_8eeb2731c66f.slice/crio-12ab36caa7395831a23950389b9e637e57a337dae672f023cd9cda5e86131898 WatchSource:0}: Error finding container 12ab36caa7395831a23950389b9e637e57a337dae672f023cd9cda5e86131898: Status 404 returned error can't find the container with id 12ab36caa7395831a23950389b9e637e57a337dae672f023cd9cda5e86131898 Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.259204 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-875645f9-l5trk"] Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.260462 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.274753 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-875645f9-l5trk"] Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.350385 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-service-ca\") pod \"console-875645f9-l5trk\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.350438 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-oauth-serving-cert\") pod \"console-875645f9-l5trk\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.350507 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/441598c2-1b20-4109-8e38-46d414df93d7-console-serving-cert\") pod \"console-875645f9-l5trk\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.350547 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-trusted-ca-bundle\") pod \"console-875645f9-l5trk\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.350568 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fgxp\" (UniqueName: \"kubernetes.io/projected/441598c2-1b20-4109-8e38-46d414df93d7-kube-api-access-4fgxp\") pod \"console-875645f9-l5trk\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.350589 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/441598c2-1b20-4109-8e38-46d414df93d7-console-oauth-config\") pod \"console-875645f9-l5trk\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.350643 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-console-config\") pod \"console-875645f9-l5trk\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.391994 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h4spr" event={"ID":"d3220aa6-97e3-4ea7-8959-fd0d11002f32","Type":"ContainerStarted","Data":"50c79a9d2b1c0f1e8be47aff36b9b1de95ecd73a11e9868c136f82fd89dbd95e"} Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.393228 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" event={"ID":"05b901e7-b9fc-4403-bcc2-8eeb2731c66f","Type":"ContainerStarted","Data":"12ab36caa7395831a23950389b9e637e57a337dae672f023cd9cda5e86131898"} Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.395454 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" 
event={"ID":"063cd9dd-e128-4dd5-af7b-a3b79b93c61a","Type":"ContainerStarted","Data":"92fe7f4e343c990e5dc068a2e3be1a740ffd10d64c2fff90fe3a3d5fd69afbca"} Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.395492 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" event={"ID":"063cd9dd-e128-4dd5-af7b-a3b79b93c61a","Type":"ContainerStarted","Data":"c65e95ec1ee4a5f8e453b99eca36b2d1d4006376063fd026a37c70230a16e549"} Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.395501 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" event={"ID":"063cd9dd-e128-4dd5-af7b-a3b79b93c61a","Type":"ContainerStarted","Data":"649d5b47768b12df54e2db464a178d257ba693270f5c4a3a9639d65496664fd1"} Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.403995 4898 generic.go:334] "Generic (PLEG): container finished" podID="7b661f3a-62af-4aba-b8b3-e73b32d3da2d" containerID="dbd117fda1cffb0b08da30191bbf5415043dc7562e59f0b60385d92842e64d53" exitCode=0 Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.404058 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7b661f3a-62af-4aba-b8b3-e73b32d3da2d","Type":"ContainerDied","Data":"dbd117fda1cffb0b08da30191bbf5415043dc7562e59f0b60385d92842e64d53"} Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.413750 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-h4spr" podStartSLOduration=4.138837281 podStartE2EDuration="5.413725534s" podCreationTimestamp="2026-03-13 14:05:03 +0000 UTC" firstStartedPulling="2026-03-13 14:05:03.846746124 +0000 UTC m=+538.848334363" lastFinishedPulling="2026-03-13 14:05:05.121634377 +0000 UTC m=+540.123222616" observedRunningTime="2026-03-13 14:05:08.407184062 +0000 UTC m=+543.408772311" watchObservedRunningTime="2026-03-13 14:05:08.413725534 +0000 UTC 
m=+543.415313773" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.434154 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-qnh2f" podStartSLOduration=2.751333705 podStartE2EDuration="5.434136892s" podCreationTimestamp="2026-03-13 14:05:03 +0000 UTC" firstStartedPulling="2026-03-13 14:05:04.832620508 +0000 UTC m=+539.834208747" lastFinishedPulling="2026-03-13 14:05:07.515423695 +0000 UTC m=+542.517011934" observedRunningTime="2026-03-13 14:05:08.4238026 +0000 UTC m=+543.425390859" watchObservedRunningTime="2026-03-13 14:05:08.434136892 +0000 UTC m=+543.435725131" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.452306 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fgxp\" (UniqueName: \"kubernetes.io/projected/441598c2-1b20-4109-8e38-46d414df93d7-kube-api-access-4fgxp\") pod \"console-875645f9-l5trk\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.452376 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/441598c2-1b20-4109-8e38-46d414df93d7-console-oauth-config\") pod \"console-875645f9-l5trk\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.452406 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-console-config\") pod \"console-875645f9-l5trk\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.452458 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-service-ca\") pod \"console-875645f9-l5trk\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.452495 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-oauth-serving-cert\") pod \"console-875645f9-l5trk\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.452584 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/441598c2-1b20-4109-8e38-46d414df93d7-console-serving-cert\") pod \"console-875645f9-l5trk\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.452666 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-trusted-ca-bundle\") pod \"console-875645f9-l5trk\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.453990 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-console-config\") pod \"console-875645f9-l5trk\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.454028 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-oauth-serving-cert\") pod \"console-875645f9-l5trk\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.454821 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-trusted-ca-bundle\") pod \"console-875645f9-l5trk\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.455262 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-service-ca\") pod \"console-875645f9-l5trk\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.459035 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/441598c2-1b20-4109-8e38-46d414df93d7-console-serving-cert\") pod \"console-875645f9-l5trk\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.468674 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/441598c2-1b20-4109-8e38-46d414df93d7-console-oauth-config\") pod \"console-875645f9-l5trk\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.490821 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fgxp\" (UniqueName: \"kubernetes.io/projected/441598c2-1b20-4109-8e38-46d414df93d7-kube-api-access-4fgxp\") 
pod \"console-875645f9-l5trk\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.579045 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.757350 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr"] Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.758116 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.760652 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.760771 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.760789 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.760882 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-r5574" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.761208 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-412985lg3l7cn" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.761300 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.762036 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr"] Mar 13 14:05:08 crc kubenswrapper[4898]: 
I0313 14:05:08.856274 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/823ccfb8-89eb-409e-9c6c-579bacb35ea1-audit-log\") pod \"metrics-server-7b77fdd7dd-vwwfr\" (UID: \"823ccfb8-89eb-409e-9c6c-579bacb35ea1\") " pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.856326 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/823ccfb8-89eb-409e-9c6c-579bacb35ea1-secret-metrics-server-tls\") pod \"metrics-server-7b77fdd7dd-vwwfr\" (UID: \"823ccfb8-89eb-409e-9c6c-579bacb35ea1\") " pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.856406 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/823ccfb8-89eb-409e-9c6c-579bacb35ea1-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7b77fdd7dd-vwwfr\" (UID: \"823ccfb8-89eb-409e-9c6c-579bacb35ea1\") " pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.856438 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/823ccfb8-89eb-409e-9c6c-579bacb35ea1-secret-metrics-client-certs\") pod \"metrics-server-7b77fdd7dd-vwwfr\" (UID: \"823ccfb8-89eb-409e-9c6c-579bacb35ea1\") " pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.856491 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: 
\"kubernetes.io/configmap/823ccfb8-89eb-409e-9c6c-579bacb35ea1-metrics-server-audit-profiles\") pod \"metrics-server-7b77fdd7dd-vwwfr\" (UID: \"823ccfb8-89eb-409e-9c6c-579bacb35ea1\") " pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.856539 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrmjw\" (UniqueName: \"kubernetes.io/projected/823ccfb8-89eb-409e-9c6c-579bacb35ea1-kube-api-access-hrmjw\") pod \"metrics-server-7b77fdd7dd-vwwfr\" (UID: \"823ccfb8-89eb-409e-9c6c-579bacb35ea1\") " pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.856555 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/823ccfb8-89eb-409e-9c6c-579bacb35ea1-client-ca-bundle\") pod \"metrics-server-7b77fdd7dd-vwwfr\" (UID: \"823ccfb8-89eb-409e-9c6c-579bacb35ea1\") " pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.951090 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-875645f9-l5trk"] Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.958449 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/823ccfb8-89eb-409e-9c6c-579bacb35ea1-audit-log\") pod \"metrics-server-7b77fdd7dd-vwwfr\" (UID: \"823ccfb8-89eb-409e-9c6c-579bacb35ea1\") " pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.958519 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/823ccfb8-89eb-409e-9c6c-579bacb35ea1-secret-metrics-server-tls\") pod \"metrics-server-7b77fdd7dd-vwwfr\" (UID: 
\"823ccfb8-89eb-409e-9c6c-579bacb35ea1\") " pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.958569 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/823ccfb8-89eb-409e-9c6c-579bacb35ea1-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7b77fdd7dd-vwwfr\" (UID: \"823ccfb8-89eb-409e-9c6c-579bacb35ea1\") " pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.958594 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/823ccfb8-89eb-409e-9c6c-579bacb35ea1-secret-metrics-client-certs\") pod \"metrics-server-7b77fdd7dd-vwwfr\" (UID: \"823ccfb8-89eb-409e-9c6c-579bacb35ea1\") " pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.958679 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/823ccfb8-89eb-409e-9c6c-579bacb35ea1-metrics-server-audit-profiles\") pod \"metrics-server-7b77fdd7dd-vwwfr\" (UID: \"823ccfb8-89eb-409e-9c6c-579bacb35ea1\") " pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.958727 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrmjw\" (UniqueName: \"kubernetes.io/projected/823ccfb8-89eb-409e-9c6c-579bacb35ea1-kube-api-access-hrmjw\") pod \"metrics-server-7b77fdd7dd-vwwfr\" (UID: \"823ccfb8-89eb-409e-9c6c-579bacb35ea1\") " pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.958750 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/823ccfb8-89eb-409e-9c6c-579bacb35ea1-client-ca-bundle\") pod \"metrics-server-7b77fdd7dd-vwwfr\" (UID: \"823ccfb8-89eb-409e-9c6c-579bacb35ea1\") " pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.959486 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/823ccfb8-89eb-409e-9c6c-579bacb35ea1-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7b77fdd7dd-vwwfr\" (UID: \"823ccfb8-89eb-409e-9c6c-579bacb35ea1\") " pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.959835 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/823ccfb8-89eb-409e-9c6c-579bacb35ea1-audit-log\") pod \"metrics-server-7b77fdd7dd-vwwfr\" (UID: \"823ccfb8-89eb-409e-9c6c-579bacb35ea1\") " pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.961320 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/823ccfb8-89eb-409e-9c6c-579bacb35ea1-metrics-server-audit-profiles\") pod \"metrics-server-7b77fdd7dd-vwwfr\" (UID: \"823ccfb8-89eb-409e-9c6c-579bacb35ea1\") " pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.965029 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/823ccfb8-89eb-409e-9c6c-579bacb35ea1-client-ca-bundle\") pod \"metrics-server-7b77fdd7dd-vwwfr\" (UID: \"823ccfb8-89eb-409e-9c6c-579bacb35ea1\") " pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.965715 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/823ccfb8-89eb-409e-9c6c-579bacb35ea1-secret-metrics-client-certs\") pod \"metrics-server-7b77fdd7dd-vwwfr\" (UID: \"823ccfb8-89eb-409e-9c6c-579bacb35ea1\") " pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.965805 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/823ccfb8-89eb-409e-9c6c-579bacb35ea1-secret-metrics-server-tls\") pod \"metrics-server-7b77fdd7dd-vwwfr\" (UID: \"823ccfb8-89eb-409e-9c6c-579bacb35ea1\") " pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:08 crc kubenswrapper[4898]: I0313 14:05:08.975610 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrmjw\" (UniqueName: \"kubernetes.io/projected/823ccfb8-89eb-409e-9c6c-579bacb35ea1-kube-api-access-hrmjw\") pod \"metrics-server-7b77fdd7dd-vwwfr\" (UID: \"823ccfb8-89eb-409e-9c6c-579bacb35ea1\") " pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.081237 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.291623 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-595dc77696-pft4c"] Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.292472 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-595dc77696-pft4c" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.294099 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-6tstp" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.294506 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.305473 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-595dc77696-pft4c"] Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.365753 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/10c7ab08-2341-4e85-ad67-8495e038afa2-monitoring-plugin-cert\") pod \"monitoring-plugin-595dc77696-pft4c\" (UID: \"10c7ab08-2341-4e85-ad67-8495e038afa2\") " pod="openshift-monitoring/monitoring-plugin-595dc77696-pft4c" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.467034 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/10c7ab08-2341-4e85-ad67-8495e038afa2-monitoring-plugin-cert\") pod \"monitoring-plugin-595dc77696-pft4c\" (UID: \"10c7ab08-2341-4e85-ad67-8495e038afa2\") " pod="openshift-monitoring/monitoring-plugin-595dc77696-pft4c" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.471347 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/10c7ab08-2341-4e85-ad67-8495e038afa2-monitoring-plugin-cert\") pod \"monitoring-plugin-595dc77696-pft4c\" (UID: \"10c7ab08-2341-4e85-ad67-8495e038afa2\") " pod="openshift-monitoring/monitoring-plugin-595dc77696-pft4c" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.616961 4898 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-595dc77696-pft4c" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.718751 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.720740 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.726313 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.726342 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.726380 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-bje79e1t761so" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.726741 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.727505 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.727659 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.728052 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.729508 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-q952q" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 
14:05:09.730550 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.730724 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.732934 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.736093 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.740089 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.762243 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.771228 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1b691eb6-70f2-4fce-b18a-1d7712fddcac-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.771271 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-web-config\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.771298 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.771319 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b691eb6-70f2-4fce-b18a-1d7712fddcac-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.771362 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.771405 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b691eb6-70f2-4fce-b18a-1d7712fddcac-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.771423 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc 
kubenswrapper[4898]: I0313 14:05:09.771473 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b691eb6-70f2-4fce-b18a-1d7712fddcac-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.771497 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1b691eb6-70f2-4fce-b18a-1d7712fddcac-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.771525 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.771547 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1b691eb6-70f2-4fce-b18a-1d7712fddcac-config-out\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.771566 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: 
\"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.771587 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1b691eb6-70f2-4fce-b18a-1d7712fddcac-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.771606 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmss4\" (UniqueName: \"kubernetes.io/projected/1b691eb6-70f2-4fce-b18a-1d7712fddcac-kube-api-access-mmss4\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.771628 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-config\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.771657 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.771695 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.771739 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1b691eb6-70f2-4fce-b18a-1d7712fddcac-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.872611 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.872656 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b691eb6-70f2-4fce-b18a-1d7712fddcac-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.872687 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.872713 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b691eb6-70f2-4fce-b18a-1d7712fddcac-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.872728 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.872760 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b691eb6-70f2-4fce-b18a-1d7712fddcac-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.872791 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1b691eb6-70f2-4fce-b18a-1d7712fddcac-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.872808 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.872828 4898 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.872844 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1b691eb6-70f2-4fce-b18a-1d7712fddcac-config-out\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.872871 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1b691eb6-70f2-4fce-b18a-1d7712fddcac-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.872891 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmss4\" (UniqueName: \"kubernetes.io/projected/1b691eb6-70f2-4fce-b18a-1d7712fddcac-kube-api-access-mmss4\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.872931 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-config\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.872959 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.872982 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.873022 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1b691eb6-70f2-4fce-b18a-1d7712fddcac-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.873047 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1b691eb6-70f2-4fce-b18a-1d7712fddcac-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.873068 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-web-config\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.874507 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1b691eb6-70f2-4fce-b18a-1d7712fddcac-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.881284 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b691eb6-70f2-4fce-b18a-1d7712fddcac-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.881371 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.881552 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b691eb6-70f2-4fce-b18a-1d7712fddcac-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.886331 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1b691eb6-70f2-4fce-b18a-1d7712fddcac-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.891910 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-config\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.895408 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.895433 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.895561 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.895881 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-web-config\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.896175 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/1b691eb6-70f2-4fce-b18a-1d7712fddcac-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.896309 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.896471 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.896490 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1b691eb6-70f2-4fce-b18a-1d7712fddcac-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.899114 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1b691eb6-70f2-4fce-b18a-1d7712fddcac-config-out\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.899428 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmss4\" (UniqueName: \"kubernetes.io/projected/1b691eb6-70f2-4fce-b18a-1d7712fddcac-kube-api-access-mmss4\") pod \"prometheus-k8s-0\" (UID: 
\"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.899832 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1b691eb6-70f2-4fce-b18a-1d7712fddcac-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:09 crc kubenswrapper[4898]: I0313 14:05:09.903578 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1b691eb6-70f2-4fce-b18a-1d7712fddcac-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1b691eb6-70f2-4fce-b18a-1d7712fddcac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:10 crc kubenswrapper[4898]: I0313 14:05:10.039484 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:10 crc kubenswrapper[4898]: I0313 14:05:10.416018 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-875645f9-l5trk" event={"ID":"441598c2-1b20-4109-8e38-46d414df93d7","Type":"ContainerStarted","Data":"a20af1488f994221a262c4ea1f9370dd06f8b682113715c773da210306924f29"} Mar 13 14:05:10 crc kubenswrapper[4898]: I0313 14:05:10.917743 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-595dc77696-pft4c"] Mar 13 14:05:10 crc kubenswrapper[4898]: W0313 14:05:10.924870 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c7ab08_2341_4e85_ad67_8495e038afa2.slice/crio-62604b3b51de429492ff009f68bec3915d8f3984dc1a0a393b90594d41dfa2e4 WatchSource:0}: Error finding container 62604b3b51de429492ff009f68bec3915d8f3984dc1a0a393b90594d41dfa2e4: Status 404 returned error can't find the container with id 62604b3b51de429492ff009f68bec3915d8f3984dc1a0a393b90594d41dfa2e4 Mar 13 14:05:10 crc kubenswrapper[4898]: I0313 14:05:10.983720 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr"] Mar 13 14:05:10 crc kubenswrapper[4898]: W0313 14:05:10.985196 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod823ccfb8_89eb_409e_9c6c_579bacb35ea1.slice/crio-5e57710f6a55b2fa278ab74e2f83d33634aa2d785f588334ad38b0e4d9e493b0 WatchSource:0}: Error finding container 5e57710f6a55b2fa278ab74e2f83d33634aa2d785f588334ad38b0e4d9e493b0: Status 404 returned error can't find the container with id 5e57710f6a55b2fa278ab74e2f83d33634aa2d785f588334ad38b0e4d9e493b0 Mar 13 14:05:11 crc kubenswrapper[4898]: I0313 14:05:11.003671 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 13 
14:05:11 crc kubenswrapper[4898]: W0313 14:05:11.009764 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b691eb6_70f2_4fce_b18a_1d7712fddcac.slice/crio-fdbab1687a275cab6cb662d706cffe7544a58fc3fce7b082056bc90b056297a3 WatchSource:0}: Error finding container fdbab1687a275cab6cb662d706cffe7544a58fc3fce7b082056bc90b056297a3: Status 404 returned error can't find the container with id fdbab1687a275cab6cb662d706cffe7544a58fc3fce7b082056bc90b056297a3 Mar 13 14:05:11 crc kubenswrapper[4898]: I0313 14:05:11.429801 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" event={"ID":"05b901e7-b9fc-4403-bcc2-8eeb2731c66f","Type":"ContainerStarted","Data":"2354e03dcd58942ac6f93cec12224a2608329553c2b2c49b8f91b03bc615fdbe"} Mar 13 14:05:11 crc kubenswrapper[4898]: I0313 14:05:11.429862 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" event={"ID":"05b901e7-b9fc-4403-bcc2-8eeb2731c66f","Type":"ContainerStarted","Data":"e5a2b5f33a3f742aa6d7bab79a5aa9b38ee1d6d22ff47e353e2a8fb4d1aace32"} Mar 13 14:05:11 crc kubenswrapper[4898]: I0313 14:05:11.429882 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" event={"ID":"05b901e7-b9fc-4403-bcc2-8eeb2731c66f","Type":"ContainerStarted","Data":"8185fba5582cf9528e5bb879c85b04febd0357bd5767c9de1b452c97fe9bed77"} Mar 13 14:05:11 crc kubenswrapper[4898]: I0313 14:05:11.431042 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" event={"ID":"823ccfb8-89eb-409e-9c6c-579bacb35ea1","Type":"ContainerStarted","Data":"5e57710f6a55b2fa278ab74e2f83d33634aa2d785f588334ad38b0e4d9e493b0"} Mar 13 14:05:11 crc kubenswrapper[4898]: I0313 14:05:11.432693 4898 generic.go:334] "Generic (PLEG): container finished" 
podID="1b691eb6-70f2-4fce-b18a-1d7712fddcac" containerID="b90c3c464c21a8906faeff48a497518f1d7a734ad42f83941bb7d88e97c809e2" exitCode=0 Mar 13 14:05:11 crc kubenswrapper[4898]: I0313 14:05:11.432785 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1b691eb6-70f2-4fce-b18a-1d7712fddcac","Type":"ContainerDied","Data":"b90c3c464c21a8906faeff48a497518f1d7a734ad42f83941bb7d88e97c809e2"} Mar 13 14:05:11 crc kubenswrapper[4898]: I0313 14:05:11.432814 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1b691eb6-70f2-4fce-b18a-1d7712fddcac","Type":"ContainerStarted","Data":"fdbab1687a275cab6cb662d706cffe7544a58fc3fce7b082056bc90b056297a3"} Mar 13 14:05:11 crc kubenswrapper[4898]: I0313 14:05:11.438644 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7b661f3a-62af-4aba-b8b3-e73b32d3da2d","Type":"ContainerStarted","Data":"daed6cdf9dd863bd1d3b98928facf8c9e86c39b1f5150904deeaed4090d9f3e4"} Mar 13 14:05:11 crc kubenswrapper[4898]: I0313 14:05:11.438816 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7b661f3a-62af-4aba-b8b3-e73b32d3da2d","Type":"ContainerStarted","Data":"60a2fd79d3195519ab633d3a18a4153d52732eda93d09a994a6ee8458c9762e0"} Mar 13 14:05:11 crc kubenswrapper[4898]: I0313 14:05:11.438978 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7b661f3a-62af-4aba-b8b3-e73b32d3da2d","Type":"ContainerStarted","Data":"9ada23225913ee8e09566c2ff4c3b32ea83ff60825c4cbf648ccd7132d877700"} Mar 13 14:05:11 crc kubenswrapper[4898]: I0313 14:05:11.439849 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"7b661f3a-62af-4aba-b8b3-e73b32d3da2d","Type":"ContainerStarted","Data":"91f77999fa41e31ae8d1ff8333545f16403f40662037b6574a30f4fad2c0d692"} Mar 13 14:05:11 crc kubenswrapper[4898]: I0313 14:05:11.440000 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7b661f3a-62af-4aba-b8b3-e73b32d3da2d","Type":"ContainerStarted","Data":"bc05756d607a017a68de11fdeaf501352b3479c54a1c0a39cdf9f5fdc1fce145"} Mar 13 14:05:11 crc kubenswrapper[4898]: I0313 14:05:11.442749 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-595dc77696-pft4c" event={"ID":"10c7ab08-2341-4e85-ad67-8495e038afa2","Type":"ContainerStarted","Data":"62604b3b51de429492ff009f68bec3915d8f3984dc1a0a393b90594d41dfa2e4"} Mar 13 14:05:11 crc kubenswrapper[4898]: I0313 14:05:11.447167 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-875645f9-l5trk" event={"ID":"441598c2-1b20-4109-8e38-46d414df93d7","Type":"ContainerStarted","Data":"5b6df02b2bb76e83e9982096d58a04d7e81453e98562dcd5520b3ecccb032575"} Mar 13 14:05:11 crc kubenswrapper[4898]: I0313 14:05:11.507002 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-875645f9-l5trk" podStartSLOduration=3.5069833470000003 podStartE2EDuration="3.506983347s" podCreationTimestamp="2026-03-13 14:05:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:05:11.502619662 +0000 UTC m=+546.504207901" watchObservedRunningTime="2026-03-13 14:05:11.506983347 +0000 UTC m=+546.508571596" Mar 13 14:05:12 crc kubenswrapper[4898]: I0313 14:05:12.458517 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"7b661f3a-62af-4aba-b8b3-e73b32d3da2d","Type":"ContainerStarted","Data":"dc23bdd0fbdab75d6e63ce7f7091dcecdeb9c29ac1cf9f0f6e51ce70f5a235a1"} Mar 13 14:05:12 crc kubenswrapper[4898]: I0313 14:05:12.464356 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" event={"ID":"05b901e7-b9fc-4403-bcc2-8eeb2731c66f","Type":"ContainerStarted","Data":"c65dc9b0d95e85171bc80fa0627031c341d8ffdf6c67b5d8c11c5a260352e7e4"} Mar 13 14:05:12 crc kubenswrapper[4898]: I0313 14:05:12.498155 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.275682339 podStartE2EDuration="8.498133321s" podCreationTimestamp="2026-03-13 14:05:04 +0000 UTC" firstStartedPulling="2026-03-13 14:05:05.728464872 +0000 UTC m=+540.730053131" lastFinishedPulling="2026-03-13 14:05:11.950915854 +0000 UTC m=+546.952504113" observedRunningTime="2026-03-13 14:05:12.490193372 +0000 UTC m=+547.491781621" watchObservedRunningTime="2026-03-13 14:05:12.498133321 +0000 UTC m=+547.499721560" Mar 13 14:05:13 crc kubenswrapper[4898]: I0313 14:05:13.471172 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" event={"ID":"823ccfb8-89eb-409e-9c6c-579bacb35ea1","Type":"ContainerStarted","Data":"7046ebdd84c06a54a3ad07946403d79aec9442bfc8bcd95a1021d0c00e48bec6"} Mar 13 14:05:13 crc kubenswrapper[4898]: I0313 14:05:13.474675 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-595dc77696-pft4c" event={"ID":"10c7ab08-2341-4e85-ad67-8495e038afa2","Type":"ContainerStarted","Data":"3e8dba0dbf5089e4ed621f82ae8e6f8700c7b6faa9bab00d5ecd90cc41243753"} Mar 13 14:05:13 crc kubenswrapper[4898]: I0313 14:05:13.474876 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-595dc77696-pft4c" Mar 13 14:05:13 crc 
kubenswrapper[4898]: I0313 14:05:13.477457 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" event={"ID":"05b901e7-b9fc-4403-bcc2-8eeb2731c66f","Type":"ContainerStarted","Data":"7da0f284cffc4722d8a79cecda6ca0f2acc68edb2d757e6ca774bd68cf6bc831"} Mar 13 14:05:13 crc kubenswrapper[4898]: I0313 14:05:13.477485 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" event={"ID":"05b901e7-b9fc-4403-bcc2-8eeb2731c66f","Type":"ContainerStarted","Data":"933b9b5cdfa995bc5a46b6a48dccbe6d92620d14da5454b07037495ad604a71c"} Mar 13 14:05:13 crc kubenswrapper[4898]: I0313 14:05:13.477840 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" Mar 13 14:05:13 crc kubenswrapper[4898]: I0313 14:05:13.482431 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-595dc77696-pft4c" Mar 13 14:05:13 crc kubenswrapper[4898]: I0313 14:05:13.492652 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" podStartSLOduration=3.488273644 podStartE2EDuration="5.492627211s" podCreationTimestamp="2026-03-13 14:05:08 +0000 UTC" firstStartedPulling="2026-03-13 14:05:10.987376068 +0000 UTC m=+545.988964307" lastFinishedPulling="2026-03-13 14:05:12.991729605 +0000 UTC m=+547.993317874" observedRunningTime="2026-03-13 14:05:13.486799818 +0000 UTC m=+548.488388097" watchObservedRunningTime="2026-03-13 14:05:13.492627211 +0000 UTC m=+548.494215450" Mar 13 14:05:13 crc kubenswrapper[4898]: I0313 14:05:13.507461 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-595dc77696-pft4c" podStartSLOduration=2.444549513 podStartE2EDuration="4.507442141s" podCreationTimestamp="2026-03-13 14:05:09 +0000 UTC" 
firstStartedPulling="2026-03-13 14:05:10.927663516 +0000 UTC m=+545.929251755" lastFinishedPulling="2026-03-13 14:05:12.990556144 +0000 UTC m=+547.992144383" observedRunningTime="2026-03-13 14:05:13.501764822 +0000 UTC m=+548.503353091" watchObservedRunningTime="2026-03-13 14:05:13.507442141 +0000 UTC m=+548.509030390" Mar 13 14:05:13 crc kubenswrapper[4898]: I0313 14:05:13.538079 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" podStartSLOduration=4.10000212 podStartE2EDuration="8.538058297s" podCreationTimestamp="2026-03-13 14:05:05 +0000 UTC" firstStartedPulling="2026-03-13 14:05:07.511859391 +0000 UTC m=+542.513447630" lastFinishedPulling="2026-03-13 14:05:11.949915568 +0000 UTC m=+546.951503807" observedRunningTime="2026-03-13 14:05:13.535623193 +0000 UTC m=+548.537211442" watchObservedRunningTime="2026-03-13 14:05:13.538058297 +0000 UTC m=+548.539646546" Mar 13 14:05:15 crc kubenswrapper[4898]: I0313 14:05:15.875764 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" Mar 13 14:05:16 crc kubenswrapper[4898]: I0313 14:05:16.504702 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1b691eb6-70f2-4fce-b18a-1d7712fddcac","Type":"ContainerStarted","Data":"4619c06b3bd1e0f3a3a4c4b63644f93b8683cdc6ac65f1b6cf7811be33750093"} Mar 13 14:05:16 crc kubenswrapper[4898]: I0313 14:05:16.504753 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1b691eb6-70f2-4fce-b18a-1d7712fddcac","Type":"ContainerStarted","Data":"926ae0fab9f011fca5b7f408eeceb6d4898d2953e99a3f2832173a81b2a65936"} Mar 13 14:05:16 crc kubenswrapper[4898]: I0313 14:05:16.504772 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"1b691eb6-70f2-4fce-b18a-1d7712fddcac","Type":"ContainerStarted","Data":"99ab08618d4bde097a16a7acf871a9e1d7b1680c779f6ce94a962af2fd6e6422"} Mar 13 14:05:16 crc kubenswrapper[4898]: I0313 14:05:16.504784 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1b691eb6-70f2-4fce-b18a-1d7712fddcac","Type":"ContainerStarted","Data":"ab8d8a67f9595d2bac345225aeb8ba86283c0942913cc66f6c73cce5aec24cbc"} Mar 13 14:05:16 crc kubenswrapper[4898]: I0313 14:05:16.504796 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1b691eb6-70f2-4fce-b18a-1d7712fddcac","Type":"ContainerStarted","Data":"44b263eb743483388a1167571e8517c9e29fe86f92b9750ec93ac830fad9f8bb"} Mar 13 14:05:16 crc kubenswrapper[4898]: I0313 14:05:16.504808 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1b691eb6-70f2-4fce-b18a-1d7712fddcac","Type":"ContainerStarted","Data":"ab590589ca17f1a027dfd0298e28f244a54c674de69f470ae27e2318f3e3e907"} Mar 13 14:05:16 crc kubenswrapper[4898]: I0313 14:05:16.556258 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=3.452908459 podStartE2EDuration="7.556230474s" podCreationTimestamp="2026-03-13 14:05:09 +0000 UTC" firstStartedPulling="2026-03-13 14:05:11.435456674 +0000 UTC m=+546.437044953" lastFinishedPulling="2026-03-13 14:05:15.538778729 +0000 UTC m=+550.540366968" observedRunningTime="2026-03-13 14:05:16.553453821 +0000 UTC m=+551.555042150" watchObservedRunningTime="2026-03-13 14:05:16.556230474 +0000 UTC m=+551.557818753" Mar 13 14:05:18 crc kubenswrapper[4898]: I0313 14:05:18.579946 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:18 crc kubenswrapper[4898]: I0313 14:05:18.580505 4898 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:18 crc kubenswrapper[4898]: I0313 14:05:18.588143 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:19 crc kubenswrapper[4898]: I0313 14:05:19.536604 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-875645f9-l5trk" Mar 13 14:05:19 crc kubenswrapper[4898]: I0313 14:05:19.602645 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-7l2pm"] Mar 13 14:05:20 crc kubenswrapper[4898]: I0313 14:05:20.040604 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:05:29 crc kubenswrapper[4898]: I0313 14:05:29.081727 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:29 crc kubenswrapper[4898]: I0313 14:05:29.084852 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:44 crc kubenswrapper[4898]: I0313 14:05:44.655205 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-7l2pm" podUID="0ea2e803-34d0-429b-b943-ece0b9e38b63" containerName="console" containerID="cri-o://5f0a71c3382b8e97b2f21cf59a246a72cf36bc90c37659a0655800a7772d93ae" gracePeriod=15 Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:45.742484 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-7l2pm_0ea2e803-34d0-429b-b943-ece0b9e38b63/console/0.log" Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:45.742862 4898 generic.go:334] "Generic (PLEG): container finished" podID="0ea2e803-34d0-429b-b943-ece0b9e38b63" 
containerID="5f0a71c3382b8e97b2f21cf59a246a72cf36bc90c37659a0655800a7772d93ae" exitCode=2 Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:45.751454 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-7l2pm" event={"ID":"0ea2e803-34d0-429b-b943-ece0b9e38b63","Type":"ContainerDied","Data":"5f0a71c3382b8e97b2f21cf59a246a72cf36bc90c37659a0655800a7772d93ae"} Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:46.786870 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-7l2pm_0ea2e803-34d0-429b-b943-ece0b9e38b63/console/0.log" Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:46.787434 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:46.959494 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-oauth-serving-cert\") pod \"0ea2e803-34d0-429b-b943-ece0b9e38b63\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:46.959615 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gq4w8\" (UniqueName: \"kubernetes.io/projected/0ea2e803-34d0-429b-b943-ece0b9e38b63-kube-api-access-gq4w8\") pod \"0ea2e803-34d0-429b-b943-ece0b9e38b63\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:46.959652 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-trusted-ca-bundle\") pod \"0ea2e803-34d0-429b-b943-ece0b9e38b63\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:46.959689 4898 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-service-ca\") pod \"0ea2e803-34d0-429b-b943-ece0b9e38b63\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:46.959796 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0ea2e803-34d0-429b-b943-ece0b9e38b63-console-oauth-config\") pod \"0ea2e803-34d0-429b-b943-ece0b9e38b63\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:46.959842 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ea2e803-34d0-429b-b943-ece0b9e38b63-console-serving-cert\") pod \"0ea2e803-34d0-429b-b943-ece0b9e38b63\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:46.959937 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-console-config\") pod \"0ea2e803-34d0-429b-b943-ece0b9e38b63\" (UID: \"0ea2e803-34d0-429b-b943-ece0b9e38b63\") " Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:46.960553 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0ea2e803-34d0-429b-b943-ece0b9e38b63" (UID: "0ea2e803-34d0-429b-b943-ece0b9e38b63"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:46.961016 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-console-config" (OuterVolumeSpecName: "console-config") pod "0ea2e803-34d0-429b-b943-ece0b9e38b63" (UID: "0ea2e803-34d0-429b-b943-ece0b9e38b63"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:46.961264 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0ea2e803-34d0-429b-b943-ece0b9e38b63" (UID: "0ea2e803-34d0-429b-b943-ece0b9e38b63"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:46.961309 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-service-ca" (OuterVolumeSpecName: "service-ca") pod "0ea2e803-34d0-429b-b943-ece0b9e38b63" (UID: "0ea2e803-34d0-429b-b943-ece0b9e38b63"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:46.966178 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ea2e803-34d0-429b-b943-ece0b9e38b63-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0ea2e803-34d0-429b-b943-ece0b9e38b63" (UID: "0ea2e803-34d0-429b-b943-ece0b9e38b63"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:46.967013 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ea2e803-34d0-429b-b943-ece0b9e38b63-kube-api-access-gq4w8" (OuterVolumeSpecName: "kube-api-access-gq4w8") pod "0ea2e803-34d0-429b-b943-ece0b9e38b63" (UID: "0ea2e803-34d0-429b-b943-ece0b9e38b63"). InnerVolumeSpecName "kube-api-access-gq4w8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:46.968098 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ea2e803-34d0-429b-b943-ece0b9e38b63-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0ea2e803-34d0-429b-b943-ece0b9e38b63" (UID: "0ea2e803-34d0-429b-b943-ece0b9e38b63"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:47.067226 4898 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0ea2e803-34d0-429b-b943-ece0b9e38b63-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:47.067301 4898 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ea2e803-34d0-429b-b943-ece0b9e38b63-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:47.067331 4898 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-console-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:47.067358 4898 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:47.067394 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gq4w8\" (UniqueName: \"kubernetes.io/projected/0ea2e803-34d0-429b-b943-ece0b9e38b63-kube-api-access-gq4w8\") on node \"crc\" DevicePath \"\"" Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:47.067422 4898 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:47.067448 4898 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0ea2e803-34d0-429b-b943-ece0b9e38b63-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:47.753207 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-7l2pm_0ea2e803-34d0-429b-b943-ece0b9e38b63/console/0.log" Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:47.753254 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-7l2pm" event={"ID":"0ea2e803-34d0-429b-b943-ece0b9e38b63","Type":"ContainerDied","Data":"f80f9d0a69e3b6c8de8df5e105815c2ea6a5c4fed2a8e106511494e31c10c8bf"} Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:47.753288 4898 scope.go:117] "RemoveContainer" containerID="5f0a71c3382b8e97b2f21cf59a246a72cf36bc90c37659a0655800a7772d93ae" Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:47.753330 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-7l2pm" Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:47.788506 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-7l2pm"] Mar 13 14:05:47 crc kubenswrapper[4898]: I0313 14:05:47.792092 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-7l2pm"] Mar 13 14:05:49 crc kubenswrapper[4898]: I0313 14:05:49.089778 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:49 crc kubenswrapper[4898]: I0313 14:05:49.093668 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 14:05:49 crc kubenswrapper[4898]: I0313 14:05:49.748894 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ea2e803-34d0-429b-b943-ece0b9e38b63" path="/var/lib/kubelet/pods/0ea2e803-34d0-429b-b943-ece0b9e38b63/volumes" Mar 13 14:06:00 crc kubenswrapper[4898]: I0313 14:06:00.144954 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556846-d6zpp"] Mar 13 14:06:00 crc kubenswrapper[4898]: E0313 14:06:00.146212 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ea2e803-34d0-429b-b943-ece0b9e38b63" containerName="console" Mar 13 14:06:00 crc kubenswrapper[4898]: I0313 14:06:00.146245 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ea2e803-34d0-429b-b943-ece0b9e38b63" containerName="console" Mar 13 14:06:00 crc kubenswrapper[4898]: I0313 14:06:00.146543 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ea2e803-34d0-429b-b943-ece0b9e38b63" containerName="console" Mar 13 14:06:00 crc kubenswrapper[4898]: I0313 14:06:00.147417 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556846-d6zpp" Mar 13 14:06:00 crc kubenswrapper[4898]: I0313 14:06:00.150135 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:06:00 crc kubenswrapper[4898]: I0313 14:06:00.150553 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:06:00 crc kubenswrapper[4898]: I0313 14:06:00.152458 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:06:00 crc kubenswrapper[4898]: I0313 14:06:00.155705 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556846-d6zpp"] Mar 13 14:06:00 crc kubenswrapper[4898]: I0313 14:06:00.265571 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhppp\" (UniqueName: \"kubernetes.io/projected/666e4c5d-e464-4b8a-b167-bc7624fc3e10-kube-api-access-dhppp\") pod \"auto-csr-approver-29556846-d6zpp\" (UID: \"666e4c5d-e464-4b8a-b167-bc7624fc3e10\") " pod="openshift-infra/auto-csr-approver-29556846-d6zpp" Mar 13 14:06:00 crc kubenswrapper[4898]: I0313 14:06:00.367535 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhppp\" (UniqueName: \"kubernetes.io/projected/666e4c5d-e464-4b8a-b167-bc7624fc3e10-kube-api-access-dhppp\") pod \"auto-csr-approver-29556846-d6zpp\" (UID: \"666e4c5d-e464-4b8a-b167-bc7624fc3e10\") " pod="openshift-infra/auto-csr-approver-29556846-d6zpp" Mar 13 14:06:00 crc kubenswrapper[4898]: I0313 14:06:00.402109 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhppp\" (UniqueName: \"kubernetes.io/projected/666e4c5d-e464-4b8a-b167-bc7624fc3e10-kube-api-access-dhppp\") pod \"auto-csr-approver-29556846-d6zpp\" (UID: \"666e4c5d-e464-4b8a-b167-bc7624fc3e10\") " 
pod="openshift-infra/auto-csr-approver-29556846-d6zpp" Mar 13 14:06:00 crc kubenswrapper[4898]: I0313 14:06:00.485481 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556846-d6zpp" Mar 13 14:06:00 crc kubenswrapper[4898]: I0313 14:06:00.988184 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556846-d6zpp"] Mar 13 14:06:01 crc kubenswrapper[4898]: I0313 14:06:01.869815 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556846-d6zpp" event={"ID":"666e4c5d-e464-4b8a-b167-bc7624fc3e10","Type":"ContainerStarted","Data":"b0dd93a1baf74a5036475b03d3ad28cf7c8996d6700490a4a760588232120ad7"} Mar 13 14:06:02 crc kubenswrapper[4898]: I0313 14:06:02.885257 4898 generic.go:334] "Generic (PLEG): container finished" podID="666e4c5d-e464-4b8a-b167-bc7624fc3e10" containerID="9c70e0bed8678da48508773f6b5163cca47cd975b196edd773fb1f955ef9672b" exitCode=0 Mar 13 14:06:02 crc kubenswrapper[4898]: I0313 14:06:02.885612 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556846-d6zpp" event={"ID":"666e4c5d-e464-4b8a-b167-bc7624fc3e10","Type":"ContainerDied","Data":"9c70e0bed8678da48508773f6b5163cca47cd975b196edd773fb1f955ef9672b"} Mar 13 14:06:04 crc kubenswrapper[4898]: I0313 14:06:04.211881 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556846-d6zpp" Mar 13 14:06:04 crc kubenswrapper[4898]: I0313 14:06:04.347603 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhppp\" (UniqueName: \"kubernetes.io/projected/666e4c5d-e464-4b8a-b167-bc7624fc3e10-kube-api-access-dhppp\") pod \"666e4c5d-e464-4b8a-b167-bc7624fc3e10\" (UID: \"666e4c5d-e464-4b8a-b167-bc7624fc3e10\") " Mar 13 14:06:04 crc kubenswrapper[4898]: I0313 14:06:04.353027 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/666e4c5d-e464-4b8a-b167-bc7624fc3e10-kube-api-access-dhppp" (OuterVolumeSpecName: "kube-api-access-dhppp") pod "666e4c5d-e464-4b8a-b167-bc7624fc3e10" (UID: "666e4c5d-e464-4b8a-b167-bc7624fc3e10"). InnerVolumeSpecName "kube-api-access-dhppp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:06:04 crc kubenswrapper[4898]: I0313 14:06:04.449531 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhppp\" (UniqueName: \"kubernetes.io/projected/666e4c5d-e464-4b8a-b167-bc7624fc3e10-kube-api-access-dhppp\") on node \"crc\" DevicePath \"\"" Mar 13 14:06:04 crc kubenswrapper[4898]: I0313 14:06:04.900092 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556846-d6zpp" event={"ID":"666e4c5d-e464-4b8a-b167-bc7624fc3e10","Type":"ContainerDied","Data":"b0dd93a1baf74a5036475b03d3ad28cf7c8996d6700490a4a760588232120ad7"} Mar 13 14:06:04 crc kubenswrapper[4898]: I0313 14:06:04.900148 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0dd93a1baf74a5036475b03d3ad28cf7c8996d6700490a4a760588232120ad7" Mar 13 14:06:04 crc kubenswrapper[4898]: I0313 14:06:04.900160 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556846-d6zpp" Mar 13 14:06:05 crc kubenswrapper[4898]: I0313 14:06:05.270652 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556840-vmqqn"] Mar 13 14:06:05 crc kubenswrapper[4898]: I0313 14:06:05.274935 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556840-vmqqn"] Mar 13 14:06:05 crc kubenswrapper[4898]: I0313 14:06:05.756185 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce" path="/var/lib/kubelet/pods/4b7eb8ef-6f92-4c29-b6ad-3cf5b6919fce/volumes" Mar 13 14:06:10 crc kubenswrapper[4898]: I0313 14:06:10.040747 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:06:10 crc kubenswrapper[4898]: I0313 14:06:10.089550 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:06:10 crc kubenswrapper[4898]: I0313 14:06:10.984864 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 13 14:06:19 crc kubenswrapper[4898]: I0313 14:06:19.134802 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:06:19 crc kubenswrapper[4898]: I0313 14:06:19.135243 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:06:49 crc 
kubenswrapper[4898]: I0313 14:06:49.134795 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:06:49 crc kubenswrapper[4898]: I0313 14:06:49.135633 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.371212 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-758c8fb5b-pxts9"] Mar 13 14:07:03 crc kubenswrapper[4898]: E0313 14:07:03.371851 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="666e4c5d-e464-4b8a-b167-bc7624fc3e10" containerName="oc" Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.371863 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="666e4c5d-e464-4b8a-b167-bc7624fc3e10" containerName="oc" Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.371991 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="666e4c5d-e464-4b8a-b167-bc7624fc3e10" containerName="oc" Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.372411 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-758c8fb5b-pxts9" Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.380048 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-service-ca\") pod \"console-758c8fb5b-pxts9\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " pod="openshift-console/console-758c8fb5b-pxts9" Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.380381 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-trusted-ca-bundle\") pod \"console-758c8fb5b-pxts9\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " pod="openshift-console/console-758c8fb5b-pxts9" Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.380510 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/571e1a76-1585-4c39-887c-d9c3f735a908-console-serving-cert\") pod \"console-758c8fb5b-pxts9\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " pod="openshift-console/console-758c8fb5b-pxts9" Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.380683 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-oauth-serving-cert\") pod \"console-758c8fb5b-pxts9\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " pod="openshift-console/console-758c8fb5b-pxts9" Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.380837 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twb6p\" (UniqueName: 
\"kubernetes.io/projected/571e1a76-1585-4c39-887c-d9c3f735a908-kube-api-access-twb6p\") pod \"console-758c8fb5b-pxts9\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " pod="openshift-console/console-758c8fb5b-pxts9" Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.380994 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/571e1a76-1585-4c39-887c-d9c3f735a908-console-oauth-config\") pod \"console-758c8fb5b-pxts9\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " pod="openshift-console/console-758c8fb5b-pxts9" Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.381126 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-console-config\") pod \"console-758c8fb5b-pxts9\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " pod="openshift-console/console-758c8fb5b-pxts9" Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.392338 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-758c8fb5b-pxts9"] Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.482540 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-oauth-serving-cert\") pod \"console-758c8fb5b-pxts9\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " pod="openshift-console/console-758c8fb5b-pxts9" Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.482650 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twb6p\" (UniqueName: \"kubernetes.io/projected/571e1a76-1585-4c39-887c-d9c3f735a908-kube-api-access-twb6p\") pod \"console-758c8fb5b-pxts9\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " 
pod="openshift-console/console-758c8fb5b-pxts9" Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.482714 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/571e1a76-1585-4c39-887c-d9c3f735a908-console-oauth-config\") pod \"console-758c8fb5b-pxts9\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " pod="openshift-console/console-758c8fb5b-pxts9" Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.482756 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-console-config\") pod \"console-758c8fb5b-pxts9\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " pod="openshift-console/console-758c8fb5b-pxts9" Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.482792 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-service-ca\") pod \"console-758c8fb5b-pxts9\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " pod="openshift-console/console-758c8fb5b-pxts9" Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.482855 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-trusted-ca-bundle\") pod \"console-758c8fb5b-pxts9\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " pod="openshift-console/console-758c8fb5b-pxts9" Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.482923 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/571e1a76-1585-4c39-887c-d9c3f735a908-console-serving-cert\") pod \"console-758c8fb5b-pxts9\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " pod="openshift-console/console-758c8fb5b-pxts9" 
Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.484380 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-console-config\") pod \"console-758c8fb5b-pxts9\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " pod="openshift-console/console-758c8fb5b-pxts9" Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.484674 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-oauth-serving-cert\") pod \"console-758c8fb5b-pxts9\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " pod="openshift-console/console-758c8fb5b-pxts9" Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.484681 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-trusted-ca-bundle\") pod \"console-758c8fb5b-pxts9\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " pod="openshift-console/console-758c8fb5b-pxts9" Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.497098 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-service-ca\") pod \"console-758c8fb5b-pxts9\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " pod="openshift-console/console-758c8fb5b-pxts9" Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.500449 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/571e1a76-1585-4c39-887c-d9c3f735a908-console-serving-cert\") pod \"console-758c8fb5b-pxts9\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " pod="openshift-console/console-758c8fb5b-pxts9" Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.501330 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/571e1a76-1585-4c39-887c-d9c3f735a908-console-oauth-config\") pod \"console-758c8fb5b-pxts9\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " pod="openshift-console/console-758c8fb5b-pxts9" Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.525017 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twb6p\" (UniqueName: \"kubernetes.io/projected/571e1a76-1585-4c39-887c-d9c3f735a908-kube-api-access-twb6p\") pod \"console-758c8fb5b-pxts9\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " pod="openshift-console/console-758c8fb5b-pxts9" Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.728870 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-758c8fb5b-pxts9" Mar 13 14:07:03 crc kubenswrapper[4898]: I0313 14:07:03.995604 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-758c8fb5b-pxts9"] Mar 13 14:07:04 crc kubenswrapper[4898]: I0313 14:07:04.328505 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-758c8fb5b-pxts9" event={"ID":"571e1a76-1585-4c39-887c-d9c3f735a908","Type":"ContainerStarted","Data":"4cfca99c86a53c5141c727b6fd37b0d688489277e5d5aa3a145d30faadc4d08d"} Mar 13 14:07:04 crc kubenswrapper[4898]: I0313 14:07:04.328599 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-758c8fb5b-pxts9" event={"ID":"571e1a76-1585-4c39-887c-d9c3f735a908","Type":"ContainerStarted","Data":"4b09f73c7fa831fe94f3a344d5bf8593ff107c618a4ee0a2a0be061afa612208"} Mar 13 14:07:04 crc kubenswrapper[4898]: I0313 14:07:04.349647 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-758c8fb5b-pxts9" podStartSLOduration=1.349621098 podStartE2EDuration="1.349621098s" podCreationTimestamp="2026-03-13 14:07:03 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:07:04.346856936 +0000 UTC m=+659.348445205" watchObservedRunningTime="2026-03-13 14:07:04.349621098 +0000 UTC m=+659.351209367" Mar 13 14:07:13 crc kubenswrapper[4898]: I0313 14:07:13.730009 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-758c8fb5b-pxts9" Mar 13 14:07:13 crc kubenswrapper[4898]: I0313 14:07:13.730690 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-758c8fb5b-pxts9" Mar 13 14:07:13 crc kubenswrapper[4898]: I0313 14:07:13.738067 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-758c8fb5b-pxts9" Mar 13 14:07:14 crc kubenswrapper[4898]: I0313 14:07:14.415765 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-758c8fb5b-pxts9" Mar 13 14:07:14 crc kubenswrapper[4898]: I0313 14:07:14.497507 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-875645f9-l5trk"] Mar 13 14:07:19 crc kubenswrapper[4898]: I0313 14:07:19.134531 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:07:19 crc kubenswrapper[4898]: I0313 14:07:19.134874 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:07:19 crc kubenswrapper[4898]: I0313 14:07:19.134959 4898 
kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 14:07:19 crc kubenswrapper[4898]: I0313 14:07:19.135467 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"87afe240e3b86dba51997a01c599db519fabde9560e41dee3b537bab350f3092"} pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 14:07:19 crc kubenswrapper[4898]: I0313 14:07:19.135532 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" containerID="cri-o://87afe240e3b86dba51997a01c599db519fabde9560e41dee3b537bab350f3092" gracePeriod=600 Mar 13 14:07:19 crc kubenswrapper[4898]: I0313 14:07:19.450498 4898 generic.go:334] "Generic (PLEG): container finished" podID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerID="87afe240e3b86dba51997a01c599db519fabde9560e41dee3b537bab350f3092" exitCode=0 Mar 13 14:07:19 crc kubenswrapper[4898]: I0313 14:07:19.450567 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerDied","Data":"87afe240e3b86dba51997a01c599db519fabde9560e41dee3b537bab350f3092"} Mar 13 14:07:19 crc kubenswrapper[4898]: I0313 14:07:19.450973 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"5a348cbe99f8e01e53545f65e722853afafc6c3cafe54ec4136fd0f288299e87"} Mar 13 14:07:19 crc kubenswrapper[4898]: I0313 14:07:19.451007 4898 scope.go:117] "RemoveContainer" 
containerID="ef8034867c7dd4fe3e16f610be3edcf45ba0ba5b7440cc5634ef7ce86e520b52" Mar 13 14:07:32 crc kubenswrapper[4898]: I0313 14:07:32.123208 4898 scope.go:117] "RemoveContainer" containerID="f3acfddb5fa32ce7ed2202cdd792a2b1d7de4b1d204fbdc39e6814928f1b0f60" Mar 13 14:07:32 crc kubenswrapper[4898]: I0313 14:07:32.180361 4898 scope.go:117] "RemoveContainer" containerID="ce4b9269a12a5818cb6b78f9abcf90162aaab004f9bc6b1371639c51781f053a" Mar 13 14:07:39 crc kubenswrapper[4898]: I0313 14:07:39.545363 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-875645f9-l5trk" podUID="441598c2-1b20-4109-8e38-46d414df93d7" containerName="console" containerID="cri-o://5b6df02b2bb76e83e9982096d58a04d7e81453e98562dcd5520b3ecccb032575" gracePeriod=15 Mar 13 14:07:39 crc kubenswrapper[4898]: I0313 14:07:39.869741 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-875645f9-l5trk_441598c2-1b20-4109-8e38-46d414df93d7/console/0.log" Mar 13 14:07:39 crc kubenswrapper[4898]: I0313 14:07:39.869823 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-875645f9-l5trk" Mar 13 14:07:39 crc kubenswrapper[4898]: I0313 14:07:39.939239 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fgxp\" (UniqueName: \"kubernetes.io/projected/441598c2-1b20-4109-8e38-46d414df93d7-kube-api-access-4fgxp\") pod \"441598c2-1b20-4109-8e38-46d414df93d7\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " Mar 13 14:07:39 crc kubenswrapper[4898]: I0313 14:07:39.939399 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-trusted-ca-bundle\") pod \"441598c2-1b20-4109-8e38-46d414df93d7\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " Mar 13 14:07:39 crc kubenswrapper[4898]: I0313 14:07:39.939443 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-service-ca\") pod \"441598c2-1b20-4109-8e38-46d414df93d7\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " Mar 13 14:07:39 crc kubenswrapper[4898]: I0313 14:07:39.939473 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-console-config\") pod \"441598c2-1b20-4109-8e38-46d414df93d7\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " Mar 13 14:07:39 crc kubenswrapper[4898]: I0313 14:07:39.939507 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/441598c2-1b20-4109-8e38-46d414df93d7-console-serving-cert\") pod \"441598c2-1b20-4109-8e38-46d414df93d7\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " Mar 13 14:07:39 crc kubenswrapper[4898]: I0313 14:07:39.939548 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-oauth-serving-cert\") pod \"441598c2-1b20-4109-8e38-46d414df93d7\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " Mar 13 14:07:39 crc kubenswrapper[4898]: I0313 14:07:39.939572 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/441598c2-1b20-4109-8e38-46d414df93d7-console-oauth-config\") pod \"441598c2-1b20-4109-8e38-46d414df93d7\" (UID: \"441598c2-1b20-4109-8e38-46d414df93d7\") " Mar 13 14:07:39 crc kubenswrapper[4898]: I0313 14:07:39.940189 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-service-ca" (OuterVolumeSpecName: "service-ca") pod "441598c2-1b20-4109-8e38-46d414df93d7" (UID: "441598c2-1b20-4109-8e38-46d414df93d7"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:07:39 crc kubenswrapper[4898]: I0313 14:07:39.940204 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-console-config" (OuterVolumeSpecName: "console-config") pod "441598c2-1b20-4109-8e38-46d414df93d7" (UID: "441598c2-1b20-4109-8e38-46d414df93d7"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:07:39 crc kubenswrapper[4898]: I0313 14:07:39.940274 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "441598c2-1b20-4109-8e38-46d414df93d7" (UID: "441598c2-1b20-4109-8e38-46d414df93d7"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:07:39 crc kubenswrapper[4898]: I0313 14:07:39.940603 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "441598c2-1b20-4109-8e38-46d414df93d7" (UID: "441598c2-1b20-4109-8e38-46d414df93d7"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:07:39 crc kubenswrapper[4898]: I0313 14:07:39.945673 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/441598c2-1b20-4109-8e38-46d414df93d7-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "441598c2-1b20-4109-8e38-46d414df93d7" (UID: "441598c2-1b20-4109-8e38-46d414df93d7"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:07:39 crc kubenswrapper[4898]: I0313 14:07:39.946174 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/441598c2-1b20-4109-8e38-46d414df93d7-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "441598c2-1b20-4109-8e38-46d414df93d7" (UID: "441598c2-1b20-4109-8e38-46d414df93d7"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:07:39 crc kubenswrapper[4898]: I0313 14:07:39.947102 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/441598c2-1b20-4109-8e38-46d414df93d7-kube-api-access-4fgxp" (OuterVolumeSpecName: "kube-api-access-4fgxp") pod "441598c2-1b20-4109-8e38-46d414df93d7" (UID: "441598c2-1b20-4109-8e38-46d414df93d7"). InnerVolumeSpecName "kube-api-access-4fgxp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:07:40 crc kubenswrapper[4898]: I0313 14:07:40.041403 4898 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 14:07:40 crc kubenswrapper[4898]: I0313 14:07:40.041802 4898 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/441598c2-1b20-4109-8e38-46d414df93d7-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:07:40 crc kubenswrapper[4898]: I0313 14:07:40.041820 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fgxp\" (UniqueName: \"kubernetes.io/projected/441598c2-1b20-4109-8e38-46d414df93d7-kube-api-access-4fgxp\") on node \"crc\" DevicePath \"\"" Mar 13 14:07:40 crc kubenswrapper[4898]: I0313 14:07:40.041838 4898 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:07:40 crc kubenswrapper[4898]: I0313 14:07:40.041855 4898 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 14:07:40 crc kubenswrapper[4898]: I0313 14:07:40.041871 4898 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/441598c2-1b20-4109-8e38-46d414df93d7-console-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:07:40 crc kubenswrapper[4898]: I0313 14:07:40.041886 4898 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/441598c2-1b20-4109-8e38-46d414df93d7-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 14:07:40 crc 
kubenswrapper[4898]: I0313 14:07:40.594479 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-875645f9-l5trk_441598c2-1b20-4109-8e38-46d414df93d7/console/0.log" Mar 13 14:07:40 crc kubenswrapper[4898]: I0313 14:07:40.594534 4898 generic.go:334] "Generic (PLEG): container finished" podID="441598c2-1b20-4109-8e38-46d414df93d7" containerID="5b6df02b2bb76e83e9982096d58a04d7e81453e98562dcd5520b3ecccb032575" exitCode=2 Mar 13 14:07:40 crc kubenswrapper[4898]: I0313 14:07:40.594562 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-875645f9-l5trk" event={"ID":"441598c2-1b20-4109-8e38-46d414df93d7","Type":"ContainerDied","Data":"5b6df02b2bb76e83e9982096d58a04d7e81453e98562dcd5520b3ecccb032575"} Mar 13 14:07:40 crc kubenswrapper[4898]: I0313 14:07:40.594588 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-875645f9-l5trk" event={"ID":"441598c2-1b20-4109-8e38-46d414df93d7","Type":"ContainerDied","Data":"a20af1488f994221a262c4ea1f9370dd06f8b682113715c773da210306924f29"} Mar 13 14:07:40 crc kubenswrapper[4898]: I0313 14:07:40.594605 4898 scope.go:117] "RemoveContainer" containerID="5b6df02b2bb76e83e9982096d58a04d7e81453e98562dcd5520b3ecccb032575" Mar 13 14:07:40 crc kubenswrapper[4898]: I0313 14:07:40.594684 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-875645f9-l5trk" Mar 13 14:07:40 crc kubenswrapper[4898]: I0313 14:07:40.619019 4898 scope.go:117] "RemoveContainer" containerID="5b6df02b2bb76e83e9982096d58a04d7e81453e98562dcd5520b3ecccb032575" Mar 13 14:07:40 crc kubenswrapper[4898]: E0313 14:07:40.619523 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b6df02b2bb76e83e9982096d58a04d7e81453e98562dcd5520b3ecccb032575\": container with ID starting with 5b6df02b2bb76e83e9982096d58a04d7e81453e98562dcd5520b3ecccb032575 not found: ID does not exist" containerID="5b6df02b2bb76e83e9982096d58a04d7e81453e98562dcd5520b3ecccb032575" Mar 13 14:07:40 crc kubenswrapper[4898]: I0313 14:07:40.619563 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b6df02b2bb76e83e9982096d58a04d7e81453e98562dcd5520b3ecccb032575"} err="failed to get container status \"5b6df02b2bb76e83e9982096d58a04d7e81453e98562dcd5520b3ecccb032575\": rpc error: code = NotFound desc = could not find container \"5b6df02b2bb76e83e9982096d58a04d7e81453e98562dcd5520b3ecccb032575\": container with ID starting with 5b6df02b2bb76e83e9982096d58a04d7e81453e98562dcd5520b3ecccb032575 not found: ID does not exist" Mar 13 14:07:40 crc kubenswrapper[4898]: I0313 14:07:40.634801 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-875645f9-l5trk"] Mar 13 14:07:40 crc kubenswrapper[4898]: I0313 14:07:40.647017 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-875645f9-l5trk"] Mar 13 14:07:41 crc kubenswrapper[4898]: I0313 14:07:41.751271 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="441598c2-1b20-4109-8e38-46d414df93d7" path="/var/lib/kubelet/pods/441598c2-1b20-4109-8e38-46d414df93d7/volumes" Mar 13 14:08:00 crc kubenswrapper[4898]: I0313 14:08:00.146372 4898 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-infra/auto-csr-approver-29556848-wlplx"] Mar 13 14:08:00 crc kubenswrapper[4898]: E0313 14:08:00.148552 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="441598c2-1b20-4109-8e38-46d414df93d7" containerName="console" Mar 13 14:08:00 crc kubenswrapper[4898]: I0313 14:08:00.148605 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="441598c2-1b20-4109-8e38-46d414df93d7" containerName="console" Mar 13 14:08:00 crc kubenswrapper[4898]: I0313 14:08:00.148890 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="441598c2-1b20-4109-8e38-46d414df93d7" containerName="console" Mar 13 14:08:00 crc kubenswrapper[4898]: I0313 14:08:00.149595 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556848-wlplx" Mar 13 14:08:00 crc kubenswrapper[4898]: I0313 14:08:00.156471 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:08:00 crc kubenswrapper[4898]: I0313 14:08:00.157658 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:08:00 crc kubenswrapper[4898]: I0313 14:08:00.157815 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:08:00 crc kubenswrapper[4898]: I0313 14:08:00.158774 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556848-wlplx"] Mar 13 14:08:00 crc kubenswrapper[4898]: I0313 14:08:00.267503 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcz6d\" (UniqueName: \"kubernetes.io/projected/fe4a848e-c06e-4205-a1a6-8b14b620096c-kube-api-access-kcz6d\") pod \"auto-csr-approver-29556848-wlplx\" (UID: \"fe4a848e-c06e-4205-a1a6-8b14b620096c\") " pod="openshift-infra/auto-csr-approver-29556848-wlplx" Mar 13 14:08:00 
crc kubenswrapper[4898]: I0313 14:08:00.369199 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcz6d\" (UniqueName: \"kubernetes.io/projected/fe4a848e-c06e-4205-a1a6-8b14b620096c-kube-api-access-kcz6d\") pod \"auto-csr-approver-29556848-wlplx\" (UID: \"fe4a848e-c06e-4205-a1a6-8b14b620096c\") " pod="openshift-infra/auto-csr-approver-29556848-wlplx" Mar 13 14:08:00 crc kubenswrapper[4898]: I0313 14:08:00.395007 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcz6d\" (UniqueName: \"kubernetes.io/projected/fe4a848e-c06e-4205-a1a6-8b14b620096c-kube-api-access-kcz6d\") pod \"auto-csr-approver-29556848-wlplx\" (UID: \"fe4a848e-c06e-4205-a1a6-8b14b620096c\") " pod="openshift-infra/auto-csr-approver-29556848-wlplx" Mar 13 14:08:00 crc kubenswrapper[4898]: I0313 14:08:00.470208 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556848-wlplx" Mar 13 14:08:00 crc kubenswrapper[4898]: I0313 14:08:00.744975 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556848-wlplx"] Mar 13 14:08:01 crc kubenswrapper[4898]: I0313 14:08:01.760101 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556848-wlplx" event={"ID":"fe4a848e-c06e-4205-a1a6-8b14b620096c","Type":"ContainerStarted","Data":"de2f8c2990bed5b76782f89bf8e53ea37a9fd9e636b3c223ad6b21a6bc7324ef"} Mar 13 14:08:02 crc kubenswrapper[4898]: I0313 14:08:02.775705 4898 generic.go:334] "Generic (PLEG): container finished" podID="fe4a848e-c06e-4205-a1a6-8b14b620096c" containerID="e5c3875fd4b0ad4fd5d4afba4c88238837f0d8b510bd53eb8f51d4cd510b00e3" exitCode=0 Mar 13 14:08:02 crc kubenswrapper[4898]: I0313 14:08:02.776012 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556848-wlplx" 
event={"ID":"fe4a848e-c06e-4205-a1a6-8b14b620096c","Type":"ContainerDied","Data":"e5c3875fd4b0ad4fd5d4afba4c88238837f0d8b510bd53eb8f51d4cd510b00e3"} Mar 13 14:08:04 crc kubenswrapper[4898]: I0313 14:08:04.029666 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556848-wlplx" Mar 13 14:08:04 crc kubenswrapper[4898]: I0313 14:08:04.225689 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcz6d\" (UniqueName: \"kubernetes.io/projected/fe4a848e-c06e-4205-a1a6-8b14b620096c-kube-api-access-kcz6d\") pod \"fe4a848e-c06e-4205-a1a6-8b14b620096c\" (UID: \"fe4a848e-c06e-4205-a1a6-8b14b620096c\") " Mar 13 14:08:04 crc kubenswrapper[4898]: I0313 14:08:04.235319 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe4a848e-c06e-4205-a1a6-8b14b620096c-kube-api-access-kcz6d" (OuterVolumeSpecName: "kube-api-access-kcz6d") pod "fe4a848e-c06e-4205-a1a6-8b14b620096c" (UID: "fe4a848e-c06e-4205-a1a6-8b14b620096c"). InnerVolumeSpecName "kube-api-access-kcz6d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:08:04 crc kubenswrapper[4898]: I0313 14:08:04.328633 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcz6d\" (UniqueName: \"kubernetes.io/projected/fe4a848e-c06e-4205-a1a6-8b14b620096c-kube-api-access-kcz6d\") on node \"crc\" DevicePath \"\"" Mar 13 14:08:04 crc kubenswrapper[4898]: I0313 14:08:04.795434 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556848-wlplx" event={"ID":"fe4a848e-c06e-4205-a1a6-8b14b620096c","Type":"ContainerDied","Data":"de2f8c2990bed5b76782f89bf8e53ea37a9fd9e636b3c223ad6b21a6bc7324ef"} Mar 13 14:08:04 crc kubenswrapper[4898]: I0313 14:08:04.795494 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de2f8c2990bed5b76782f89bf8e53ea37a9fd9e636b3c223ad6b21a6bc7324ef" Mar 13 14:08:04 crc kubenswrapper[4898]: I0313 14:08:04.795518 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556848-wlplx" Mar 13 14:08:05 crc kubenswrapper[4898]: I0313 14:08:05.117818 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556842-7h9s5"] Mar 13 14:08:05 crc kubenswrapper[4898]: I0313 14:08:05.125196 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556842-7h9s5"] Mar 13 14:08:05 crc kubenswrapper[4898]: I0313 14:08:05.751123 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a9b9a59-64ad-4602-88da-91583ec126dc" path="/var/lib/kubelet/pods/8a9b9a59-64ad-4602-88da-91583ec126dc/volumes" Mar 13 14:09:19 crc kubenswrapper[4898]: I0313 14:09:19.133872 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 13 14:09:19 crc kubenswrapper[4898]: I0313 14:09:19.134407 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:09:20 crc kubenswrapper[4898]: I0313 14:09:20.819224 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4"] Mar 13 14:09:20 crc kubenswrapper[4898]: E0313 14:09:20.819988 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe4a848e-c06e-4205-a1a6-8b14b620096c" containerName="oc" Mar 13 14:09:20 crc kubenswrapper[4898]: I0313 14:09:20.820006 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe4a848e-c06e-4205-a1a6-8b14b620096c" containerName="oc" Mar 13 14:09:20 crc kubenswrapper[4898]: I0313 14:09:20.820139 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe4a848e-c06e-4205-a1a6-8b14b620096c" containerName="oc" Mar 13 14:09:20 crc kubenswrapper[4898]: I0313 14:09:20.821181 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4" Mar 13 14:09:20 crc kubenswrapper[4898]: I0313 14:09:20.823484 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 13 14:09:20 crc kubenswrapper[4898]: I0313 14:09:20.838583 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4"] Mar 13 14:09:20 crc kubenswrapper[4898]: I0313 14:09:20.882030 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2dcr\" (UniqueName: \"kubernetes.io/projected/dd46f989-e694-47a9-9b46-e96b7b47e403-kube-api-access-c2dcr\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4\" (UID: \"dd46f989-e694-47a9-9b46-e96b7b47e403\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4" Mar 13 14:09:20 crc kubenswrapper[4898]: I0313 14:09:20.882110 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd46f989-e694-47a9-9b46-e96b7b47e403-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4\" (UID: \"dd46f989-e694-47a9-9b46-e96b7b47e403\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4" Mar 13 14:09:20 crc kubenswrapper[4898]: I0313 14:09:20.882138 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd46f989-e694-47a9-9b46-e96b7b47e403-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4\" (UID: \"dd46f989-e694-47a9-9b46-e96b7b47e403\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4" Mar 13 14:09:20 crc kubenswrapper[4898]: 
I0313 14:09:20.983654 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd46f989-e694-47a9-9b46-e96b7b47e403-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4\" (UID: \"dd46f989-e694-47a9-9b46-e96b7b47e403\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4" Mar 13 14:09:20 crc kubenswrapper[4898]: I0313 14:09:20.983715 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd46f989-e694-47a9-9b46-e96b7b47e403-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4\" (UID: \"dd46f989-e694-47a9-9b46-e96b7b47e403\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4" Mar 13 14:09:20 crc kubenswrapper[4898]: I0313 14:09:20.983848 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2dcr\" (UniqueName: \"kubernetes.io/projected/dd46f989-e694-47a9-9b46-e96b7b47e403-kube-api-access-c2dcr\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4\" (UID: \"dd46f989-e694-47a9-9b46-e96b7b47e403\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4" Mar 13 14:09:20 crc kubenswrapper[4898]: I0313 14:09:20.984407 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd46f989-e694-47a9-9b46-e96b7b47e403-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4\" (UID: \"dd46f989-e694-47a9-9b46-e96b7b47e403\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4" Mar 13 14:09:20 crc kubenswrapper[4898]: I0313 14:09:20.985927 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/dd46f989-e694-47a9-9b46-e96b7b47e403-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4\" (UID: \"dd46f989-e694-47a9-9b46-e96b7b47e403\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4" Mar 13 14:09:21 crc kubenswrapper[4898]: I0313 14:09:21.005406 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2dcr\" (UniqueName: \"kubernetes.io/projected/dd46f989-e694-47a9-9b46-e96b7b47e403-kube-api-access-c2dcr\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4\" (UID: \"dd46f989-e694-47a9-9b46-e96b7b47e403\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4" Mar 13 14:09:21 crc kubenswrapper[4898]: I0313 14:09:21.146269 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4" Mar 13 14:09:21 crc kubenswrapper[4898]: I0313 14:09:21.397306 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4"] Mar 13 14:09:22 crc kubenswrapper[4898]: I0313 14:09:22.347335 4898 generic.go:334] "Generic (PLEG): container finished" podID="dd46f989-e694-47a9-9b46-e96b7b47e403" containerID="6c588cd1db4036ae62f40d138136f3753bd0c24b2ce756684df9f722bb5a24c3" exitCode=0 Mar 13 14:09:22 crc kubenswrapper[4898]: I0313 14:09:22.347393 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4" event={"ID":"dd46f989-e694-47a9-9b46-e96b7b47e403","Type":"ContainerDied","Data":"6c588cd1db4036ae62f40d138136f3753bd0c24b2ce756684df9f722bb5a24c3"} Mar 13 14:09:22 crc kubenswrapper[4898]: I0313 14:09:22.347453 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4" event={"ID":"dd46f989-e694-47a9-9b46-e96b7b47e403","Type":"ContainerStarted","Data":"258e535120369cca7ef1d8dfbf0d7de2fe131e5eaa0fa710b99b16cb66c77d8d"} Mar 13 14:09:23 crc kubenswrapper[4898]: I0313 14:09:23.354521 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4" event={"ID":"dd46f989-e694-47a9-9b46-e96b7b47e403","Type":"ContainerStarted","Data":"658051b6bb84e6ddac50ed2fb6e62204dec9baecebe51934fc668d4dd52a472d"} Mar 13 14:09:24 crc kubenswrapper[4898]: I0313 14:09:24.363633 4898 generic.go:334] "Generic (PLEG): container finished" podID="dd46f989-e694-47a9-9b46-e96b7b47e403" containerID="658051b6bb84e6ddac50ed2fb6e62204dec9baecebe51934fc668d4dd52a472d" exitCode=0 Mar 13 14:09:24 crc kubenswrapper[4898]: I0313 14:09:24.363992 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4" event={"ID":"dd46f989-e694-47a9-9b46-e96b7b47e403","Type":"ContainerDied","Data":"658051b6bb84e6ddac50ed2fb6e62204dec9baecebe51934fc668d4dd52a472d"} Mar 13 14:09:25 crc kubenswrapper[4898]: I0313 14:09:25.383992 4898 generic.go:334] "Generic (PLEG): container finished" podID="dd46f989-e694-47a9-9b46-e96b7b47e403" containerID="b8f285efaaf0b4a74b49a55cc2187ddf19137e8651515ab85ed1254b531d18b9" exitCode=0 Mar 13 14:09:25 crc kubenswrapper[4898]: I0313 14:09:25.384047 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4" event={"ID":"dd46f989-e694-47a9-9b46-e96b7b47e403","Type":"ContainerDied","Data":"b8f285efaaf0b4a74b49a55cc2187ddf19137e8651515ab85ed1254b531d18b9"} Mar 13 14:09:26 crc kubenswrapper[4898]: I0313 14:09:26.671747 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4" Mar 13 14:09:26 crc kubenswrapper[4898]: I0313 14:09:26.811400 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd46f989-e694-47a9-9b46-e96b7b47e403-util\") pod \"dd46f989-e694-47a9-9b46-e96b7b47e403\" (UID: \"dd46f989-e694-47a9-9b46-e96b7b47e403\") " Mar 13 14:09:26 crc kubenswrapper[4898]: I0313 14:09:26.811674 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd46f989-e694-47a9-9b46-e96b7b47e403-bundle\") pod \"dd46f989-e694-47a9-9b46-e96b7b47e403\" (UID: \"dd46f989-e694-47a9-9b46-e96b7b47e403\") " Mar 13 14:09:26 crc kubenswrapper[4898]: I0313 14:09:26.811720 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2dcr\" (UniqueName: \"kubernetes.io/projected/dd46f989-e694-47a9-9b46-e96b7b47e403-kube-api-access-c2dcr\") pod \"dd46f989-e694-47a9-9b46-e96b7b47e403\" (UID: \"dd46f989-e694-47a9-9b46-e96b7b47e403\") " Mar 13 14:09:26 crc kubenswrapper[4898]: I0313 14:09:26.815738 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd46f989-e694-47a9-9b46-e96b7b47e403-bundle" (OuterVolumeSpecName: "bundle") pod "dd46f989-e694-47a9-9b46-e96b7b47e403" (UID: "dd46f989-e694-47a9-9b46-e96b7b47e403"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:09:26 crc kubenswrapper[4898]: I0313 14:09:26.820998 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd46f989-e694-47a9-9b46-e96b7b47e403-kube-api-access-c2dcr" (OuterVolumeSpecName: "kube-api-access-c2dcr") pod "dd46f989-e694-47a9-9b46-e96b7b47e403" (UID: "dd46f989-e694-47a9-9b46-e96b7b47e403"). InnerVolumeSpecName "kube-api-access-c2dcr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:09:26 crc kubenswrapper[4898]: I0313 14:09:26.847082 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd46f989-e694-47a9-9b46-e96b7b47e403-util" (OuterVolumeSpecName: "util") pod "dd46f989-e694-47a9-9b46-e96b7b47e403" (UID: "dd46f989-e694-47a9-9b46-e96b7b47e403"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:09:26 crc kubenswrapper[4898]: I0313 14:09:26.913007 4898 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd46f989-e694-47a9-9b46-e96b7b47e403-util\") on node \"crc\" DevicePath \"\"" Mar 13 14:09:26 crc kubenswrapper[4898]: I0313 14:09:26.913040 4898 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd46f989-e694-47a9-9b46-e96b7b47e403-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:09:26 crc kubenswrapper[4898]: I0313 14:09:26.913157 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2dcr\" (UniqueName: \"kubernetes.io/projected/dd46f989-e694-47a9-9b46-e96b7b47e403-kube-api-access-c2dcr\") on node \"crc\" DevicePath \"\"" Mar 13 14:09:27 crc kubenswrapper[4898]: I0313 14:09:27.402865 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4" event={"ID":"dd46f989-e694-47a9-9b46-e96b7b47e403","Type":"ContainerDied","Data":"258e535120369cca7ef1d8dfbf0d7de2fe131e5eaa0fa710b99b16cb66c77d8d"} Mar 13 14:09:27 crc kubenswrapper[4898]: I0313 14:09:27.403229 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="258e535120369cca7ef1d8dfbf0d7de2fe131e5eaa0fa710b99b16cb66c77d8d" Mar 13 14:09:27 crc kubenswrapper[4898]: I0313 14:09:27.402939 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4" Mar 13 14:09:31 crc kubenswrapper[4898]: I0313 14:09:31.890408 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qqqs5"] Mar 13 14:09:31 crc kubenswrapper[4898]: I0313 14:09:31.892094 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovn-controller" containerID="cri-o://14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453" gracePeriod=30 Mar 13 14:09:31 crc kubenswrapper[4898]: I0313 14:09:31.892189 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="nbdb" containerID="cri-o://d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2" gracePeriod=30 Mar 13 14:09:31 crc kubenswrapper[4898]: I0313 14:09:31.892308 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="sbdb" containerID="cri-o://86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0" gracePeriod=30 Mar 13 14:09:31 crc kubenswrapper[4898]: I0313 14:09:31.892348 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="northd" containerID="cri-o://d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23" gracePeriod=30 Mar 13 14:09:31 crc kubenswrapper[4898]: I0313 14:09:31.892315 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" 
containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61" gracePeriod=30 Mar 13 14:09:31 crc kubenswrapper[4898]: I0313 14:09:31.892429 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovn-acl-logging" containerID="cri-o://7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345" gracePeriod=30 Mar 13 14:09:31 crc kubenswrapper[4898]: I0313 14:09:31.892461 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="kube-rbac-proxy-node" containerID="cri-o://0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb" gracePeriod=30 Mar 13 14:09:31 crc kubenswrapper[4898]: I0313 14:09:31.961440 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovnkube-controller" containerID="cri-o://16ce106ae8a28f129efde86037be3a1f3a7bbf53e4df0b306b61a11eff910aed" gracePeriod=30 Mar 13 14:09:32 crc kubenswrapper[4898]: I0313 14:09:32.272540 4898 scope.go:117] "RemoveContainer" containerID="529194ce4ac0e19d00515e6fc6f6984803e8a03afdee3263ba4e434f1a13a57b" Mar 13 14:09:32 crc kubenswrapper[4898]: I0313 14:09:32.298621 4898 scope.go:117] "RemoveContainer" containerID="5ca8f8a8a536aca56f73dd6928361e5dd5f98f66d3bc35762461d5d87c0c3022" Mar 13 14:09:32 crc kubenswrapper[4898]: I0313 14:09:32.322962 4898 scope.go:117] "RemoveContainer" containerID="04f48cdfeeb82223cb0cab3fb50d3338225f39b1d78eadc3c18a46350ae28770" Mar 13 14:09:32 crc kubenswrapper[4898]: I0313 14:09:32.439744 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qqqs5_e7d6afc0-d9b5-41b2-a55f-57621c300cbb/ovn-acl-logging/0.log" Mar 13 14:09:32 crc kubenswrapper[4898]: I0313 14:09:32.440658 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qqqs5_e7d6afc0-d9b5-41b2-a55f-57621c300cbb/ovn-controller/0.log" Mar 13 14:09:32 crc kubenswrapper[4898]: I0313 14:09:32.441366 4898 generic.go:334] "Generic (PLEG): container finished" podID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerID="16ce106ae8a28f129efde86037be3a1f3a7bbf53e4df0b306b61a11eff910aed" exitCode=0 Mar 13 14:09:32 crc kubenswrapper[4898]: I0313 14:09:32.441570 4898 generic.go:334] "Generic (PLEG): container finished" podID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerID="86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0" exitCode=0 Mar 13 14:09:32 crc kubenswrapper[4898]: I0313 14:09:32.441731 4898 generic.go:334] "Generic (PLEG): container finished" podID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerID="d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2" exitCode=0 Mar 13 14:09:32 crc kubenswrapper[4898]: I0313 14:09:32.441937 4898 generic.go:334] "Generic (PLEG): container finished" podID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerID="d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23" exitCode=0 Mar 13 14:09:32 crc kubenswrapper[4898]: I0313 14:09:32.442102 4898 generic.go:334] "Generic (PLEG): container finished" podID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerID="7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345" exitCode=143 Mar 13 14:09:32 crc kubenswrapper[4898]: I0313 14:09:32.442255 4898 generic.go:334] "Generic (PLEG): container finished" podID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerID="14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453" exitCode=143 Mar 13 14:09:32 crc kubenswrapper[4898]: I0313 14:09:32.442515 4898 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerDied","Data":"16ce106ae8a28f129efde86037be3a1f3a7bbf53e4df0b306b61a11eff910aed"} Mar 13 14:09:32 crc kubenswrapper[4898]: I0313 14:09:32.442749 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerDied","Data":"86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0"} Mar 13 14:09:32 crc kubenswrapper[4898]: I0313 14:09:32.442959 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerDied","Data":"d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2"} Mar 13 14:09:32 crc kubenswrapper[4898]: I0313 14:09:32.443126 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerDied","Data":"d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23"} Mar 13 14:09:32 crc kubenswrapper[4898]: I0313 14:09:32.443266 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerDied","Data":"7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345"} Mar 13 14:09:32 crc kubenswrapper[4898]: I0313 14:09:32.443529 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerDied","Data":"14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453"} Mar 13 14:09:32 crc kubenswrapper[4898]: I0313 14:09:32.445952 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-6llfs_e521c857-9711-4f68-886f-38b233d7b05b/kube-multus/2.log" Mar 13 14:09:32 crc kubenswrapper[4898]: I0313 14:09:32.446132 4898 generic.go:334] "Generic (PLEG): container finished" podID="e521c857-9711-4f68-886f-38b233d7b05b" containerID="725f30c48676665ebc628a8b35e81161dc13d717e27cad14806022f5ad267e0e" exitCode=2 Mar 13 14:09:32 crc kubenswrapper[4898]: I0313 14:09:32.446285 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6llfs" event={"ID":"e521c857-9711-4f68-886f-38b233d7b05b","Type":"ContainerDied","Data":"725f30c48676665ebc628a8b35e81161dc13d717e27cad14806022f5ad267e0e"} Mar 13 14:09:32 crc kubenswrapper[4898]: I0313 14:09:32.447189 4898 scope.go:117] "RemoveContainer" containerID="725f30c48676665ebc628a8b35e81161dc13d717e27cad14806022f5ad267e0e" Mar 13 14:09:32 crc kubenswrapper[4898]: E0313 14:09:32.447689 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-6llfs_openshift-multus(e521c857-9711-4f68-886f-38b233d7b05b)\"" pod="openshift-multus/multus-6llfs" podUID="e521c857-9711-4f68-886f-38b233d7b05b" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.188367 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qqqs5_e7d6afc0-d9b5-41b2-a55f-57621c300cbb/ovn-acl-logging/0.log" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.188781 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qqqs5_e7d6afc0-d9b5-41b2-a55f-57621c300cbb/ovn-controller/0.log" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.189199 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.207818 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-run-ovn-kubernetes\") pod \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.208181 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-ovnkube-config\") pod \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.208279 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-run-systemd\") pod \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.208359 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-run-netns\") pod \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.208439 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-cni-netd\") pod \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.208523 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-ovnkube-script-lib\") pod \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.208613 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-node-log\") pod \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.208716 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.207969 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "e7d6afc0-d9b5-41b2-a55f-57621c300cbb" (UID: "e7d6afc0-d9b5-41b2-a55f-57621c300cbb"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.208651 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "e7d6afc0-d9b5-41b2-a55f-57621c300cbb" (UID: "e7d6afc0-d9b5-41b2-a55f-57621c300cbb"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.208723 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "e7d6afc0-d9b5-41b2-a55f-57621c300cbb" (UID: "e7d6afc0-d9b5-41b2-a55f-57621c300cbb"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.208777 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-node-log" (OuterVolumeSpecName: "node-log") pod "e7d6afc0-d9b5-41b2-a55f-57621c300cbb" (UID: "e7d6afc0-d9b5-41b2-a55f-57621c300cbb"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.208806 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-systemd-units\") pod \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209055 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-ovn-node-metrics-cert\") pod \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209105 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-var-lib-openvswitch\") pod \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\" (UID: 
\"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209131 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-etc-openvswitch\") pod \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209167 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-run-openvswitch\") pod \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209203 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tc944\" (UniqueName: \"kubernetes.io/projected/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-kube-api-access-tc944\") pod \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209240 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-cni-bin\") pod \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209254 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "e7d6afc0-d9b5-41b2-a55f-57621c300cbb" (UID: "e7d6afc0-d9b5-41b2-a55f-57621c300cbb"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209254 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "e7d6afc0-d9b5-41b2-a55f-57621c300cbb" (UID: "e7d6afc0-d9b5-41b2-a55f-57621c300cbb"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209297 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "e7d6afc0-d9b5-41b2-a55f-57621c300cbb" (UID: "e7d6afc0-d9b5-41b2-a55f-57621c300cbb"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209343 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "e7d6afc0-d9b5-41b2-a55f-57621c300cbb" (UID: "e7d6afc0-d9b5-41b2-a55f-57621c300cbb"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209315 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "e7d6afc0-d9b5-41b2-a55f-57621c300cbb" (UID: "e7d6afc0-d9b5-41b2-a55f-57621c300cbb"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209325 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "e7d6afc0-d9b5-41b2-a55f-57621c300cbb" (UID: "e7d6afc0-d9b5-41b2-a55f-57621c300cbb"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209305 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-run-ovn\") pod \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209440 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-env-overrides\") pod \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209471 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-slash\") pod \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209545 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-kubelet\") pod \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209596 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-log-socket\") pod \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\" (UID: \"e7d6afc0-d9b5-41b2-a55f-57621c300cbb\") " Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209868 4898 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209917 4898 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209929 4898 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209939 4898 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209949 4898 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209957 4898 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209965 4898 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209982 4898 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.209992 4898 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-node-log\") on node \"crc\" DevicePath \"\"" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.210001 4898 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.210022 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "e7d6afc0-d9b5-41b2-a55f-57621c300cbb" (UID: "e7d6afc0-d9b5-41b2-a55f-57621c300cbb"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.210031 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-log-socket" (OuterVolumeSpecName: "log-socket") pod "e7d6afc0-d9b5-41b2-a55f-57621c300cbb" (UID: "e7d6afc0-d9b5-41b2-a55f-57621c300cbb"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.210052 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "e7d6afc0-d9b5-41b2-a55f-57621c300cbb" (UID: "e7d6afc0-d9b5-41b2-a55f-57621c300cbb"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.210133 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-slash" (OuterVolumeSpecName: "host-slash") pod "e7d6afc0-d9b5-41b2-a55f-57621c300cbb" (UID: "e7d6afc0-d9b5-41b2-a55f-57621c300cbb"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.210474 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "e7d6afc0-d9b5-41b2-a55f-57621c300cbb" (UID: "e7d6afc0-d9b5-41b2-a55f-57621c300cbb"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.210496 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "e7d6afc0-d9b5-41b2-a55f-57621c300cbb" (UID: "e7d6afc0-d9b5-41b2-a55f-57621c300cbb"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.210693 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "e7d6afc0-d9b5-41b2-a55f-57621c300cbb" (UID: "e7d6afc0-d9b5-41b2-a55f-57621c300cbb"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.215556 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "e7d6afc0-d9b5-41b2-a55f-57621c300cbb" (UID: "e7d6afc0-d9b5-41b2-a55f-57621c300cbb"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.215681 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-kube-api-access-tc944" (OuterVolumeSpecName: "kube-api-access-tc944") pod "e7d6afc0-d9b5-41b2-a55f-57621c300cbb" (UID: "e7d6afc0-d9b5-41b2-a55f-57621c300cbb"). InnerVolumeSpecName "kube-api-access-tc944". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.237500 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "e7d6afc0-d9b5-41b2-a55f-57621c300cbb" (UID: "e7d6afc0-d9b5-41b2-a55f-57621c300cbb"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.243365 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-l82wk"] Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.243637 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="kubecfg-setup" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.243653 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="kubecfg-setup" Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.243664 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="kube-rbac-proxy-ovn-metrics" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.243671 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="kube-rbac-proxy-ovn-metrics" Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.243683 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="nbdb" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.243717 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="nbdb" Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.243726 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovnkube-controller" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.243733 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovnkube-controller" Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.243741 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" 
containerName="ovn-controller" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.243748 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovn-controller" Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.243759 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="kube-rbac-proxy-node" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.243766 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="kube-rbac-proxy-node" Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.243779 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovn-acl-logging" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.243786 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovn-acl-logging" Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.243804 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd46f989-e694-47a9-9b46-e96b7b47e403" containerName="util" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.243811 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd46f989-e694-47a9-9b46-e96b7b47e403" containerName="util" Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.243818 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovnkube-controller" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.243825 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovnkube-controller" Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.243838 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="sbdb" 
Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.243845 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="sbdb" Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.243856 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd46f989-e694-47a9-9b46-e96b7b47e403" containerName="extract" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.243862 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd46f989-e694-47a9-9b46-e96b7b47e403" containerName="extract" Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.243871 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="northd" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.243877 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="northd" Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.243887 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd46f989-e694-47a9-9b46-e96b7b47e403" containerName="pull" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.243893 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd46f989-e694-47a9-9b46-e96b7b47e403" containerName="pull" Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.243920 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovnkube-controller" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.243927 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovnkube-controller" Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.243939 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovnkube-controller" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.243946 4898 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovnkube-controller" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.244072 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovnkube-controller" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.244098 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovn-controller" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.244111 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovnkube-controller" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.244122 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd46f989-e694-47a9-9b46-e96b7b47e403" containerName="extract" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.244132 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovnkube-controller" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.244156 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovn-acl-logging" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.244163 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovnkube-controller" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.244174 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="sbdb" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.244183 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="northd" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.244197 
4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="kube-rbac-proxy-node" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.244209 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="kube-rbac-proxy-ovn-metrics" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.244224 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="nbdb" Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.244380 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovnkube-controller" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.244392 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovnkube-controller" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.244552 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerName="ovnkube-controller" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.247408 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.310771 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-etc-openvswitch\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.310819 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-run-openvswitch\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.310852 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-var-lib-openvswitch\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.310871 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-host-cni-bin\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.310914 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-host-slash\") pod \"ovnkube-node-l82wk\" (UID: 
\"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.310933 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4a0b9ad6-156f-418b-8eae-1d762f8161dd-ovnkube-script-lib\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.310948 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-run-systemd\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.310964 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-run-ovn\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.310982 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-systemd-units\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.311005 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4a0b9ad6-156f-418b-8eae-1d762f8161dd-env-overrides\") pod \"ovnkube-node-l82wk\" (UID: 
\"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.311030 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-host-cni-netd\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.311048 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-host-run-netns\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.311065 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-host-run-ovn-kubernetes\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.311086 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.311103 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/4a0b9ad6-156f-418b-8eae-1d762f8161dd-ovn-node-metrics-cert\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.311117 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrwpw\" (UniqueName: \"kubernetes.io/projected/4a0b9ad6-156f-418b-8eae-1d762f8161dd-kube-api-access-xrwpw\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.311135 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-host-kubelet\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.311150 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-log-socket\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.311165 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4a0b9ad6-156f-418b-8eae-1d762f8161dd-ovnkube-config\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.311182 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-log\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-node-log\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.311217 4898 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.311241 4898 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.311251 4898 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.311259 4898 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.311268 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tc944\" (UniqueName: \"kubernetes.io/projected/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-kube-api-access-tc944\") on node \"crc\" DevicePath \"\"" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.311277 4898 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.311285 4898 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" 
(UniqueName: \"kubernetes.io/configmap/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.311293 4898 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-slash\") on node \"crc\" DevicePath \"\"" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.311301 4898 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.311308 4898 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e7d6afc0-d9b5-41b2-a55f-57621c300cbb-log-socket\") on node \"crc\" DevicePath \"\"" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.412982 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4a0b9ad6-156f-418b-8eae-1d762f8161dd-env-overrides\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413341 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-host-cni-netd\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413379 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-host-run-netns\") pod \"ovnkube-node-l82wk\" (UID: 
\"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413406 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-host-run-ovn-kubernetes\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413439 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413462 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4a0b9ad6-156f-418b-8eae-1d762f8161dd-ovn-node-metrics-cert\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413507 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrwpw\" (UniqueName: \"kubernetes.io/projected/4a0b9ad6-156f-418b-8eae-1d762f8161dd-kube-api-access-xrwpw\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413531 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-host-kubelet\") pod 
\"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413563 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-log-socket\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413585 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4a0b9ad6-156f-418b-8eae-1d762f8161dd-ovnkube-config\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413613 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-node-log\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413645 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-etc-openvswitch\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413670 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-run-openvswitch\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413711 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-var-lib-openvswitch\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413735 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-host-cni-bin\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413749 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4a0b9ad6-156f-418b-8eae-1d762f8161dd-env-overrides\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413766 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-host-slash\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413791 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4a0b9ad6-156f-418b-8eae-1d762f8161dd-ovnkube-script-lib\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc 
kubenswrapper[4898]: I0313 14:09:33.413813 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-run-systemd\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413837 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-run-ovn\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413853 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-host-run-netns\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413863 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-systemd-units\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413881 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-host-run-ovn-kubernetes\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.414207 4898 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-host-kubelet\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.414236 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-host-slash\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413816 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-log-socket\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.413836 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-host-cni-netd\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.414612 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.414640 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-run-systemd\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.414660 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-run-ovn\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.414663 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-var-lib-openvswitch\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.414684 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-systemd-units\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.414699 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-node-log\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.414709 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-run-openvswitch\") pod \"ovnkube-node-l82wk\" (UID: 
\"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.414708 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4a0b9ad6-156f-418b-8eae-1d762f8161dd-ovnkube-config\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.414720 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-etc-openvswitch\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.414741 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4a0b9ad6-156f-418b-8eae-1d762f8161dd-host-cni-bin\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.415191 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4a0b9ad6-156f-418b-8eae-1d762f8161dd-ovnkube-script-lib\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.416990 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4a0b9ad6-156f-418b-8eae-1d762f8161dd-ovn-node-metrics-cert\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 
14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.433304 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrwpw\" (UniqueName: \"kubernetes.io/projected/4a0b9ad6-156f-418b-8eae-1d762f8161dd-kube-api-access-xrwpw\") pod \"ovnkube-node-l82wk\" (UID: \"4a0b9ad6-156f-418b-8eae-1d762f8161dd\") " pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.455257 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qqqs5_e7d6afc0-d9b5-41b2-a55f-57621c300cbb/ovn-acl-logging/0.log" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.455854 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qqqs5_e7d6afc0-d9b5-41b2-a55f-57621c300cbb/ovn-controller/0.log" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.456169 4898 generic.go:334] "Generic (PLEG): container finished" podID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerID="3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61" exitCode=0 Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.456194 4898 generic.go:334] "Generic (PLEG): container finished" podID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" containerID="0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb" exitCode=0 Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.456214 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerDied","Data":"3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61"} Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.456238 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerDied","Data":"0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb"} Mar 13 
14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.456249 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" event={"ID":"e7d6afc0-d9b5-41b2-a55f-57621c300cbb","Type":"ContainerDied","Data":"064d66ce778a8d0d979727a052c6e1249a726f86c9609bd927debcbbf5923b70"} Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.456264 4898 scope.go:117] "RemoveContainer" containerID="16ce106ae8a28f129efde86037be3a1f3a7bbf53e4df0b306b61a11eff910aed" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.456403 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qqqs5" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.478284 4898 scope.go:117] "RemoveContainer" containerID="86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.495215 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qqqs5"] Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.502410 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qqqs5"] Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.502572 4898 scope.go:117] "RemoveContainer" containerID="d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.518509 4898 scope.go:117] "RemoveContainer" containerID="d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.540353 4898 scope.go:117] "RemoveContainer" containerID="3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.552811 4898 scope.go:117] "RemoveContainer" containerID="0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.563480 4898 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.573674 4898 scope.go:117] "RemoveContainer" containerID="7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.593182 4898 scope.go:117] "RemoveContainer" containerID="14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.610463 4898 scope.go:117] "RemoveContainer" containerID="dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.630521 4898 scope.go:117] "RemoveContainer" containerID="16ce106ae8a28f129efde86037be3a1f3a7bbf53e4df0b306b61a11eff910aed" Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.634275 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16ce106ae8a28f129efde86037be3a1f3a7bbf53e4df0b306b61a11eff910aed\": container with ID starting with 16ce106ae8a28f129efde86037be3a1f3a7bbf53e4df0b306b61a11eff910aed not found: ID does not exist" containerID="16ce106ae8a28f129efde86037be3a1f3a7bbf53e4df0b306b61a11eff910aed" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.634327 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16ce106ae8a28f129efde86037be3a1f3a7bbf53e4df0b306b61a11eff910aed"} err="failed to get container status \"16ce106ae8a28f129efde86037be3a1f3a7bbf53e4df0b306b61a11eff910aed\": rpc error: code = NotFound desc = could not find container \"16ce106ae8a28f129efde86037be3a1f3a7bbf53e4df0b306b61a11eff910aed\": container with ID starting with 16ce106ae8a28f129efde86037be3a1f3a7bbf53e4df0b306b61a11eff910aed not found: ID does not exist" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.634366 4898 scope.go:117] "RemoveContainer" 
containerID="86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0" Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.635137 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\": container with ID starting with 86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0 not found: ID does not exist" containerID="86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.635162 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0"} err="failed to get container status \"86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\": rpc error: code = NotFound desc = could not find container \"86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\": container with ID starting with 86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0 not found: ID does not exist" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.635181 4898 scope.go:117] "RemoveContainer" containerID="d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2" Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.636248 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\": container with ID starting with d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2 not found: ID does not exist" containerID="d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.636279 4898 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2"} err="failed to get container status \"d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\": rpc error: code = NotFound desc = could not find container \"d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\": container with ID starting with d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2 not found: ID does not exist" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.636297 4898 scope.go:117] "RemoveContainer" containerID="d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23" Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.636528 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\": container with ID starting with d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23 not found: ID does not exist" containerID="d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.636555 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23"} err="failed to get container status \"d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\": rpc error: code = NotFound desc = could not find container \"d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\": container with ID starting with d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23 not found: ID does not exist" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.636572 4898 scope.go:117] "RemoveContainer" containerID="3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61" Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.636948 4898 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\": container with ID starting with 3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61 not found: ID does not exist" containerID="3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.636974 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61"} err="failed to get container status \"3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\": rpc error: code = NotFound desc = could not find container \"3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\": container with ID starting with 3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61 not found: ID does not exist" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.636988 4898 scope.go:117] "RemoveContainer" containerID="0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb" Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.637346 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\": container with ID starting with 0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb not found: ID does not exist" containerID="0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.637371 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb"} err="failed to get container status \"0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\": rpc error: code = NotFound desc = could not find container 
\"0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\": container with ID starting with 0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb not found: ID does not exist" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.637390 4898 scope.go:117] "RemoveContainer" containerID="7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345" Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.637696 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\": container with ID starting with 7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345 not found: ID does not exist" containerID="7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.637720 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345"} err="failed to get container status \"7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\": rpc error: code = NotFound desc = could not find container \"7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\": container with ID starting with 7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345 not found: ID does not exist" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.637735 4898 scope.go:117] "RemoveContainer" containerID="14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453" Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.638957 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\": container with ID starting with 14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453 not found: ID does not exist" 
containerID="14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.638991 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453"} err="failed to get container status \"14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\": rpc error: code = NotFound desc = could not find container \"14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\": container with ID starting with 14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453 not found: ID does not exist" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.639011 4898 scope.go:117] "RemoveContainer" containerID="dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786" Mar 13 14:09:33 crc kubenswrapper[4898]: E0313 14:09:33.639331 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\": container with ID starting with dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786 not found: ID does not exist" containerID="dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.639358 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786"} err="failed to get container status \"dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\": rpc error: code = NotFound desc = could not find container \"dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\": container with ID starting with dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786 not found: ID does not exist" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.639375 4898 scope.go:117] 
"RemoveContainer" containerID="16ce106ae8a28f129efde86037be3a1f3a7bbf53e4df0b306b61a11eff910aed" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.639610 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16ce106ae8a28f129efde86037be3a1f3a7bbf53e4df0b306b61a11eff910aed"} err="failed to get container status \"16ce106ae8a28f129efde86037be3a1f3a7bbf53e4df0b306b61a11eff910aed\": rpc error: code = NotFound desc = could not find container \"16ce106ae8a28f129efde86037be3a1f3a7bbf53e4df0b306b61a11eff910aed\": container with ID starting with 16ce106ae8a28f129efde86037be3a1f3a7bbf53e4df0b306b61a11eff910aed not found: ID does not exist" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.639634 4898 scope.go:117] "RemoveContainer" containerID="86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.639864 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0"} err="failed to get container status \"86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\": rpc error: code = NotFound desc = could not find container \"86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0\": container with ID starting with 86c51c384666cdb892f8b5b79b57ab019212cde466cfbf647bb2982f21dce6b0 not found: ID does not exist" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.639888 4898 scope.go:117] "RemoveContainer" containerID="d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.640141 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2"} err="failed to get container status \"d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\": rpc error: code = 
NotFound desc = could not find container \"d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2\": container with ID starting with d579a6a419e66fa7b8a93f42121474675772b87aa7f9692f3ab6e71140e78df2 not found: ID does not exist" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.640164 4898 scope.go:117] "RemoveContainer" containerID="d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.640396 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23"} err="failed to get container status \"d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\": rpc error: code = NotFound desc = could not find container \"d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23\": container with ID starting with d01724f2a8bc5c705495e6811179e32ae0b5ae5a68d3433310e1d7e1e6e08c23 not found: ID does not exist" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.640418 4898 scope.go:117] "RemoveContainer" containerID="3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.640677 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61"} err="failed to get container status \"3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\": rpc error: code = NotFound desc = could not find container \"3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61\": container with ID starting with 3c557c9116911c2b860f450e4b1d9e7cebc71875f814730b0f7f9129514ccf61 not found: ID does not exist" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.640700 4898 scope.go:117] "RemoveContainer" containerID="0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb" Mar 13 14:09:33 crc 
kubenswrapper[4898]: I0313 14:09:33.640980 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb"} err="failed to get container status \"0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\": rpc error: code = NotFound desc = could not find container \"0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb\": container with ID starting with 0ce3ec0ae90df3a68589ac583429d3a3c6362fae7b2e0d60a155cbaa99619adb not found: ID does not exist" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.641007 4898 scope.go:117] "RemoveContainer" containerID="7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.641256 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345"} err="failed to get container status \"7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\": rpc error: code = NotFound desc = could not find container \"7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345\": container with ID starting with 7f57b5eff7622c94f9a3765b40ed71b435d72b30fc2c25585762ad76b70a8345 not found: ID does not exist" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.641283 4898 scope.go:117] "RemoveContainer" containerID="14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.641500 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453"} err="failed to get container status \"14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\": rpc error: code = NotFound desc = could not find container \"14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453\": container 
with ID starting with 14780cc4ee2ff106ee99b57b06182e3ba0b380a43c190897e00aa8c6f9d31453 not found: ID does not exist" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.641525 4898 scope.go:117] "RemoveContainer" containerID="dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.641735 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786"} err="failed to get container status \"dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\": rpc error: code = NotFound desc = could not find container \"dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786\": container with ID starting with dea58a789d57a7e73c0099990d685a2ef5b7c7a07d4ca8bd4fce7779990d9786 not found: ID does not exist" Mar 13 14:09:33 crc kubenswrapper[4898]: I0313 14:09:33.751122 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7d6afc0-d9b5-41b2-a55f-57621c300cbb" path="/var/lib/kubelet/pods/e7d6afc0-d9b5-41b2-a55f-57621c300cbb/volumes" Mar 13 14:09:34 crc kubenswrapper[4898]: I0313 14:09:34.463357 4898 generic.go:334] "Generic (PLEG): container finished" podID="4a0b9ad6-156f-418b-8eae-1d762f8161dd" containerID="d0fa021e24bbaf323086d2f0ca9344418651c6408325675373378bef04786f4d" exitCode=0 Mar 13 14:09:34 crc kubenswrapper[4898]: I0313 14:09:34.463449 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" event={"ID":"4a0b9ad6-156f-418b-8eae-1d762f8161dd","Type":"ContainerDied","Data":"d0fa021e24bbaf323086d2f0ca9344418651c6408325675373378bef04786f4d"} Mar 13 14:09:34 crc kubenswrapper[4898]: I0313 14:09:34.463781 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" 
event={"ID":"4a0b9ad6-156f-418b-8eae-1d762f8161dd","Type":"ContainerStarted","Data":"2165dbbf05ae9ac0b2d925a7a188178859abd2b34f765e588a20716431a9c72e"} Mar 13 14:09:35 crc kubenswrapper[4898]: I0313 14:09:35.474426 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" event={"ID":"4a0b9ad6-156f-418b-8eae-1d762f8161dd","Type":"ContainerStarted","Data":"34a580f53162cf26e9e4d69b61e7c5aaae36bc62a1676dc79dbaefa0ee348097"} Mar 13 14:09:35 crc kubenswrapper[4898]: I0313 14:09:35.474733 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" event={"ID":"4a0b9ad6-156f-418b-8eae-1d762f8161dd","Type":"ContainerStarted","Data":"e29c8f382fd2b10cf7fdcdc74768e35f87c892171068990ee4300be25ea3784b"} Mar 13 14:09:35 crc kubenswrapper[4898]: I0313 14:09:35.474745 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" event={"ID":"4a0b9ad6-156f-418b-8eae-1d762f8161dd","Type":"ContainerStarted","Data":"8bf965473f4dd435e52cc0ad785b214411845f1a2d4651d06b70fc0af1a48f02"} Mar 13 14:09:35 crc kubenswrapper[4898]: I0313 14:09:35.474754 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" event={"ID":"4a0b9ad6-156f-418b-8eae-1d762f8161dd","Type":"ContainerStarted","Data":"19e7ccc92da0c8ab9f95b946661f1c36140aa2c235919fa28e61d51f0b9c3944"} Mar 13 14:09:35 crc kubenswrapper[4898]: I0313 14:09:35.474765 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" event={"ID":"4a0b9ad6-156f-418b-8eae-1d762f8161dd","Type":"ContainerStarted","Data":"3d5102150d4c4f9095f479599cfdfcdc3cf96da057be9255999245de491d00dc"} Mar 13 14:09:35 crc kubenswrapper[4898]: I0313 14:09:35.474774 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" 
event={"ID":"4a0b9ad6-156f-418b-8eae-1d762f8161dd","Type":"ContainerStarted","Data":"c8044828df8054b6f2b1f1906a92ec5c8265734ed70cf3907b5975d90f9eff88"} Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.498328 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" event={"ID":"4a0b9ad6-156f-418b-8eae-1d762f8161dd","Type":"ContainerStarted","Data":"c319424207abd7c622d86ae3eed7bd449515b03b902c401221c8282bb978ec98"} Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.600179 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-5r9gm"] Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.601121 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5r9gm" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.603365 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.603419 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-bjwfj" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.603770 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.649268 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm"] Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.650005 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.651702 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.651930 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-8qwjn" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.656720 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz"] Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.657423 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.677042 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx8t5\" (UniqueName: \"kubernetes.io/projected/30c06063-b926-4f2e-b8d1-8c530cc5b0a9-kube-api-access-fx8t5\") pod \"obo-prometheus-operator-68bc856cb9-5r9gm\" (UID: \"30c06063-b926-4f2e-b8d1-8c530cc5b0a9\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5r9gm" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.778198 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/951cfcfc-3a8c-410e-a3f5-f5caa10511f5-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz\" (UID: \"951cfcfc-3a8c-410e-a3f5-f5caa10511f5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.778498 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8c190eee-747b-4a45-905c-fa0235080305-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm\" (UID: \"8c190eee-747b-4a45-905c-fa0235080305\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.778556 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx8t5\" (UniqueName: \"kubernetes.io/projected/30c06063-b926-4f2e-b8d1-8c530cc5b0a9-kube-api-access-fx8t5\") pod \"obo-prometheus-operator-68bc856cb9-5r9gm\" (UID: \"30c06063-b926-4f2e-b8d1-8c530cc5b0a9\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5r9gm" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.778610 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8c190eee-747b-4a45-905c-fa0235080305-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm\" (UID: \"8c190eee-747b-4a45-905c-fa0235080305\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.778638 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/951cfcfc-3a8c-410e-a3f5-f5caa10511f5-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz\" (UID: \"951cfcfc-3a8c-410e-a3f5-f5caa10511f5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.807926 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx8t5\" (UniqueName: 
\"kubernetes.io/projected/30c06063-b926-4f2e-b8d1-8c530cc5b0a9-kube-api-access-fx8t5\") pod \"obo-prometheus-operator-68bc856cb9-5r9gm\" (UID: \"30c06063-b926-4f2e-b8d1-8c530cc5b0a9\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5r9gm" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.841306 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-ljrtz"] Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.842242 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.844608 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.844853 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-mflt7" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.879992 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8c190eee-747b-4a45-905c-fa0235080305-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm\" (UID: \"8c190eee-747b-4a45-905c-fa0235080305\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.880063 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/951cfcfc-3a8c-410e-a3f5-f5caa10511f5-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz\" (UID: \"951cfcfc-3a8c-410e-a3f5-f5caa10511f5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.880679 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/951cfcfc-3a8c-410e-a3f5-f5caa10511f5-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz\" (UID: \"951cfcfc-3a8c-410e-a3f5-f5caa10511f5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.880786 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8c190eee-747b-4a45-905c-fa0235080305-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm\" (UID: \"8c190eee-747b-4a45-905c-fa0235080305\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.887859 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/951cfcfc-3a8c-410e-a3f5-f5caa10511f5-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz\" (UID: \"951cfcfc-3a8c-410e-a3f5-f5caa10511f5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.888943 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8c190eee-747b-4a45-905c-fa0235080305-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm\" (UID: \"8c190eee-747b-4a45-905c-fa0235080305\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.891367 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8c190eee-747b-4a45-905c-fa0235080305-apiservice-cert\") pod 
\"obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm\" (UID: \"8c190eee-747b-4a45-905c-fa0235080305\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.891758 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/951cfcfc-3a8c-410e-a3f5-f5caa10511f5-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz\" (UID: \"951cfcfc-3a8c-410e-a3f5-f5caa10511f5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.916337 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5r9gm" Mar 13 14:09:38 crc kubenswrapper[4898]: E0313 14:09:38.944527 4898 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-5r9gm_openshift-operators_30c06063-b926-4f2e-b8d1-8c530cc5b0a9_0(fd78077a86f3997d29fd985fce1622f05819a7419fc771677e5a2e01de020bb1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 14:09:38 crc kubenswrapper[4898]: E0313 14:09:38.944603 4898 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-5r9gm_openshift-operators_30c06063-b926-4f2e-b8d1-8c530cc5b0a9_0(fd78077a86f3997d29fd985fce1622f05819a7419fc771677e5a2e01de020bb1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5r9gm" Mar 13 14:09:38 crc kubenswrapper[4898]: E0313 14:09:38.944628 4898 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-5r9gm_openshift-operators_30c06063-b926-4f2e-b8d1-8c530cc5b0a9_0(fd78077a86f3997d29fd985fce1622f05819a7419fc771677e5a2e01de020bb1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5r9gm" Mar 13 14:09:38 crc kubenswrapper[4898]: E0313 14:09:38.944693 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-5r9gm_openshift-operators(30c06063-b926-4f2e-b8d1-8c530cc5b0a9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-5r9gm_openshift-operators(30c06063-b926-4f2e-b8d1-8c530cc5b0a9)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-5r9gm_openshift-operators_30c06063-b926-4f2e-b8d1-8c530cc5b0a9_0(fd78077a86f3997d29fd985fce1622f05819a7419fc771677e5a2e01de020bb1): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5r9gm" podUID="30c06063-b926-4f2e-b8d1-8c530cc5b0a9" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.974229 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.982780 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bfc0332-bb59-42bf-bb70-462efa225c81-observability-operator-tls\") pod \"observability-operator-59bdc8b94-ljrtz\" (UID: \"3bfc0332-bb59-42bf-bb70-462efa225c81\") " pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.982873 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxlrb\" (UniqueName: \"kubernetes.io/projected/3bfc0332-bb59-42bf-bb70-462efa225c81-kube-api-access-qxlrb\") pod \"observability-operator-59bdc8b94-ljrtz\" (UID: \"3bfc0332-bb59-42bf-bb70-462efa225c81\") " pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.993826 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-nkt76"] Mar 13 14:09:38 crc kubenswrapper[4898]: I0313 14:09:38.993889 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz" Mar 13 14:09:38 crc kubenswrapper[4898]: E0313 14:09:38.995043 4898 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm_openshift-operators_8c190eee-747b-4a45-905c-fa0235080305_0(434f5ab9b2b2b566b310d245370ebef6cfceab68a771f8cacfe211a475c0830e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 13 14:09:38 crc kubenswrapper[4898]: E0313 14:09:38.995081 4898 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm_openshift-operators_8c190eee-747b-4a45-905c-fa0235080305_0(434f5ab9b2b2b566b310d245370ebef6cfceab68a771f8cacfe211a475c0830e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm" Mar 13 14:09:38 crc kubenswrapper[4898]: E0313 14:09:38.995097 4898 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm_openshift-operators_8c190eee-747b-4a45-905c-fa0235080305_0(434f5ab9b2b2b566b310d245370ebef6cfceab68a771f8cacfe211a475c0830e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm" Mar 13 14:09:38 crc kubenswrapper[4898]: E0313 14:09:38.995132 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm_openshift-operators(8c190eee-747b-4a45-905c-fa0235080305)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm_openshift-operators(8c190eee-747b-4a45-905c-fa0235080305)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm_openshift-operators_8c190eee-747b-4a45-905c-fa0235080305_0(434f5ab9b2b2b566b310d245370ebef6cfceab68a771f8cacfe211a475c0830e): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm" podUID="8c190eee-747b-4a45-905c-fa0235080305" Mar 13 14:09:39 crc kubenswrapper[4898]: I0313 14:09:39.007673 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-nkt76" Mar 13 14:09:39 crc kubenswrapper[4898]: I0313 14:09:39.010018 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-glbkw" Mar 13 14:09:39 crc kubenswrapper[4898]: E0313 14:09:39.020048 4898 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz_openshift-operators_951cfcfc-3a8c-410e-a3f5-f5caa10511f5_0(340dfeabe1c085bb98589e56ab7efcfddb37980949c9fe6bc883f55a4c2e7dae): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 14:09:39 crc kubenswrapper[4898]: E0313 14:09:39.020094 4898 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz_openshift-operators_951cfcfc-3a8c-410e-a3f5-f5caa10511f5_0(340dfeabe1c085bb98589e56ab7efcfddb37980949c9fe6bc883f55a4c2e7dae): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz" Mar 13 14:09:39 crc kubenswrapper[4898]: E0313 14:09:39.020116 4898 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz_openshift-operators_951cfcfc-3a8c-410e-a3f5-f5caa10511f5_0(340dfeabe1c085bb98589e56ab7efcfddb37980949c9fe6bc883f55a4c2e7dae): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz" Mar 13 14:09:39 crc kubenswrapper[4898]: E0313 14:09:39.020159 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz_openshift-operators(951cfcfc-3a8c-410e-a3f5-f5caa10511f5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz_openshift-operators(951cfcfc-3a8c-410e-a3f5-f5caa10511f5)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz_openshift-operators_951cfcfc-3a8c-410e-a3f5-f5caa10511f5_0(340dfeabe1c085bb98589e56ab7efcfddb37980949c9fe6bc883f55a4c2e7dae): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz" podUID="951cfcfc-3a8c-410e-a3f5-f5caa10511f5" Mar 13 14:09:39 crc kubenswrapper[4898]: I0313 14:09:39.084756 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv8gd\" (UniqueName: \"kubernetes.io/projected/79ead8ee-67ba-4831-b5d4-a1f128e94334-kube-api-access-nv8gd\") pod \"perses-operator-5bf474d74f-nkt76\" (UID: \"79ead8ee-67ba-4831-b5d4-a1f128e94334\") " pod="openshift-operators/perses-operator-5bf474d74f-nkt76" Mar 13 14:09:39 crc kubenswrapper[4898]: I0313 14:09:39.084828 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bfc0332-bb59-42bf-bb70-462efa225c81-observability-operator-tls\") pod \"observability-operator-59bdc8b94-ljrtz\" (UID: \"3bfc0332-bb59-42bf-bb70-462efa225c81\") " pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" Mar 13 14:09:39 crc kubenswrapper[4898]: I0313 14:09:39.084870 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxlrb\" (UniqueName: \"kubernetes.io/projected/3bfc0332-bb59-42bf-bb70-462efa225c81-kube-api-access-qxlrb\") pod \"observability-operator-59bdc8b94-ljrtz\" (UID: \"3bfc0332-bb59-42bf-bb70-462efa225c81\") " pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" Mar 13 14:09:39 crc kubenswrapper[4898]: I0313 14:09:39.084913 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/79ead8ee-67ba-4831-b5d4-a1f128e94334-openshift-service-ca\") pod \"perses-operator-5bf474d74f-nkt76\" (UID: \"79ead8ee-67ba-4831-b5d4-a1f128e94334\") " pod="openshift-operators/perses-operator-5bf474d74f-nkt76" Mar 13 14:09:39 crc kubenswrapper[4898]: I0313 14:09:39.088307 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/3bfc0332-bb59-42bf-bb70-462efa225c81-observability-operator-tls\") pod \"observability-operator-59bdc8b94-ljrtz\" (UID: \"3bfc0332-bb59-42bf-bb70-462efa225c81\") " pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" Mar 13 14:09:39 crc kubenswrapper[4898]: I0313 14:09:39.099064 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxlrb\" (UniqueName: \"kubernetes.io/projected/3bfc0332-bb59-42bf-bb70-462efa225c81-kube-api-access-qxlrb\") pod \"observability-operator-59bdc8b94-ljrtz\" (UID: \"3bfc0332-bb59-42bf-bb70-462efa225c81\") " pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" Mar 13 14:09:39 crc kubenswrapper[4898]: I0313 14:09:39.160426 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" Mar 13 14:09:39 crc kubenswrapper[4898]: E0313 14:09:39.181196 4898 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-ljrtz_openshift-operators_3bfc0332-bb59-42bf-bb70-462efa225c81_0(a01d8868dd6af20cebed5d1574243b61ad97b1045cc2d34c8db961f58acf96b7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 14:09:39 crc kubenswrapper[4898]: E0313 14:09:39.181271 4898 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-ljrtz_openshift-operators_3bfc0332-bb59-42bf-bb70-462efa225c81_0(a01d8868dd6af20cebed5d1574243b61ad97b1045cc2d34c8db961f58acf96b7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" Mar 13 14:09:39 crc kubenswrapper[4898]: E0313 14:09:39.181296 4898 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-ljrtz_openshift-operators_3bfc0332-bb59-42bf-bb70-462efa225c81_0(a01d8868dd6af20cebed5d1574243b61ad97b1045cc2d34c8db961f58acf96b7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" Mar 13 14:09:39 crc kubenswrapper[4898]: E0313 14:09:39.181347 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-ljrtz_openshift-operators(3bfc0332-bb59-42bf-bb70-462efa225c81)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-ljrtz_openshift-operators(3bfc0332-bb59-42bf-bb70-462efa225c81)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-ljrtz_openshift-operators_3bfc0332-bb59-42bf-bb70-462efa225c81_0(a01d8868dd6af20cebed5d1574243b61ad97b1045cc2d34c8db961f58acf96b7): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" podUID="3bfc0332-bb59-42bf-bb70-462efa225c81" Mar 13 14:09:39 crc kubenswrapper[4898]: I0313 14:09:39.186279 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nv8gd\" (UniqueName: \"kubernetes.io/projected/79ead8ee-67ba-4831-b5d4-a1f128e94334-kube-api-access-nv8gd\") pod \"perses-operator-5bf474d74f-nkt76\" (UID: \"79ead8ee-67ba-4831-b5d4-a1f128e94334\") " pod="openshift-operators/perses-operator-5bf474d74f-nkt76" Mar 13 14:09:39 crc kubenswrapper[4898]: I0313 14:09:39.186386 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/79ead8ee-67ba-4831-b5d4-a1f128e94334-openshift-service-ca\") pod \"perses-operator-5bf474d74f-nkt76\" (UID: \"79ead8ee-67ba-4831-b5d4-a1f128e94334\") " pod="openshift-operators/perses-operator-5bf474d74f-nkt76" Mar 13 14:09:39 crc kubenswrapper[4898]: I0313 14:09:39.187553 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/79ead8ee-67ba-4831-b5d4-a1f128e94334-openshift-service-ca\") pod \"perses-operator-5bf474d74f-nkt76\" (UID: \"79ead8ee-67ba-4831-b5d4-a1f128e94334\") " pod="openshift-operators/perses-operator-5bf474d74f-nkt76" Mar 13 14:09:39 crc kubenswrapper[4898]: I0313 14:09:39.212366 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv8gd\" (UniqueName: \"kubernetes.io/projected/79ead8ee-67ba-4831-b5d4-a1f128e94334-kube-api-access-nv8gd\") pod \"perses-operator-5bf474d74f-nkt76\" (UID: \"79ead8ee-67ba-4831-b5d4-a1f128e94334\") " pod="openshift-operators/perses-operator-5bf474d74f-nkt76" Mar 13 14:09:39 crc kubenswrapper[4898]: I0313 14:09:39.339732 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-nkt76" Mar 13 14:09:39 crc kubenswrapper[4898]: E0313 14:09:39.364992 4898 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-nkt76_openshift-operators_79ead8ee-67ba-4831-b5d4-a1f128e94334_0(af5e6a9b24c7090da4c8b0b588b4ab38c4ec87cf04e8a847cf9dcce5ac8d5474): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 14:09:39 crc kubenswrapper[4898]: E0313 14:09:39.365075 4898 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-nkt76_openshift-operators_79ead8ee-67ba-4831-b5d4-a1f128e94334_0(af5e6a9b24c7090da4c8b0b588b4ab38c4ec87cf04e8a847cf9dcce5ac8d5474): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-nkt76" Mar 13 14:09:39 crc kubenswrapper[4898]: E0313 14:09:39.365099 4898 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-nkt76_openshift-operators_79ead8ee-67ba-4831-b5d4-a1f128e94334_0(af5e6a9b24c7090da4c8b0b588b4ab38c4ec87cf04e8a847cf9dcce5ac8d5474): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5bf474d74f-nkt76" Mar 13 14:09:39 crc kubenswrapper[4898]: E0313 14:09:39.365151 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-nkt76_openshift-operators(79ead8ee-67ba-4831-b5d4-a1f128e94334)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-nkt76_openshift-operators(79ead8ee-67ba-4831-b5d4-a1f128e94334)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-nkt76_openshift-operators_79ead8ee-67ba-4831-b5d4-a1f128e94334_0(af5e6a9b24c7090da4c8b0b588b4ab38c4ec87cf04e8a847cf9dcce5ac8d5474): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-nkt76" podUID="79ead8ee-67ba-4831-b5d4-a1f128e94334" Mar 13 14:09:40 crc kubenswrapper[4898]: I0313 14:09:40.517549 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" event={"ID":"4a0b9ad6-156f-418b-8eae-1d762f8161dd","Type":"ContainerStarted","Data":"2e24c39eec65888580ebf7712fdc5741633e4341d3fc40409cf2f068b2581ee1"} Mar 13 14:09:40 crc kubenswrapper[4898]: I0313 14:09:40.517858 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:40 crc kubenswrapper[4898]: I0313 14:09:40.517873 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:40 crc kubenswrapper[4898]: I0313 14:09:40.517882 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:40 crc kubenswrapper[4898]: I0313 14:09:40.560698 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" Mar 13 14:09:40 crc 
kubenswrapper[4898]: I0313 14:09:40.564101 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:09:40 crc kubenswrapper[4898]: I0313 14:09:40.590940 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" podStartSLOduration=7.590921168 podStartE2EDuration="7.590921168s" podCreationTimestamp="2026-03-13 14:09:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:09:40.587018206 +0000 UTC m=+815.588606455" watchObservedRunningTime="2026-03-13 14:09:40.590921168 +0000 UTC m=+815.592509407"
Mar 13 14:09:40 crc kubenswrapper[4898]: I0313 14:09:40.947964 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-5r9gm"]
Mar 13 14:09:40 crc kubenswrapper[4898]: I0313 14:09:40.948261 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5r9gm"
Mar 13 14:09:40 crc kubenswrapper[4898]: I0313 14:09:40.948650 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5r9gm"
Mar 13 14:09:40 crc kubenswrapper[4898]: I0313 14:09:40.981760 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz"]
Mar 13 14:09:40 crc kubenswrapper[4898]: I0313 14:09:40.981926 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz"
Mar 13 14:09:40 crc kubenswrapper[4898]: I0313 14:09:40.982483 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz"
Mar 13 14:09:40 crc kubenswrapper[4898]: I0313 14:09:40.991010 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm"]
Mar 13 14:09:40 crc kubenswrapper[4898]: E0313 14:09:40.991097 4898 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-5r9gm_openshift-operators_30c06063-b926-4f2e-b8d1-8c530cc5b0a9_0(82a93402f36533955f91ec508945b8cda8200417bb6d35b063d78bd40005522b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 13 14:09:40 crc kubenswrapper[4898]: E0313 14:09:40.991176 4898 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-5r9gm_openshift-operators_30c06063-b926-4f2e-b8d1-8c530cc5b0a9_0(82a93402f36533955f91ec508945b8cda8200417bb6d35b063d78bd40005522b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5r9gm"
Mar 13 14:09:40 crc kubenswrapper[4898]: E0313 14:09:40.991203 4898 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-5r9gm_openshift-operators_30c06063-b926-4f2e-b8d1-8c530cc5b0a9_0(82a93402f36533955f91ec508945b8cda8200417bb6d35b063d78bd40005522b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5r9gm"
Mar 13 14:09:40 crc kubenswrapper[4898]: E0313 14:09:40.991255 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-5r9gm_openshift-operators(30c06063-b926-4f2e-b8d1-8c530cc5b0a9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-5r9gm_openshift-operators(30c06063-b926-4f2e-b8d1-8c530cc5b0a9)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-5r9gm_openshift-operators_30c06063-b926-4f2e-b8d1-8c530cc5b0a9_0(82a93402f36533955f91ec508945b8cda8200417bb6d35b063d78bd40005522b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5r9gm" podUID="30c06063-b926-4f2e-b8d1-8c530cc5b0a9"
Mar 13 14:09:40 crc kubenswrapper[4898]: I0313 14:09:40.991125 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm"
Mar 13 14:09:40 crc kubenswrapper[4898]: I0313 14:09:40.992076 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm"
Mar 13 14:09:40 crc kubenswrapper[4898]: I0313 14:09:40.995463 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-ljrtz"]
Mar 13 14:09:40 crc kubenswrapper[4898]: I0313 14:09:40.995553 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz"
Mar 13 14:09:40 crc kubenswrapper[4898]: I0313 14:09:40.995985 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz"
Mar 13 14:09:41 crc kubenswrapper[4898]: I0313 14:09:41.021810 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-nkt76"]
Mar 13 14:09:41 crc kubenswrapper[4898]: I0313 14:09:41.021926 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-nkt76"
Mar 13 14:09:41 crc kubenswrapper[4898]: I0313 14:09:41.022364 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-nkt76"
Mar 13 14:09:41 crc kubenswrapper[4898]: E0313 14:09:41.032388 4898 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz_openshift-operators_951cfcfc-3a8c-410e-a3f5-f5caa10511f5_0(fb1bea47dbbc4277faf1b20a987bbf5c1da7ad0636c1822d94e5bce05e1eae4a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 13 14:09:41 crc kubenswrapper[4898]: E0313 14:09:41.032445 4898 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz_openshift-operators_951cfcfc-3a8c-410e-a3f5-f5caa10511f5_0(fb1bea47dbbc4277faf1b20a987bbf5c1da7ad0636c1822d94e5bce05e1eae4a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz"
Mar 13 14:09:41 crc kubenswrapper[4898]: E0313 14:09:41.032467 4898 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz_openshift-operators_951cfcfc-3a8c-410e-a3f5-f5caa10511f5_0(fb1bea47dbbc4277faf1b20a987bbf5c1da7ad0636c1822d94e5bce05e1eae4a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz"
Mar 13 14:09:41 crc kubenswrapper[4898]: E0313 14:09:41.032508 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz_openshift-operators(951cfcfc-3a8c-410e-a3f5-f5caa10511f5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz_openshift-operators(951cfcfc-3a8c-410e-a3f5-f5caa10511f5)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz_openshift-operators_951cfcfc-3a8c-410e-a3f5-f5caa10511f5_0(fb1bea47dbbc4277faf1b20a987bbf5c1da7ad0636c1822d94e5bce05e1eae4a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz" podUID="951cfcfc-3a8c-410e-a3f5-f5caa10511f5"
Mar 13 14:09:41 crc kubenswrapper[4898]: E0313 14:09:41.037783 4898 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm_openshift-operators_8c190eee-747b-4a45-905c-fa0235080305_0(c09742bb7ffa4a1eaa8f0aae68549bee73726dfbbfdb07e2af4c746bc829cd48): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 13 14:09:41 crc kubenswrapper[4898]: E0313 14:09:41.037854 4898 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm_openshift-operators_8c190eee-747b-4a45-905c-fa0235080305_0(c09742bb7ffa4a1eaa8f0aae68549bee73726dfbbfdb07e2af4c746bc829cd48): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm"
Mar 13 14:09:41 crc kubenswrapper[4898]: E0313 14:09:41.037881 4898 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm_openshift-operators_8c190eee-747b-4a45-905c-fa0235080305_0(c09742bb7ffa4a1eaa8f0aae68549bee73726dfbbfdb07e2af4c746bc829cd48): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm"
Mar 13 14:09:41 crc kubenswrapper[4898]: E0313 14:09:41.037951 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm_openshift-operators(8c190eee-747b-4a45-905c-fa0235080305)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm_openshift-operators(8c190eee-747b-4a45-905c-fa0235080305)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm_openshift-operators_8c190eee-747b-4a45-905c-fa0235080305_0(c09742bb7ffa4a1eaa8f0aae68549bee73726dfbbfdb07e2af4c746bc829cd48): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm" podUID="8c190eee-747b-4a45-905c-fa0235080305"
Mar 13 14:09:41 crc kubenswrapper[4898]: E0313 14:09:41.063716 4898 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-ljrtz_openshift-operators_3bfc0332-bb59-42bf-bb70-462efa225c81_0(a4b948d1ba67e46bdf493134823185ec13e2283a0e8570c7b7a8bc515287a5f3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 13 14:09:41 crc kubenswrapper[4898]: E0313 14:09:41.063780 4898 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-ljrtz_openshift-operators_3bfc0332-bb59-42bf-bb70-462efa225c81_0(a4b948d1ba67e46bdf493134823185ec13e2283a0e8570c7b7a8bc515287a5f3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz"
Mar 13 14:09:41 crc kubenswrapper[4898]: E0313 14:09:41.063804 4898 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-ljrtz_openshift-operators_3bfc0332-bb59-42bf-bb70-462efa225c81_0(a4b948d1ba67e46bdf493134823185ec13e2283a0e8570c7b7a8bc515287a5f3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz"
Mar 13 14:09:41 crc kubenswrapper[4898]: E0313 14:09:41.063848 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-ljrtz_openshift-operators(3bfc0332-bb59-42bf-bb70-462efa225c81)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-ljrtz_openshift-operators(3bfc0332-bb59-42bf-bb70-462efa225c81)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-ljrtz_openshift-operators_3bfc0332-bb59-42bf-bb70-462efa225c81_0(a4b948d1ba67e46bdf493134823185ec13e2283a0e8570c7b7a8bc515287a5f3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" podUID="3bfc0332-bb59-42bf-bb70-462efa225c81"
Mar 13 14:09:41 crc kubenswrapper[4898]: E0313 14:09:41.069940 4898 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-nkt76_openshift-operators_79ead8ee-67ba-4831-b5d4-a1f128e94334_0(ab47d1d3c53b6eb35e4ba571d13de11e8ae1d29e947373707aff0bc8f5858f5b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 13 14:09:41 crc kubenswrapper[4898]: E0313 14:09:41.070015 4898 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-nkt76_openshift-operators_79ead8ee-67ba-4831-b5d4-a1f128e94334_0(ab47d1d3c53b6eb35e4ba571d13de11e8ae1d29e947373707aff0bc8f5858f5b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-nkt76"
Mar 13 14:09:41 crc kubenswrapper[4898]: E0313 14:09:41.070042 4898 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-nkt76_openshift-operators_79ead8ee-67ba-4831-b5d4-a1f128e94334_0(ab47d1d3c53b6eb35e4ba571d13de11e8ae1d29e947373707aff0bc8f5858f5b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-nkt76"
Mar 13 14:09:41 crc kubenswrapper[4898]: E0313 14:09:41.070109 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-nkt76_openshift-operators(79ead8ee-67ba-4831-b5d4-a1f128e94334)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-nkt76_openshift-operators(79ead8ee-67ba-4831-b5d4-a1f128e94334)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-nkt76_openshift-operators_79ead8ee-67ba-4831-b5d4-a1f128e94334_0(ab47d1d3c53b6eb35e4ba571d13de11e8ae1d29e947373707aff0bc8f5858f5b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-nkt76" podUID="79ead8ee-67ba-4831-b5d4-a1f128e94334"
Mar 13 14:09:44 crc kubenswrapper[4898]: I0313 14:09:44.739933 4898 scope.go:117] "RemoveContainer" containerID="725f30c48676665ebc628a8b35e81161dc13d717e27cad14806022f5ad267e0e"
Mar 13 14:09:45 crc kubenswrapper[4898]: I0313 14:09:45.546019 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6llfs_e521c857-9711-4f68-886f-38b233d7b05b/kube-multus/2.log"
Mar 13 14:09:45 crc kubenswrapper[4898]: I0313 14:09:45.546247 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6llfs" event={"ID":"e521c857-9711-4f68-886f-38b233d7b05b","Type":"ContainerStarted","Data":"df8ecda55c092e017992e19b5f998c9f088fed5b480868d3f462891075b0153f"}
Mar 13 14:09:49 crc kubenswrapper[4898]: I0313 14:09:49.134883 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 14:09:49 crc kubenswrapper[4898]: I0313 14:09:49.135132 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 14:09:51 crc kubenswrapper[4898]: I0313 14:09:51.739726 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz"
Mar 13 14:09:51 crc kubenswrapper[4898]: I0313 14:09:51.740681 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz"
Mar 13 14:09:52 crc kubenswrapper[4898]: I0313 14:09:52.236268 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz"]
Mar 13 14:09:52 crc kubenswrapper[4898]: I0313 14:09:52.599760 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz" event={"ID":"951cfcfc-3a8c-410e-a3f5-f5caa10511f5","Type":"ContainerStarted","Data":"23f045df4f9087c4ff8d6ed017051a37ccdf60108672e0428bb4de83d9dd4bf9"}
Mar 13 14:09:52 crc kubenswrapper[4898]: I0313 14:09:52.738435 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5r9gm"
Mar 13 14:09:52 crc kubenswrapper[4898]: I0313 14:09:52.738524 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz"
Mar 13 14:09:52 crc kubenswrapper[4898]: I0313 14:09:52.738981 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5r9gm"
Mar 13 14:09:52 crc kubenswrapper[4898]: I0313 14:09:52.739312 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz"
Mar 13 14:09:53 crc kubenswrapper[4898]: I0313 14:09:53.036649 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-ljrtz"]
Mar 13 14:09:53 crc kubenswrapper[4898]: W0313 14:09:53.049942 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bfc0332_bb59_42bf_bb70_462efa225c81.slice/crio-3a86f8990e921c42fb38a682be0142db0b8c0896d0e7b7d5889c117b24ba14a4 WatchSource:0}: Error finding container 3a86f8990e921c42fb38a682be0142db0b8c0896d0e7b7d5889c117b24ba14a4: Status 404 returned error can't find the container with id 3a86f8990e921c42fb38a682be0142db0b8c0896d0e7b7d5889c117b24ba14a4
Mar 13 14:09:53 crc kubenswrapper[4898]: I0313 14:09:53.053009 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 13 14:09:53 crc kubenswrapper[4898]: I0313 14:09:53.102613 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-5r9gm"]
Mar 13 14:09:53 crc kubenswrapper[4898]: I0313 14:09:53.656218 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" event={"ID":"3bfc0332-bb59-42bf-bb70-462efa225c81","Type":"ContainerStarted","Data":"3a86f8990e921c42fb38a682be0142db0b8c0896d0e7b7d5889c117b24ba14a4"}
Mar 13 14:09:53 crc kubenswrapper[4898]: I0313 14:09:53.658077 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5r9gm" event={"ID":"30c06063-b926-4f2e-b8d1-8c530cc5b0a9","Type":"ContainerStarted","Data":"04906ee9b15921d94f0048d0f4560ee05f5365765a870aca2723697b270f581b"}
Mar 13 14:09:55 crc kubenswrapper[4898]: I0313 14:09:55.745101 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-nkt76"
Mar 13 14:09:55 crc kubenswrapper[4898]: I0313 14:09:55.745141 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm"
Mar 13 14:09:55 crc kubenswrapper[4898]: I0313 14:09:55.748235 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-nkt76"
Mar 13 14:09:55 crc kubenswrapper[4898]: I0313 14:09:55.748544 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm"
Mar 13 14:10:00 crc kubenswrapper[4898]: I0313 14:10:00.125818 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556850-n2t8z"]
Mar 13 14:10:00 crc kubenswrapper[4898]: I0313 14:10:00.127302 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556850-n2t8z"
Mar 13 14:10:00 crc kubenswrapper[4898]: I0313 14:10:00.129936 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 14:10:00 crc kubenswrapper[4898]: I0313 14:10:00.130025 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps"
Mar 13 14:10:00 crc kubenswrapper[4898]: I0313 14:10:00.133258 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556850-n2t8z"]
Mar 13 14:10:00 crc kubenswrapper[4898]: I0313 14:10:00.135601 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 14:10:00 crc kubenswrapper[4898]: I0313 14:10:00.254993 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6jtd\" (UniqueName: \"kubernetes.io/projected/a05c0334-f9cf-4640-a763-6d77b983193c-kube-api-access-r6jtd\") pod \"auto-csr-approver-29556850-n2t8z\" (UID: \"a05c0334-f9cf-4640-a763-6d77b983193c\") " pod="openshift-infra/auto-csr-approver-29556850-n2t8z"
Mar 13 14:10:00 crc kubenswrapper[4898]: I0313 14:10:00.355842 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6jtd\" (UniqueName: \"kubernetes.io/projected/a05c0334-f9cf-4640-a763-6d77b983193c-kube-api-access-r6jtd\") pod \"auto-csr-approver-29556850-n2t8z\" (UID: \"a05c0334-f9cf-4640-a763-6d77b983193c\") " pod="openshift-infra/auto-csr-approver-29556850-n2t8z"
Mar 13 14:10:00 crc kubenswrapper[4898]: I0313 14:10:00.382712 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6jtd\" (UniqueName: \"kubernetes.io/projected/a05c0334-f9cf-4640-a763-6d77b983193c-kube-api-access-r6jtd\") pod \"auto-csr-approver-29556850-n2t8z\" (UID: \"a05c0334-f9cf-4640-a763-6d77b983193c\") " pod="openshift-infra/auto-csr-approver-29556850-n2t8z"
Mar 13 14:10:00 crc kubenswrapper[4898]: I0313 14:10:00.450950 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556850-n2t8z"
Mar 13 14:10:01 crc kubenswrapper[4898]: I0313 14:10:01.807389 4898 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 13 14:10:01 crc kubenswrapper[4898]: I0313 14:10:01.954474 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-nkt76"]
Mar 13 14:10:01 crc kubenswrapper[4898]: W0313 14:10:01.959970 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79ead8ee_67ba_4831_b5d4_a1f128e94334.slice/crio-b7c5be9a61cc19a00ef13aa21a2c2477aeaa69117d267d31a61ade4241454c29 WatchSource:0}: Error finding container b7c5be9a61cc19a00ef13aa21a2c2477aeaa69117d267d31a61ade4241454c29: Status 404 returned error can't find the container with id b7c5be9a61cc19a00ef13aa21a2c2477aeaa69117d267d31a61ade4241454c29
Mar 13 14:10:01 crc kubenswrapper[4898]: I0313 14:10:01.997984 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm"]
Mar 13 14:10:02 crc kubenswrapper[4898]: W0313 14:10:02.001235 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c190eee_747b_4a45_905c_fa0235080305.slice/crio-96f9dd42ff5b1aebdc6ef0b6db373b3200a06de09fa1013ee3f3af54a01a2ecf WatchSource:0}: Error finding container 96f9dd42ff5b1aebdc6ef0b6db373b3200a06de09fa1013ee3f3af54a01a2ecf: Status 404 returned error can't find the container with id 96f9dd42ff5b1aebdc6ef0b6db373b3200a06de09fa1013ee3f3af54a01a2ecf
Mar 13 14:10:02 crc kubenswrapper[4898]: I0313 14:10:02.099130 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556850-n2t8z"]
Mar 13 14:10:02 crc kubenswrapper[4898]: W0313 14:10:02.107726 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda05c0334_f9cf_4640_a763_6d77b983193c.slice/crio-703960e572c96d130d5564f52c50c0edfa9bd69efc58b5c61ac8e3057d73fac2 WatchSource:0}: Error finding container 703960e572c96d130d5564f52c50c0edfa9bd69efc58b5c61ac8e3057d73fac2: Status 404 returned error can't find the container with id 703960e572c96d130d5564f52c50c0edfa9bd69efc58b5c61ac8e3057d73fac2
Mar 13 14:10:02 crc kubenswrapper[4898]: I0313 14:10:02.736720 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm" event={"ID":"8c190eee-747b-4a45-905c-fa0235080305","Type":"ContainerStarted","Data":"a76e2eb322540774d9f66e1ed73bc4b272395b3d03fa97bedca8cc39eff74e7a"}
Mar 13 14:10:02 crc kubenswrapper[4898]: I0313 14:10:02.737285 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm" event={"ID":"8c190eee-747b-4a45-905c-fa0235080305","Type":"ContainerStarted","Data":"96f9dd42ff5b1aebdc6ef0b6db373b3200a06de09fa1013ee3f3af54a01a2ecf"}
Mar 13 14:10:02 crc kubenswrapper[4898]: I0313 14:10:02.742534 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556850-n2t8z" event={"ID":"a05c0334-f9cf-4640-a763-6d77b983193c","Type":"ContainerStarted","Data":"703960e572c96d130d5564f52c50c0edfa9bd69efc58b5c61ac8e3057d73fac2"}
Mar 13 14:10:02 crc kubenswrapper[4898]: I0313 14:10:02.743737 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-nkt76" event={"ID":"79ead8ee-67ba-4831-b5d4-a1f128e94334","Type":"ContainerStarted","Data":"b7c5be9a61cc19a00ef13aa21a2c2477aeaa69117d267d31a61ade4241454c29"}
Mar 13 14:10:02 crc kubenswrapper[4898]: I0313 14:10:02.744979 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" event={"ID":"3bfc0332-bb59-42bf-bb70-462efa225c81","Type":"ContainerStarted","Data":"00f83bd17f1c6c8f7be365f8c10dd0d1fa4e2714540181da78b69f463649b54d"}
Mar 13 14:10:02 crc kubenswrapper[4898]: I0313 14:10:02.752327 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz" event={"ID":"951cfcfc-3a8c-410e-a3f5-f5caa10511f5","Type":"ContainerStarted","Data":"db50c338f94e0187fbc033a0ecb6512586df23aa14a822181e8a355b65fc1eb4"}
Mar 13 14:10:02 crc kubenswrapper[4898]: I0313 14:10:02.755753 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5r9gm" event={"ID":"30c06063-b926-4f2e-b8d1-8c530cc5b0a9","Type":"ContainerStarted","Data":"636c13a9a9313e60b5edc2b20690895e5af55bdbfefb1b95d3ad2e8d0edc22f7"}
Mar 13 14:10:02 crc kubenswrapper[4898]: I0313 14:10:02.770945 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm" podStartSLOduration=24.770890918 podStartE2EDuration="24.770890918s" podCreationTimestamp="2026-03-13 14:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:10:02.760771693 +0000 UTC m=+837.762359942" watchObservedRunningTime="2026-03-13 14:10:02.770890918 +0000 UTC m=+837.772479177"
Mar 13 14:10:02 crc kubenswrapper[4898]: I0313 14:10:02.799622 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" podStartSLOduration=16.179661151 podStartE2EDuration="24.799592188s" podCreationTimestamp="2026-03-13 14:09:38 +0000 UTC" firstStartedPulling="2026-03-13 14:09:53.052636888 +0000 UTC m=+828.054225127" lastFinishedPulling="2026-03-13 14:10:01.672567925 +0000 UTC m=+836.674156164" observedRunningTime="2026-03-13 14:10:02.786072745 +0000 UTC m=+837.787661004" watchObservedRunningTime="2026-03-13 14:10:02.799592188 +0000 UTC m=+837.801180427"
Mar 13 14:10:02 crc kubenswrapper[4898]: I0313 14:10:02.815696 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-5r9gm" podStartSLOduration=16.335436934 podStartE2EDuration="24.815680279s" podCreationTimestamp="2026-03-13 14:09:38 +0000 UTC" firstStartedPulling="2026-03-13 14:09:53.116619571 +0000 UTC m=+828.118207810" lastFinishedPulling="2026-03-13 14:10:01.596862916 +0000 UTC m=+836.598451155" observedRunningTime="2026-03-13 14:10:02.814100057 +0000 UTC m=+837.815688306" watchObservedRunningTime="2026-03-13 14:10:02.815680279 +0000 UTC m=+837.817268518"
Mar 13 14:10:02 crc kubenswrapper[4898]: I0313 14:10:02.847424 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz" podStartSLOduration=15.500484057 podStartE2EDuration="24.847394768s" podCreationTimestamp="2026-03-13 14:09:38 +0000 UTC" firstStartedPulling="2026-03-13 14:09:52.250489899 +0000 UTC m=+827.252078138" lastFinishedPulling="2026-03-13 14:10:01.59740061 +0000 UTC m=+836.598988849" observedRunningTime="2026-03-13 14:10:02.837371386 +0000 UTC m=+837.838959645" watchObservedRunningTime="2026-03-13 14:10:02.847394768 +0000 UTC m=+837.848983007"
Mar 13 14:10:03 crc kubenswrapper[4898]: I0313 14:10:03.584073 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk"
Mar 13 14:10:03 crc kubenswrapper[4898]: I0313 14:10:03.771043 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz"
Mar 13 14:10:03 crc kubenswrapper[4898]: I0313 14:10:03.772313 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz"
Mar 13 14:10:04 crc kubenswrapper[4898]: I0313 14:10:04.773989 4898 generic.go:334] "Generic (PLEG): container finished" podID="a05c0334-f9cf-4640-a763-6d77b983193c" containerID="b0888e4d135b3b37fbe96fe16a03f870ba37d4188b89aa723dcdb2298a0e4ed8" exitCode=0
Mar 13 14:10:04 crc kubenswrapper[4898]: I0313 14:10:04.774184 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556850-n2t8z" event={"ID":"a05c0334-f9cf-4640-a763-6d77b983193c","Type":"ContainerDied","Data":"b0888e4d135b3b37fbe96fe16a03f870ba37d4188b89aa723dcdb2298a0e4ed8"}
Mar 13 14:10:05 crc kubenswrapper[4898]: I0313 14:10:05.785280 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-nkt76" event={"ID":"79ead8ee-67ba-4831-b5d4-a1f128e94334","Type":"ContainerStarted","Data":"1627f2dbe0116af304a8553ca13f81e420fb301d6d5599ae9403981065700b49"}
Mar 13 14:10:05 crc kubenswrapper[4898]: I0313 14:10:05.811853 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-nkt76" podStartSLOduration=24.572284196 podStartE2EDuration="27.811835316s" podCreationTimestamp="2026-03-13 14:09:38 +0000 UTC" firstStartedPulling="2026-03-13 14:10:01.962210527 +0000 UTC m=+836.963798766" lastFinishedPulling="2026-03-13 14:10:05.201761647 +0000 UTC m=+840.203349886" observedRunningTime="2026-03-13 14:10:05.805678975 +0000 UTC m=+840.807267214" watchObservedRunningTime="2026-03-13 14:10:05.811835316 +0000 UTC m=+840.813423555"
Mar 13 14:10:06 crc kubenswrapper[4898]: I0313 14:10:06.057988 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556850-n2t8z"
Mar 13 14:10:06 crc kubenswrapper[4898]: I0313 14:10:06.158332 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6jtd\" (UniqueName: \"kubernetes.io/projected/a05c0334-f9cf-4640-a763-6d77b983193c-kube-api-access-r6jtd\") pod \"a05c0334-f9cf-4640-a763-6d77b983193c\" (UID: \"a05c0334-f9cf-4640-a763-6d77b983193c\") "
Mar 13 14:10:06 crc kubenswrapper[4898]: I0313 14:10:06.164078 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a05c0334-f9cf-4640-a763-6d77b983193c-kube-api-access-r6jtd" (OuterVolumeSpecName: "kube-api-access-r6jtd") pod "a05c0334-f9cf-4640-a763-6d77b983193c" (UID: "a05c0334-f9cf-4640-a763-6d77b983193c"). InnerVolumeSpecName "kube-api-access-r6jtd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:10:06 crc kubenswrapper[4898]: I0313 14:10:06.260004 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6jtd\" (UniqueName: \"kubernetes.io/projected/a05c0334-f9cf-4640-a763-6d77b983193c-kube-api-access-r6jtd\") on node \"crc\" DevicePath \"\""
Mar 13 14:10:06 crc kubenswrapper[4898]: I0313 14:10:06.793620 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556850-n2t8z" event={"ID":"a05c0334-f9cf-4640-a763-6d77b983193c","Type":"ContainerDied","Data":"703960e572c96d130d5564f52c50c0edfa9bd69efc58b5c61ac8e3057d73fac2"}
Mar 13 14:10:06 crc kubenswrapper[4898]: I0313 14:10:06.793684 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="703960e572c96d130d5564f52c50c0edfa9bd69efc58b5c61ac8e3057d73fac2"
Mar 13 14:10:06 crc kubenswrapper[4898]: I0313 14:10:06.793729 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-nkt76"
Mar 13 14:10:06 crc kubenswrapper[4898]: I0313 14:10:06.793640 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556850-n2t8z"
Mar 13 14:10:07 crc kubenswrapper[4898]: I0313 14:10:07.108221 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556844-q7h28"]
Mar 13 14:10:07 crc kubenswrapper[4898]: I0313 14:10:07.113265 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556844-q7h28"]
Mar 13 14:10:07 crc kubenswrapper[4898]: I0313 14:10:07.748770 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd30282f-65c8-45d8-89f3-c6e2f16662d4" path="/var/lib/kubelet/pods/cd30282f-65c8-45d8-89f3-c6e2f16662d4/volumes"
Mar 13 14:10:10 crc kubenswrapper[4898]: I0313 14:10:10.823029 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-krzxz"]
Mar 13 14:10:10 crc kubenswrapper[4898]: E0313 14:10:10.823844 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a05c0334-f9cf-4640-a763-6d77b983193c" containerName="oc"
Mar 13 14:10:10 crc kubenswrapper[4898]: I0313 14:10:10.823860 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a05c0334-f9cf-4640-a763-6d77b983193c" containerName="oc"
Mar 13 14:10:10 crc kubenswrapper[4898]: I0313 14:10:10.824040 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="a05c0334-f9cf-4640-a763-6d77b983193c" containerName="oc"
Mar 13 14:10:10 crc kubenswrapper[4898]: I0313 14:10:10.824587 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-krzxz"
Mar 13 14:10:10 crc kubenswrapper[4898]: I0313 14:10:10.826663 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Mar 13 14:10:10 crc kubenswrapper[4898]: I0313 14:10:10.827466 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Mar 13 14:10:10 crc kubenswrapper[4898]: I0313 14:10:10.827474 4898 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-7fcw5"
Mar 13 14:10:10 crc kubenswrapper[4898]: I0313 14:10:10.830387 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-krzxz"]
Mar 13 14:10:10 crc kubenswrapper[4898]: I0313 14:10:10.840292 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-fsdjh"]
Mar 13 14:10:10 crc kubenswrapper[4898]: I0313 14:10:10.841383 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-fsdjh"
Mar 13 14:10:10 crc kubenswrapper[4898]: I0313 14:10:10.845235 4898 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-ddxqr"
Mar 13 14:10:10 crc kubenswrapper[4898]: I0313 14:10:10.855964 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-cx9pw"]
Mar 13 14:10:10 crc kubenswrapper[4898]: I0313 14:10:10.856912 4898 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-cx9pw" Mar 13 14:10:10 crc kubenswrapper[4898]: I0313 14:10:10.859646 4898 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-qxnnv" Mar 13 14:10:10 crc kubenswrapper[4898]: I0313 14:10:10.860671 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-fsdjh"] Mar 13 14:10:10 crc kubenswrapper[4898]: I0313 14:10:10.873439 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-cx9pw"] Mar 13 14:10:10 crc kubenswrapper[4898]: I0313 14:10:10.935431 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njm7l\" (UniqueName: \"kubernetes.io/projected/d00b7135-a080-4f0e-a23b-237ab821410f-kube-api-access-njm7l\") pod \"cert-manager-cainjector-cf98fcc89-krzxz\" (UID: \"d00b7135-a080-4f0e-a23b-237ab821410f\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-krzxz" Mar 13 14:10:10 crc kubenswrapper[4898]: I0313 14:10:10.935505 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkl9h\" (UniqueName: \"kubernetes.io/projected/b267a865-1a03-4f37-9d2a-83380d30da1d-kube-api-access-dkl9h\") pod \"cert-manager-858654f9db-fsdjh\" (UID: \"b267a865-1a03-4f37-9d2a-83380d30da1d\") " pod="cert-manager/cert-manager-858654f9db-fsdjh" Mar 13 14:10:10 crc kubenswrapper[4898]: I0313 14:10:10.935749 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rj5r\" (UniqueName: \"kubernetes.io/projected/7c1fa9c0-bb2e-4806-95fd-07fba426bdc8-kube-api-access-6rj5r\") pod \"cert-manager-webhook-687f57d79b-cx9pw\" (UID: \"7c1fa9c0-bb2e-4806-95fd-07fba426bdc8\") " pod="cert-manager/cert-manager-webhook-687f57d79b-cx9pw" Mar 13 14:10:11 crc kubenswrapper[4898]: I0313 14:10:11.038268 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rj5r\" (UniqueName: \"kubernetes.io/projected/7c1fa9c0-bb2e-4806-95fd-07fba426bdc8-kube-api-access-6rj5r\") pod \"cert-manager-webhook-687f57d79b-cx9pw\" (UID: \"7c1fa9c0-bb2e-4806-95fd-07fba426bdc8\") " pod="cert-manager/cert-manager-webhook-687f57d79b-cx9pw" Mar 13 14:10:11 crc kubenswrapper[4898]: I0313 14:10:11.038422 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njm7l\" (UniqueName: \"kubernetes.io/projected/d00b7135-a080-4f0e-a23b-237ab821410f-kube-api-access-njm7l\") pod \"cert-manager-cainjector-cf98fcc89-krzxz\" (UID: \"d00b7135-a080-4f0e-a23b-237ab821410f\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-krzxz" Mar 13 14:10:11 crc kubenswrapper[4898]: I0313 14:10:11.038477 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkl9h\" (UniqueName: \"kubernetes.io/projected/b267a865-1a03-4f37-9d2a-83380d30da1d-kube-api-access-dkl9h\") pod \"cert-manager-858654f9db-fsdjh\" (UID: \"b267a865-1a03-4f37-9d2a-83380d30da1d\") " pod="cert-manager/cert-manager-858654f9db-fsdjh" Mar 13 14:10:11 crc kubenswrapper[4898]: I0313 14:10:11.057985 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njm7l\" (UniqueName: \"kubernetes.io/projected/d00b7135-a080-4f0e-a23b-237ab821410f-kube-api-access-njm7l\") pod \"cert-manager-cainjector-cf98fcc89-krzxz\" (UID: \"d00b7135-a080-4f0e-a23b-237ab821410f\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-krzxz" Mar 13 14:10:11 crc kubenswrapper[4898]: I0313 14:10:11.058859 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rj5r\" (UniqueName: \"kubernetes.io/projected/7c1fa9c0-bb2e-4806-95fd-07fba426bdc8-kube-api-access-6rj5r\") pod \"cert-manager-webhook-687f57d79b-cx9pw\" (UID: \"7c1fa9c0-bb2e-4806-95fd-07fba426bdc8\") " 
pod="cert-manager/cert-manager-webhook-687f57d79b-cx9pw" Mar 13 14:10:11 crc kubenswrapper[4898]: I0313 14:10:11.065025 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkl9h\" (UniqueName: \"kubernetes.io/projected/b267a865-1a03-4f37-9d2a-83380d30da1d-kube-api-access-dkl9h\") pod \"cert-manager-858654f9db-fsdjh\" (UID: \"b267a865-1a03-4f37-9d2a-83380d30da1d\") " pod="cert-manager/cert-manager-858654f9db-fsdjh" Mar 13 14:10:11 crc kubenswrapper[4898]: I0313 14:10:11.141333 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-krzxz" Mar 13 14:10:11 crc kubenswrapper[4898]: I0313 14:10:11.156801 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-fsdjh" Mar 13 14:10:11 crc kubenswrapper[4898]: I0313 14:10:11.169352 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-cx9pw" Mar 13 14:10:11 crc kubenswrapper[4898]: I0313 14:10:11.417124 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-krzxz"] Mar 13 14:10:11 crc kubenswrapper[4898]: I0313 14:10:11.533878 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-cx9pw"] Mar 13 14:10:11 crc kubenswrapper[4898]: W0313 14:10:11.687093 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb267a865_1a03_4f37_9d2a_83380d30da1d.slice/crio-1f1cc2f977f39cfba7be977819af837e3fdc046c2d70bb3689fff2da8ae75f53 WatchSource:0}: Error finding container 1f1cc2f977f39cfba7be977819af837e3fdc046c2d70bb3689fff2da8ae75f53: Status 404 returned error can't find the container with id 1f1cc2f977f39cfba7be977819af837e3fdc046c2d70bb3689fff2da8ae75f53 Mar 13 14:10:11 crc kubenswrapper[4898]: I0313 14:10:11.691266 
4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-fsdjh"] Mar 13 14:10:11 crc kubenswrapper[4898]: I0313 14:10:11.829569 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-fsdjh" event={"ID":"b267a865-1a03-4f37-9d2a-83380d30da1d","Type":"ContainerStarted","Data":"1f1cc2f977f39cfba7be977819af837e3fdc046c2d70bb3689fff2da8ae75f53"} Mar 13 14:10:11 crc kubenswrapper[4898]: I0313 14:10:11.830707 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-cx9pw" event={"ID":"7c1fa9c0-bb2e-4806-95fd-07fba426bdc8","Type":"ContainerStarted","Data":"8585a6e041128cc6364696bdcc7b704ae49c2d814eb51c69f1bee9d7d2485fa0"} Mar 13 14:10:11 crc kubenswrapper[4898]: I0313 14:10:11.831762 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-krzxz" event={"ID":"d00b7135-a080-4f0e-a23b-237ab821410f","Type":"ContainerStarted","Data":"1ccd14bd98922ce946bdcfd118dac94fa47d6c35a30a343750095db805780f83"} Mar 13 14:10:15 crc kubenswrapper[4898]: I0313 14:10:15.871713 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-krzxz" event={"ID":"d00b7135-a080-4f0e-a23b-237ab821410f","Type":"ContainerStarted","Data":"ae5facfb37b80f172262f70a7db74c6be7337536065ad2951208a9ec507e08f9"} Mar 13 14:10:15 crc kubenswrapper[4898]: I0313 14:10:15.873023 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-fsdjh" event={"ID":"b267a865-1a03-4f37-9d2a-83380d30da1d","Type":"ContainerStarted","Data":"bdb365e9cc950e99a6090096237536f256ed40a6c8240745e58c48ae8456e404"} Mar 13 14:10:15 crc kubenswrapper[4898]: I0313 14:10:15.874698 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-cx9pw" 
event={"ID":"7c1fa9c0-bb2e-4806-95fd-07fba426bdc8","Type":"ContainerStarted","Data":"cd3ee2c415f806671eb3ae90112b21e024a77788831d4c74c6503a512654d1e6"} Mar 13 14:10:15 crc kubenswrapper[4898]: I0313 14:10:15.874839 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-cx9pw" Mar 13 14:10:15 crc kubenswrapper[4898]: I0313 14:10:15.909524 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-cx9pw" podStartSLOduration=2.310198539 podStartE2EDuration="5.909494223s" podCreationTimestamp="2026-03-13 14:10:10 +0000 UTC" firstStartedPulling="2026-03-13 14:10:11.546091143 +0000 UTC m=+846.547679392" lastFinishedPulling="2026-03-13 14:10:15.145386827 +0000 UTC m=+850.146975076" observedRunningTime="2026-03-13 14:10:15.907215304 +0000 UTC m=+850.908803563" watchObservedRunningTime="2026-03-13 14:10:15.909494223 +0000 UTC m=+850.911082462" Mar 13 14:10:15 crc kubenswrapper[4898]: I0313 14:10:15.913812 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-krzxz" podStartSLOduration=2.921365227 podStartE2EDuration="5.913797736s" podCreationTimestamp="2026-03-13 14:10:10 +0000 UTC" firstStartedPulling="2026-03-13 14:10:11.431264201 +0000 UTC m=+846.432852440" lastFinishedPulling="2026-03-13 14:10:14.4236967 +0000 UTC m=+849.425284949" observedRunningTime="2026-03-13 14:10:15.886302867 +0000 UTC m=+850.887891126" watchObservedRunningTime="2026-03-13 14:10:15.913797736 +0000 UTC m=+850.915385975" Mar 13 14:10:15 crc kubenswrapper[4898]: I0313 14:10:15.937290 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-fsdjh" podStartSLOduration=2.400737166 podStartE2EDuration="5.937273119s" podCreationTimestamp="2026-03-13 14:10:10 +0000 UTC" firstStartedPulling="2026-03-13 14:10:11.689299477 +0000 UTC m=+846.690887716" 
lastFinishedPulling="2026-03-13 14:10:15.22583543 +0000 UTC m=+850.227423669" observedRunningTime="2026-03-13 14:10:15.935652907 +0000 UTC m=+850.937241146" watchObservedRunningTime="2026-03-13 14:10:15.937273119 +0000 UTC m=+850.938861358" Mar 13 14:10:19 crc kubenswrapper[4898]: I0313 14:10:19.134565 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:10:19 crc kubenswrapper[4898]: I0313 14:10:19.134881 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:10:19 crc kubenswrapper[4898]: I0313 14:10:19.134939 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 14:10:19 crc kubenswrapper[4898]: I0313 14:10:19.135603 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5a348cbe99f8e01e53545f65e722853afafc6c3cafe54ec4136fd0f288299e87"} pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 14:10:19 crc kubenswrapper[4898]: I0313 14:10:19.135673 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" 
containerID="cri-o://5a348cbe99f8e01e53545f65e722853afafc6c3cafe54ec4136fd0f288299e87" gracePeriod=600 Mar 13 14:10:19 crc kubenswrapper[4898]: I0313 14:10:19.342750 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-nkt76" Mar 13 14:10:19 crc kubenswrapper[4898]: I0313 14:10:19.920918 4898 generic.go:334] "Generic (PLEG): container finished" podID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerID="5a348cbe99f8e01e53545f65e722853afafc6c3cafe54ec4136fd0f288299e87" exitCode=0 Mar 13 14:10:19 crc kubenswrapper[4898]: I0313 14:10:19.920968 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerDied","Data":"5a348cbe99f8e01e53545f65e722853afafc6c3cafe54ec4136fd0f288299e87"} Mar 13 14:10:19 crc kubenswrapper[4898]: I0313 14:10:19.921000 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"b58828da596890620679d1e69bfdfd0b7cd0cf06254eed4031e215964351d8c6"} Mar 13 14:10:19 crc kubenswrapper[4898]: I0313 14:10:19.921024 4898 scope.go:117] "RemoveContainer" containerID="87afe240e3b86dba51997a01c599db519fabde9560e41dee3b537bab350f3092" Mar 13 14:10:21 crc kubenswrapper[4898]: I0313 14:10:21.171164 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-cx9pw" Mar 13 14:10:32 crc kubenswrapper[4898]: I0313 14:10:32.382302 4898 scope.go:117] "RemoveContainer" containerID="59d89d033eeab55992b3ec00208c3b5cec577e8de3d6a36471e4a08df49334b0" Mar 13 14:10:42 crc kubenswrapper[4898]: I0313 14:10:42.842501 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls"] Mar 13 
14:10:42 crc kubenswrapper[4898]: I0313 14:10:42.844634 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls" Mar 13 14:10:42 crc kubenswrapper[4898]: I0313 14:10:42.848651 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 13 14:10:42 crc kubenswrapper[4898]: I0313 14:10:42.856007 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls"] Mar 13 14:10:42 crc kubenswrapper[4898]: I0313 14:10:42.900923 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/964a321b-4be6-444e-8c20-3fc586008da7-bundle\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls\" (UID: \"964a321b-4be6-444e-8c20-3fc586008da7\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls" Mar 13 14:10:42 crc kubenswrapper[4898]: I0313 14:10:42.901203 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/964a321b-4be6-444e-8c20-3fc586008da7-util\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls\" (UID: \"964a321b-4be6-444e-8c20-3fc586008da7\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls" Mar 13 14:10:42 crc kubenswrapper[4898]: I0313 14:10:42.901234 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkhfg\" (UniqueName: \"kubernetes.io/projected/964a321b-4be6-444e-8c20-3fc586008da7-kube-api-access-vkhfg\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls\" (UID: \"964a321b-4be6-444e-8c20-3fc586008da7\") " 
pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls" Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.002085 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/964a321b-4be6-444e-8c20-3fc586008da7-bundle\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls\" (UID: \"964a321b-4be6-444e-8c20-3fc586008da7\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls" Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.002154 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/964a321b-4be6-444e-8c20-3fc586008da7-util\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls\" (UID: \"964a321b-4be6-444e-8c20-3fc586008da7\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls" Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.002189 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkhfg\" (UniqueName: \"kubernetes.io/projected/964a321b-4be6-444e-8c20-3fc586008da7-kube-api-access-vkhfg\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls\" (UID: \"964a321b-4be6-444e-8c20-3fc586008da7\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls" Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.002620 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/964a321b-4be6-444e-8c20-3fc586008da7-bundle\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls\" (UID: \"964a321b-4be6-444e-8c20-3fc586008da7\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls" Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.002738 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/964a321b-4be6-444e-8c20-3fc586008da7-util\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls\" (UID: \"964a321b-4be6-444e-8c20-3fc586008da7\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls" Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.021553 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkhfg\" (UniqueName: \"kubernetes.io/projected/964a321b-4be6-444e-8c20-3fc586008da7-kube-api-access-vkhfg\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls\" (UID: \"964a321b-4be6-444e-8c20-3fc586008da7\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls" Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.170248 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls" Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.252703 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q"] Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.254189 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q" Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.263413 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q"] Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.307013 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kmk7\" (UniqueName: \"kubernetes.io/projected/e0f45cf5-8d8f-472b-87f5-64e5c8192622-kube-api-access-5kmk7\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q\" (UID: \"e0f45cf5-8d8f-472b-87f5-64e5c8192622\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q" Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.307332 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0f45cf5-8d8f-472b-87f5-64e5c8192622-util\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q\" (UID: \"e0f45cf5-8d8f-472b-87f5-64e5c8192622\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q" Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.307524 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e0f45cf5-8d8f-472b-87f5-64e5c8192622-bundle\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q\" (UID: \"e0f45cf5-8d8f-472b-87f5-64e5c8192622\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q" Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.408587 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/e0f45cf5-8d8f-472b-87f5-64e5c8192622-bundle\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q\" (UID: \"e0f45cf5-8d8f-472b-87f5-64e5c8192622\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q" Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.408673 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kmk7\" (UniqueName: \"kubernetes.io/projected/e0f45cf5-8d8f-472b-87f5-64e5c8192622-kube-api-access-5kmk7\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q\" (UID: \"e0f45cf5-8d8f-472b-87f5-64e5c8192622\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q" Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.408712 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0f45cf5-8d8f-472b-87f5-64e5c8192622-util\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q\" (UID: \"e0f45cf5-8d8f-472b-87f5-64e5c8192622\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q" Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.409148 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e0f45cf5-8d8f-472b-87f5-64e5c8192622-bundle\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q\" (UID: \"e0f45cf5-8d8f-472b-87f5-64e5c8192622\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q" Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.409173 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0f45cf5-8d8f-472b-87f5-64e5c8192622-util\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q\" (UID: 
\"e0f45cf5-8d8f-472b-87f5-64e5c8192622\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q" Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.426595 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kmk7\" (UniqueName: \"kubernetes.io/projected/e0f45cf5-8d8f-472b-87f5-64e5c8192622-kube-api-access-5kmk7\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q\" (UID: \"e0f45cf5-8d8f-472b-87f5-64e5c8192622\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q" Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.577587 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q" Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.600622 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls"] Mar 13 14:10:43 crc kubenswrapper[4898]: W0313 14:10:43.603965 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod964a321b_4be6_444e_8c20_3fc586008da7.slice/crio-7e17dec2ad4d7267ad062316a6b30662c69ae0cc24b25bd895d7c515a6ce4cd7 WatchSource:0}: Error finding container 7e17dec2ad4d7267ad062316a6b30662c69ae0cc24b25bd895d7c515a6ce4cd7: Status 404 returned error can't find the container with id 7e17dec2ad4d7267ad062316a6b30662c69ae0cc24b25bd895d7c515a6ce4cd7 Mar 13 14:10:43 crc kubenswrapper[4898]: I0313 14:10:43.801175 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q"] Mar 13 14:10:44 crc kubenswrapper[4898]: I0313 14:10:44.111593 4898 generic.go:334] "Generic (PLEG): container finished" podID="964a321b-4be6-444e-8c20-3fc586008da7" 
containerID="f40a97a6c9bc482e22eb5f7a44f7d05df46612ccdb769024461402ccaac5bfc7" exitCode=0 Mar 13 14:10:44 crc kubenswrapper[4898]: I0313 14:10:44.111694 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls" event={"ID":"964a321b-4be6-444e-8c20-3fc586008da7","Type":"ContainerDied","Data":"f40a97a6c9bc482e22eb5f7a44f7d05df46612ccdb769024461402ccaac5bfc7"} Mar 13 14:10:44 crc kubenswrapper[4898]: I0313 14:10:44.111729 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls" event={"ID":"964a321b-4be6-444e-8c20-3fc586008da7","Type":"ContainerStarted","Data":"7e17dec2ad4d7267ad062316a6b30662c69ae0cc24b25bd895d7c515a6ce4cd7"} Mar 13 14:10:44 crc kubenswrapper[4898]: I0313 14:10:44.116214 4898 generic.go:334] "Generic (PLEG): container finished" podID="e0f45cf5-8d8f-472b-87f5-64e5c8192622" containerID="65369742e289128b731e475df8f6329c63c8ba84f236a7ee566b63b276731bf3" exitCode=0 Mar 13 14:10:44 crc kubenswrapper[4898]: I0313 14:10:44.116270 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q" event={"ID":"e0f45cf5-8d8f-472b-87f5-64e5c8192622","Type":"ContainerDied","Data":"65369742e289128b731e475df8f6329c63c8ba84f236a7ee566b63b276731bf3"} Mar 13 14:10:44 crc kubenswrapper[4898]: I0313 14:10:44.116306 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q" event={"ID":"e0f45cf5-8d8f-472b-87f5-64e5c8192622","Type":"ContainerStarted","Data":"5645baba009813ce27646233637e36e5d24b1130e7a1c6181af39f5864ede086"} Mar 13 14:10:46 crc kubenswrapper[4898]: I0313 14:10:46.132268 4898 generic.go:334] "Generic (PLEG): container finished" podID="e0f45cf5-8d8f-472b-87f5-64e5c8192622" 
containerID="36862814fe1f0cff6a24426793b563f39aa7810a1ced2cae484a35cfc03c21ba" exitCode=0 Mar 13 14:10:46 crc kubenswrapper[4898]: I0313 14:10:46.132826 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q" event={"ID":"e0f45cf5-8d8f-472b-87f5-64e5c8192622","Type":"ContainerDied","Data":"36862814fe1f0cff6a24426793b563f39aa7810a1ced2cae484a35cfc03c21ba"} Mar 13 14:10:46 crc kubenswrapper[4898]: I0313 14:10:46.137176 4898 generic.go:334] "Generic (PLEG): container finished" podID="964a321b-4be6-444e-8c20-3fc586008da7" containerID="2be8f772924496c9e9df01ab97a8584fca928e05b67506e23db047d5b801b2d5" exitCode=0 Mar 13 14:10:46 crc kubenswrapper[4898]: I0313 14:10:46.137238 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls" event={"ID":"964a321b-4be6-444e-8c20-3fc586008da7","Type":"ContainerDied","Data":"2be8f772924496c9e9df01ab97a8584fca928e05b67506e23db047d5b801b2d5"} Mar 13 14:10:46 crc kubenswrapper[4898]: I0313 14:10:46.598736 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-22tbp"] Mar 13 14:10:46 crc kubenswrapper[4898]: I0313 14:10:46.601124 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-22tbp" Mar 13 14:10:46 crc kubenswrapper[4898]: I0313 14:10:46.610395 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-22tbp"] Mar 13 14:10:46 crc kubenswrapper[4898]: I0313 14:10:46.655934 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5de1b020-889c-4fb0-b067-cdeb543f0b64-utilities\") pod \"redhat-operators-22tbp\" (UID: \"5de1b020-889c-4fb0-b067-cdeb543f0b64\") " pod="openshift-marketplace/redhat-operators-22tbp" Mar 13 14:10:46 crc kubenswrapper[4898]: I0313 14:10:46.656352 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl22c\" (UniqueName: \"kubernetes.io/projected/5de1b020-889c-4fb0-b067-cdeb543f0b64-kube-api-access-tl22c\") pod \"redhat-operators-22tbp\" (UID: \"5de1b020-889c-4fb0-b067-cdeb543f0b64\") " pod="openshift-marketplace/redhat-operators-22tbp" Mar 13 14:10:46 crc kubenswrapper[4898]: I0313 14:10:46.656511 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5de1b020-889c-4fb0-b067-cdeb543f0b64-catalog-content\") pod \"redhat-operators-22tbp\" (UID: \"5de1b020-889c-4fb0-b067-cdeb543f0b64\") " pod="openshift-marketplace/redhat-operators-22tbp" Mar 13 14:10:46 crc kubenswrapper[4898]: I0313 14:10:46.757666 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl22c\" (UniqueName: \"kubernetes.io/projected/5de1b020-889c-4fb0-b067-cdeb543f0b64-kube-api-access-tl22c\") pod \"redhat-operators-22tbp\" (UID: \"5de1b020-889c-4fb0-b067-cdeb543f0b64\") " pod="openshift-marketplace/redhat-operators-22tbp" Mar 13 14:10:46 crc kubenswrapper[4898]: I0313 14:10:46.757744 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5de1b020-889c-4fb0-b067-cdeb543f0b64-catalog-content\") pod \"redhat-operators-22tbp\" (UID: \"5de1b020-889c-4fb0-b067-cdeb543f0b64\") " pod="openshift-marketplace/redhat-operators-22tbp" Mar 13 14:10:46 crc kubenswrapper[4898]: I0313 14:10:46.757799 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5de1b020-889c-4fb0-b067-cdeb543f0b64-utilities\") pod \"redhat-operators-22tbp\" (UID: \"5de1b020-889c-4fb0-b067-cdeb543f0b64\") " pod="openshift-marketplace/redhat-operators-22tbp" Mar 13 14:10:46 crc kubenswrapper[4898]: I0313 14:10:46.758231 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5de1b020-889c-4fb0-b067-cdeb543f0b64-utilities\") pod \"redhat-operators-22tbp\" (UID: \"5de1b020-889c-4fb0-b067-cdeb543f0b64\") " pod="openshift-marketplace/redhat-operators-22tbp" Mar 13 14:10:46 crc kubenswrapper[4898]: I0313 14:10:46.758416 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5de1b020-889c-4fb0-b067-cdeb543f0b64-catalog-content\") pod \"redhat-operators-22tbp\" (UID: \"5de1b020-889c-4fb0-b067-cdeb543f0b64\") " pod="openshift-marketplace/redhat-operators-22tbp" Mar 13 14:10:46 crc kubenswrapper[4898]: I0313 14:10:46.779463 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl22c\" (UniqueName: \"kubernetes.io/projected/5de1b020-889c-4fb0-b067-cdeb543f0b64-kube-api-access-tl22c\") pod \"redhat-operators-22tbp\" (UID: \"5de1b020-889c-4fb0-b067-cdeb543f0b64\") " pod="openshift-marketplace/redhat-operators-22tbp" Mar 13 14:10:47 crc kubenswrapper[4898]: I0313 14:10:47.003178 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-22tbp" Mar 13 14:10:47 crc kubenswrapper[4898]: I0313 14:10:47.145304 4898 generic.go:334] "Generic (PLEG): container finished" podID="964a321b-4be6-444e-8c20-3fc586008da7" containerID="46895da33872cbf4500006c61e182af881c3f4ab68348bff7bd9f862e9008de2" exitCode=0 Mar 13 14:10:47 crc kubenswrapper[4898]: I0313 14:10:47.145369 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls" event={"ID":"964a321b-4be6-444e-8c20-3fc586008da7","Type":"ContainerDied","Data":"46895da33872cbf4500006c61e182af881c3f4ab68348bff7bd9f862e9008de2"} Mar 13 14:10:47 crc kubenswrapper[4898]: I0313 14:10:47.148941 4898 generic.go:334] "Generic (PLEG): container finished" podID="e0f45cf5-8d8f-472b-87f5-64e5c8192622" containerID="8afda24d2b140242bb216f07c14a1b8110a2b19a062c69dd7a28423dd8517e0f" exitCode=0 Mar 13 14:10:47 crc kubenswrapper[4898]: I0313 14:10:47.149180 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q" event={"ID":"e0f45cf5-8d8f-472b-87f5-64e5c8192622","Type":"ContainerDied","Data":"8afda24d2b140242bb216f07c14a1b8110a2b19a062c69dd7a28423dd8517e0f"} Mar 13 14:10:47 crc kubenswrapper[4898]: I0313 14:10:47.276837 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-22tbp"] Mar 13 14:10:47 crc kubenswrapper[4898]: W0313 14:10:47.281495 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5de1b020_889c_4fb0_b067_cdeb543f0b64.slice/crio-249664907f9e2e0549b46b4ce9e51616428d129694559809cf7f3f144f337489 WatchSource:0}: Error finding container 249664907f9e2e0549b46b4ce9e51616428d129694559809cf7f3f144f337489: Status 404 returned error can't find the container with id 
249664907f9e2e0549b46b4ce9e51616428d129694559809cf7f3f144f337489 Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.155676 4898 generic.go:334] "Generic (PLEG): container finished" podID="5de1b020-889c-4fb0-b067-cdeb543f0b64" containerID="ffa5f5636f6ad5a5d8c0a7e72c6f19e3edb102f03e8614a36057cf6da4b4aead" exitCode=0 Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.156650 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-22tbp" event={"ID":"5de1b020-889c-4fb0-b067-cdeb543f0b64","Type":"ContainerDied","Data":"ffa5f5636f6ad5a5d8c0a7e72c6f19e3edb102f03e8614a36057cf6da4b4aead"} Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.156673 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-22tbp" event={"ID":"5de1b020-889c-4fb0-b067-cdeb543f0b64","Type":"ContainerStarted","Data":"249664907f9e2e0549b46b4ce9e51616428d129694559809cf7f3f144f337489"} Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.467576 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q" Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.472733 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls" Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.486186 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0f45cf5-8d8f-472b-87f5-64e5c8192622-util\") pod \"e0f45cf5-8d8f-472b-87f5-64e5c8192622\" (UID: \"e0f45cf5-8d8f-472b-87f5-64e5c8192622\") " Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.486243 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/964a321b-4be6-444e-8c20-3fc586008da7-util\") pod \"964a321b-4be6-444e-8c20-3fc586008da7\" (UID: \"964a321b-4be6-444e-8c20-3fc586008da7\") " Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.486289 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kmk7\" (UniqueName: \"kubernetes.io/projected/e0f45cf5-8d8f-472b-87f5-64e5c8192622-kube-api-access-5kmk7\") pod \"e0f45cf5-8d8f-472b-87f5-64e5c8192622\" (UID: \"e0f45cf5-8d8f-472b-87f5-64e5c8192622\") " Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.486355 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/964a321b-4be6-444e-8c20-3fc586008da7-bundle\") pod \"964a321b-4be6-444e-8c20-3fc586008da7\" (UID: \"964a321b-4be6-444e-8c20-3fc586008da7\") " Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.486387 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkhfg\" (UniqueName: \"kubernetes.io/projected/964a321b-4be6-444e-8c20-3fc586008da7-kube-api-access-vkhfg\") pod \"964a321b-4be6-444e-8c20-3fc586008da7\" (UID: \"964a321b-4be6-444e-8c20-3fc586008da7\") " Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.486409 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e0f45cf5-8d8f-472b-87f5-64e5c8192622-bundle\") pod \"e0f45cf5-8d8f-472b-87f5-64e5c8192622\" (UID: \"e0f45cf5-8d8f-472b-87f5-64e5c8192622\") " Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.487337 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/964a321b-4be6-444e-8c20-3fc586008da7-bundle" (OuterVolumeSpecName: "bundle") pod "964a321b-4be6-444e-8c20-3fc586008da7" (UID: "964a321b-4be6-444e-8c20-3fc586008da7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.488558 4898 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/964a321b-4be6-444e-8c20-3fc586008da7-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.489215 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0f45cf5-8d8f-472b-87f5-64e5c8192622-bundle" (OuterVolumeSpecName: "bundle") pod "e0f45cf5-8d8f-472b-87f5-64e5c8192622" (UID: "e0f45cf5-8d8f-472b-87f5-64e5c8192622"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.495359 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0f45cf5-8d8f-472b-87f5-64e5c8192622-kube-api-access-5kmk7" (OuterVolumeSpecName: "kube-api-access-5kmk7") pod "e0f45cf5-8d8f-472b-87f5-64e5c8192622" (UID: "e0f45cf5-8d8f-472b-87f5-64e5c8192622"). InnerVolumeSpecName "kube-api-access-5kmk7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.502445 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/964a321b-4be6-444e-8c20-3fc586008da7-util" (OuterVolumeSpecName: "util") pod "964a321b-4be6-444e-8c20-3fc586008da7" (UID: "964a321b-4be6-444e-8c20-3fc586008da7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.506265 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0f45cf5-8d8f-472b-87f5-64e5c8192622-util" (OuterVolumeSpecName: "util") pod "e0f45cf5-8d8f-472b-87f5-64e5c8192622" (UID: "e0f45cf5-8d8f-472b-87f5-64e5c8192622"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.542163 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/964a321b-4be6-444e-8c20-3fc586008da7-kube-api-access-vkhfg" (OuterVolumeSpecName: "kube-api-access-vkhfg") pod "964a321b-4be6-444e-8c20-3fc586008da7" (UID: "964a321b-4be6-444e-8c20-3fc586008da7"). InnerVolumeSpecName "kube-api-access-vkhfg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.590433 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkhfg\" (UniqueName: \"kubernetes.io/projected/964a321b-4be6-444e-8c20-3fc586008da7-kube-api-access-vkhfg\") on node \"crc\" DevicePath \"\"" Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.590680 4898 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e0f45cf5-8d8f-472b-87f5-64e5c8192622-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.590743 4898 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0f45cf5-8d8f-472b-87f5-64e5c8192622-util\") on node \"crc\" DevicePath \"\"" Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.590801 4898 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/964a321b-4be6-444e-8c20-3fc586008da7-util\") on node \"crc\" DevicePath \"\"" Mar 13 14:10:48 crc kubenswrapper[4898]: I0313 14:10:48.590857 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kmk7\" (UniqueName: \"kubernetes.io/projected/e0f45cf5-8d8f-472b-87f5-64e5c8192622-kube-api-access-5kmk7\") on node \"crc\" DevicePath \"\"" Mar 13 14:10:49 crc kubenswrapper[4898]: I0313 14:10:49.165972 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls" Mar 13 14:10:49 crc kubenswrapper[4898]: I0313 14:10:49.165993 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls" event={"ID":"964a321b-4be6-444e-8c20-3fc586008da7","Type":"ContainerDied","Data":"7e17dec2ad4d7267ad062316a6b30662c69ae0cc24b25bd895d7c515a6ce4cd7"} Mar 13 14:10:49 crc kubenswrapper[4898]: I0313 14:10:49.166046 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e17dec2ad4d7267ad062316a6b30662c69ae0cc24b25bd895d7c515a6ce4cd7" Mar 13 14:10:49 crc kubenswrapper[4898]: I0313 14:10:49.168097 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q" event={"ID":"e0f45cf5-8d8f-472b-87f5-64e5c8192622","Type":"ContainerDied","Data":"5645baba009813ce27646233637e36e5d24b1130e7a1c6181af39f5864ede086"} Mar 13 14:10:49 crc kubenswrapper[4898]: I0313 14:10:49.168131 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5645baba009813ce27646233637e36e5d24b1130e7a1c6181af39f5864ede086" Mar 13 14:10:49 crc kubenswrapper[4898]: I0313 14:10:49.168211 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q" Mar 13 14:10:50 crc kubenswrapper[4898]: I0313 14:10:50.175834 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-22tbp" event={"ID":"5de1b020-889c-4fb0-b067-cdeb543f0b64","Type":"ContainerStarted","Data":"8a662e914e1ef5840c7d25299683174356269f87fbe4400915a9153e947174c7"} Mar 13 14:10:51 crc kubenswrapper[4898]: I0313 14:10:51.197282 4898 generic.go:334] "Generic (PLEG): container finished" podID="5de1b020-889c-4fb0-b067-cdeb543f0b64" containerID="8a662e914e1ef5840c7d25299683174356269f87fbe4400915a9153e947174c7" exitCode=0 Mar 13 14:10:51 crc kubenswrapper[4898]: I0313 14:10:51.197350 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-22tbp" event={"ID":"5de1b020-889c-4fb0-b067-cdeb543f0b64","Type":"ContainerDied","Data":"8a662e914e1ef5840c7d25299683174356269f87fbe4400915a9153e947174c7"} Mar 13 14:10:53 crc kubenswrapper[4898]: I0313 14:10:53.101755 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/cluster-logging-operator-66689c4bbf-bcn59"] Mar 13 14:10:53 crc kubenswrapper[4898]: E0313 14:10:53.102602 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="964a321b-4be6-444e-8c20-3fc586008da7" containerName="util" Mar 13 14:10:53 crc kubenswrapper[4898]: I0313 14:10:53.102618 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="964a321b-4be6-444e-8c20-3fc586008da7" containerName="util" Mar 13 14:10:53 crc kubenswrapper[4898]: E0313 14:10:53.102631 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="964a321b-4be6-444e-8c20-3fc586008da7" containerName="extract" Mar 13 14:10:53 crc kubenswrapper[4898]: I0313 14:10:53.102639 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="964a321b-4be6-444e-8c20-3fc586008da7" containerName="extract" Mar 13 14:10:53 crc kubenswrapper[4898]: E0313 
14:10:53.102653 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0f45cf5-8d8f-472b-87f5-64e5c8192622" containerName="extract" Mar 13 14:10:53 crc kubenswrapper[4898]: I0313 14:10:53.102661 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0f45cf5-8d8f-472b-87f5-64e5c8192622" containerName="extract" Mar 13 14:10:53 crc kubenswrapper[4898]: E0313 14:10:53.102678 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="964a321b-4be6-444e-8c20-3fc586008da7" containerName="pull" Mar 13 14:10:53 crc kubenswrapper[4898]: I0313 14:10:53.102685 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="964a321b-4be6-444e-8c20-3fc586008da7" containerName="pull" Mar 13 14:10:53 crc kubenswrapper[4898]: E0313 14:10:53.102695 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0f45cf5-8d8f-472b-87f5-64e5c8192622" containerName="util" Mar 13 14:10:53 crc kubenswrapper[4898]: I0313 14:10:53.102703 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0f45cf5-8d8f-472b-87f5-64e5c8192622" containerName="util" Mar 13 14:10:53 crc kubenswrapper[4898]: E0313 14:10:53.102716 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0f45cf5-8d8f-472b-87f5-64e5c8192622" containerName="pull" Mar 13 14:10:53 crc kubenswrapper[4898]: I0313 14:10:53.102725 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0f45cf5-8d8f-472b-87f5-64e5c8192622" containerName="pull" Mar 13 14:10:53 crc kubenswrapper[4898]: I0313 14:10:53.102875 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="964a321b-4be6-444e-8c20-3fc586008da7" containerName="extract" Mar 13 14:10:53 crc kubenswrapper[4898]: I0313 14:10:53.102913 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0f45cf5-8d8f-472b-87f5-64e5c8192622" containerName="extract" Mar 13 14:10:53 crc kubenswrapper[4898]: I0313 14:10:53.103460 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/cluster-logging-operator-66689c4bbf-bcn59" Mar 13 14:10:53 crc kubenswrapper[4898]: I0313 14:10:53.105177 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"cluster-logging-operator-dockercfg-t77xd" Mar 13 14:10:53 crc kubenswrapper[4898]: I0313 14:10:53.105785 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"openshift-service-ca.crt" Mar 13 14:10:53 crc kubenswrapper[4898]: I0313 14:10:53.107607 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"kube-root-ca.crt" Mar 13 14:10:53 crc kubenswrapper[4898]: I0313 14:10:53.122275 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-66689c4bbf-bcn59"] Mar 13 14:10:53 crc kubenswrapper[4898]: I0313 14:10:53.160551 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-559mq\" (UniqueName: \"kubernetes.io/projected/c5cfd1be-ede5-4678-99c5-17f232b97d81-kube-api-access-559mq\") pod \"cluster-logging-operator-66689c4bbf-bcn59\" (UID: \"c5cfd1be-ede5-4678-99c5-17f232b97d81\") " pod="openshift-logging/cluster-logging-operator-66689c4bbf-bcn59" Mar 13 14:10:53 crc kubenswrapper[4898]: I0313 14:10:53.214884 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-22tbp" event={"ID":"5de1b020-889c-4fb0-b067-cdeb543f0b64","Type":"ContainerStarted","Data":"4615e570e10fb8e73d204f621789c7f2db42ba41d4ea79361fccac7790ba3df4"} Mar 13 14:10:53 crc kubenswrapper[4898]: I0313 14:10:53.246262 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-22tbp" podStartSLOduration=3.443707631 podStartE2EDuration="7.246239269s" podCreationTimestamp="2026-03-13 14:10:46 +0000 UTC" firstStartedPulling="2026-03-13 14:10:48.157726282 +0000 UTC m=+883.159314521" 
lastFinishedPulling="2026-03-13 14:10:51.96025791 +0000 UTC m=+886.961846159" observedRunningTime="2026-03-13 14:10:53.241828564 +0000 UTC m=+888.243416823" watchObservedRunningTime="2026-03-13 14:10:53.246239269 +0000 UTC m=+888.247827508" Mar 13 14:10:53 crc kubenswrapper[4898]: I0313 14:10:53.261328 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-559mq\" (UniqueName: \"kubernetes.io/projected/c5cfd1be-ede5-4678-99c5-17f232b97d81-kube-api-access-559mq\") pod \"cluster-logging-operator-66689c4bbf-bcn59\" (UID: \"c5cfd1be-ede5-4678-99c5-17f232b97d81\") " pod="openshift-logging/cluster-logging-operator-66689c4bbf-bcn59" Mar 13 14:10:53 crc kubenswrapper[4898]: I0313 14:10:53.279427 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-559mq\" (UniqueName: \"kubernetes.io/projected/c5cfd1be-ede5-4678-99c5-17f232b97d81-kube-api-access-559mq\") pod \"cluster-logging-operator-66689c4bbf-bcn59\" (UID: \"c5cfd1be-ede5-4678-99c5-17f232b97d81\") " pod="openshift-logging/cluster-logging-operator-66689c4bbf-bcn59" Mar 13 14:10:53 crc kubenswrapper[4898]: I0313 14:10:53.426726 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/cluster-logging-operator-66689c4bbf-bcn59" Mar 13 14:10:53 crc kubenswrapper[4898]: I0313 14:10:53.830499 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-66689c4bbf-bcn59"] Mar 13 14:10:53 crc kubenswrapper[4898]: W0313 14:10:53.835490 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5cfd1be_ede5_4678_99c5_17f232b97d81.slice/crio-e11b7a347d9bfe96231e8aa73f69aa352ae98df446aa55c531163226ed42dfca WatchSource:0}: Error finding container e11b7a347d9bfe96231e8aa73f69aa352ae98df446aa55c531163226ed42dfca: Status 404 returned error can't find the container with id e11b7a347d9bfe96231e8aa73f69aa352ae98df446aa55c531163226ed42dfca Mar 13 14:10:54 crc kubenswrapper[4898]: I0313 14:10:54.221627 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-66689c4bbf-bcn59" event={"ID":"c5cfd1be-ede5-4678-99c5-17f232b97d81","Type":"ContainerStarted","Data":"e11b7a347d9bfe96231e8aa73f69aa352ae98df446aa55c531163226ed42dfca"} Mar 13 14:10:57 crc kubenswrapper[4898]: I0313 14:10:57.003527 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-22tbp" Mar 13 14:10:57 crc kubenswrapper[4898]: I0313 14:10:57.004036 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-22tbp" Mar 13 14:10:58 crc kubenswrapper[4898]: I0313 14:10:58.047165 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-22tbp" podUID="5de1b020-889c-4fb0-b067-cdeb543f0b64" containerName="registry-server" probeResult="failure" output=< Mar 13 14:10:58 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 14:10:58 crc kubenswrapper[4898]: > Mar 13 14:11:02 crc kubenswrapper[4898]: I0313 
14:11:02.788147 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8"] Mar 13 14:11:02 crc kubenswrapper[4898]: I0313 14:11:02.815287 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8" Mar 13 14:11:02 crc kubenswrapper[4898]: I0313 14:11:02.820304 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-45rjv" Mar 13 14:11:02 crc kubenswrapper[4898]: I0313 14:11:02.820485 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Mar 13 14:11:02 crc kubenswrapper[4898]: I0313 14:11:02.820588 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Mar 13 14:11:02 crc kubenswrapper[4898]: I0313 14:11:02.820748 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Mar 13 14:11:02 crc kubenswrapper[4898]: I0313 14:11:02.820856 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Mar 13 14:11:02 crc kubenswrapper[4898]: I0313 14:11:02.820951 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Mar 13 14:11:02 crc kubenswrapper[4898]: I0313 14:11:02.827680 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8"] Mar 13 14:11:02 crc kubenswrapper[4898]: I0313 14:11:02.927234 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: 
\"kubernetes.io/configmap/2cd05b5b-32da-4560-a761-72221b99e2c6-manager-config\") pod \"loki-operator-controller-manager-5fb555ff84-j52b8\" (UID: \"2cd05b5b-32da-4560-a761-72221b99e2c6\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8" Mar 13 14:11:02 crc kubenswrapper[4898]: I0313 14:11:02.927304 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2cd05b5b-32da-4560-a761-72221b99e2c6-apiservice-cert\") pod \"loki-operator-controller-manager-5fb555ff84-j52b8\" (UID: \"2cd05b5b-32da-4560-a761-72221b99e2c6\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8" Mar 13 14:11:02 crc kubenswrapper[4898]: I0313 14:11:02.927335 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvklx\" (UniqueName: \"kubernetes.io/projected/2cd05b5b-32da-4560-a761-72221b99e2c6-kube-api-access-vvklx\") pod \"loki-operator-controller-manager-5fb555ff84-j52b8\" (UID: \"2cd05b5b-32da-4560-a761-72221b99e2c6\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8" Mar 13 14:11:02 crc kubenswrapper[4898]: I0313 14:11:02.927377 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2cd05b5b-32da-4560-a761-72221b99e2c6-webhook-cert\") pod \"loki-operator-controller-manager-5fb555ff84-j52b8\" (UID: \"2cd05b5b-32da-4560-a761-72221b99e2c6\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8" Mar 13 14:11:02 crc kubenswrapper[4898]: I0313 14:11:02.927422 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2cd05b5b-32da-4560-a761-72221b99e2c6-loki-operator-metrics-cert\") pod 
\"loki-operator-controller-manager-5fb555ff84-j52b8\" (UID: \"2cd05b5b-32da-4560-a761-72221b99e2c6\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8" Mar 13 14:11:03 crc kubenswrapper[4898]: I0313 14:11:03.028586 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/2cd05b5b-32da-4560-a761-72221b99e2c6-manager-config\") pod \"loki-operator-controller-manager-5fb555ff84-j52b8\" (UID: \"2cd05b5b-32da-4560-a761-72221b99e2c6\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8" Mar 13 14:11:03 crc kubenswrapper[4898]: I0313 14:11:03.028649 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2cd05b5b-32da-4560-a761-72221b99e2c6-apiservice-cert\") pod \"loki-operator-controller-manager-5fb555ff84-j52b8\" (UID: \"2cd05b5b-32da-4560-a761-72221b99e2c6\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8" Mar 13 14:11:03 crc kubenswrapper[4898]: I0313 14:11:03.028679 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvklx\" (UniqueName: \"kubernetes.io/projected/2cd05b5b-32da-4560-a761-72221b99e2c6-kube-api-access-vvklx\") pod \"loki-operator-controller-manager-5fb555ff84-j52b8\" (UID: \"2cd05b5b-32da-4560-a761-72221b99e2c6\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8" Mar 13 14:11:03 crc kubenswrapper[4898]: I0313 14:11:03.028727 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2cd05b5b-32da-4560-a761-72221b99e2c6-webhook-cert\") pod \"loki-operator-controller-manager-5fb555ff84-j52b8\" (UID: \"2cd05b5b-32da-4560-a761-72221b99e2c6\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8" Mar 13 
14:11:03 crc kubenswrapper[4898]: I0313 14:11:03.028772 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2cd05b5b-32da-4560-a761-72221b99e2c6-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-5fb555ff84-j52b8\" (UID: \"2cd05b5b-32da-4560-a761-72221b99e2c6\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8" Mar 13 14:11:03 crc kubenswrapper[4898]: I0313 14:11:03.029849 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/2cd05b5b-32da-4560-a761-72221b99e2c6-manager-config\") pod \"loki-operator-controller-manager-5fb555ff84-j52b8\" (UID: \"2cd05b5b-32da-4560-a761-72221b99e2c6\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8" Mar 13 14:11:03 crc kubenswrapper[4898]: I0313 14:11:03.035161 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2cd05b5b-32da-4560-a761-72221b99e2c6-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-5fb555ff84-j52b8\" (UID: \"2cd05b5b-32da-4560-a761-72221b99e2c6\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8" Mar 13 14:11:03 crc kubenswrapper[4898]: I0313 14:11:03.035962 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2cd05b5b-32da-4560-a761-72221b99e2c6-webhook-cert\") pod \"loki-operator-controller-manager-5fb555ff84-j52b8\" (UID: \"2cd05b5b-32da-4560-a761-72221b99e2c6\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8" Mar 13 14:11:03 crc kubenswrapper[4898]: I0313 14:11:03.036415 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/2cd05b5b-32da-4560-a761-72221b99e2c6-apiservice-cert\") pod \"loki-operator-controller-manager-5fb555ff84-j52b8\" (UID: \"2cd05b5b-32da-4560-a761-72221b99e2c6\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8" Mar 13 14:11:03 crc kubenswrapper[4898]: I0313 14:11:03.061547 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvklx\" (UniqueName: \"kubernetes.io/projected/2cd05b5b-32da-4560-a761-72221b99e2c6-kube-api-access-vvklx\") pod \"loki-operator-controller-manager-5fb555ff84-j52b8\" (UID: \"2cd05b5b-32da-4560-a761-72221b99e2c6\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8" Mar 13 14:11:03 crc kubenswrapper[4898]: I0313 14:11:03.133790 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8" Mar 13 14:11:03 crc kubenswrapper[4898]: I0313 14:11:03.592655 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8"] Mar 13 14:11:03 crc kubenswrapper[4898]: W0313 14:11:03.612206 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2cd05b5b_32da_4560_a761_72221b99e2c6.slice/crio-6b2feeaff613f0c9ee548ef7838db98cfa828fbf888348fd720a846c579b54e0 WatchSource:0}: Error finding container 6b2feeaff613f0c9ee548ef7838db98cfa828fbf888348fd720a846c579b54e0: Status 404 returned error can't find the container with id 6b2feeaff613f0c9ee548ef7838db98cfa828fbf888348fd720a846c579b54e0 Mar 13 14:11:04 crc kubenswrapper[4898]: I0313 14:11:04.300273 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8" 
event={"ID":"2cd05b5b-32da-4560-a761-72221b99e2c6","Type":"ContainerStarted","Data":"6b2feeaff613f0c9ee548ef7838db98cfa828fbf888348fd720a846c579b54e0"} Mar 13 14:11:07 crc kubenswrapper[4898]: I0313 14:11:07.053950 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-22tbp" Mar 13 14:11:07 crc kubenswrapper[4898]: I0313 14:11:07.099813 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-22tbp" Mar 13 14:11:09 crc kubenswrapper[4898]: I0313 14:11:09.386821 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-22tbp"] Mar 13 14:11:09 crc kubenswrapper[4898]: I0313 14:11:09.387293 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-22tbp" podUID="5de1b020-889c-4fb0-b067-cdeb543f0b64" containerName="registry-server" containerID="cri-o://4615e570e10fb8e73d204f621789c7f2db42ba41d4ea79361fccac7790ba3df4" gracePeriod=2 Mar 13 14:11:09 crc kubenswrapper[4898]: I0313 14:11:09.855008 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-22tbp" Mar 13 14:11:09 crc kubenswrapper[4898]: I0313 14:11:09.931760 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5de1b020-889c-4fb0-b067-cdeb543f0b64-utilities\") pod \"5de1b020-889c-4fb0-b067-cdeb543f0b64\" (UID: \"5de1b020-889c-4fb0-b067-cdeb543f0b64\") " Mar 13 14:11:09 crc kubenswrapper[4898]: I0313 14:11:09.931941 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tl22c\" (UniqueName: \"kubernetes.io/projected/5de1b020-889c-4fb0-b067-cdeb543f0b64-kube-api-access-tl22c\") pod \"5de1b020-889c-4fb0-b067-cdeb543f0b64\" (UID: \"5de1b020-889c-4fb0-b067-cdeb543f0b64\") " Mar 13 14:11:09 crc kubenswrapper[4898]: I0313 14:11:09.932015 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5de1b020-889c-4fb0-b067-cdeb543f0b64-catalog-content\") pod \"5de1b020-889c-4fb0-b067-cdeb543f0b64\" (UID: \"5de1b020-889c-4fb0-b067-cdeb543f0b64\") " Mar 13 14:11:09 crc kubenswrapper[4898]: I0313 14:11:09.932718 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5de1b020-889c-4fb0-b067-cdeb543f0b64-utilities" (OuterVolumeSpecName: "utilities") pod "5de1b020-889c-4fb0-b067-cdeb543f0b64" (UID: "5de1b020-889c-4fb0-b067-cdeb543f0b64"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:11:09 crc kubenswrapper[4898]: I0313 14:11:09.945754 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5de1b020-889c-4fb0-b067-cdeb543f0b64-kube-api-access-tl22c" (OuterVolumeSpecName: "kube-api-access-tl22c") pod "5de1b020-889c-4fb0-b067-cdeb543f0b64" (UID: "5de1b020-889c-4fb0-b067-cdeb543f0b64"). InnerVolumeSpecName "kube-api-access-tl22c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:11:10 crc kubenswrapper[4898]: I0313 14:11:10.034160 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tl22c\" (UniqueName: \"kubernetes.io/projected/5de1b020-889c-4fb0-b067-cdeb543f0b64-kube-api-access-tl22c\") on node \"crc\" DevicePath \"\"" Mar 13 14:11:10 crc kubenswrapper[4898]: I0313 14:11:10.034198 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5de1b020-889c-4fb0-b067-cdeb543f0b64-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 14:11:10 crc kubenswrapper[4898]: I0313 14:11:10.089747 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5de1b020-889c-4fb0-b067-cdeb543f0b64-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5de1b020-889c-4fb0-b067-cdeb543f0b64" (UID: "5de1b020-889c-4fb0-b067-cdeb543f0b64"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:11:10 crc kubenswrapper[4898]: I0313 14:11:10.134932 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5de1b020-889c-4fb0-b067-cdeb543f0b64-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 14:11:10 crc kubenswrapper[4898]: I0313 14:11:10.351445 4898 generic.go:334] "Generic (PLEG): container finished" podID="5de1b020-889c-4fb0-b067-cdeb543f0b64" containerID="4615e570e10fb8e73d204f621789c7f2db42ba41d4ea79361fccac7790ba3df4" exitCode=0 Mar 13 14:11:10 crc kubenswrapper[4898]: I0313 14:11:10.351523 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-22tbp" Mar 13 14:11:10 crc kubenswrapper[4898]: I0313 14:11:10.351510 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-22tbp" event={"ID":"5de1b020-889c-4fb0-b067-cdeb543f0b64","Type":"ContainerDied","Data":"4615e570e10fb8e73d204f621789c7f2db42ba41d4ea79361fccac7790ba3df4"} Mar 13 14:11:10 crc kubenswrapper[4898]: I0313 14:11:10.351961 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-22tbp" event={"ID":"5de1b020-889c-4fb0-b067-cdeb543f0b64","Type":"ContainerDied","Data":"249664907f9e2e0549b46b4ce9e51616428d129694559809cf7f3f144f337489"} Mar 13 14:11:10 crc kubenswrapper[4898]: I0313 14:11:10.351988 4898 scope.go:117] "RemoveContainer" containerID="4615e570e10fb8e73d204f621789c7f2db42ba41d4ea79361fccac7790ba3df4" Mar 13 14:11:10 crc kubenswrapper[4898]: I0313 14:11:10.358472 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-66689c4bbf-bcn59" event={"ID":"c5cfd1be-ede5-4678-99c5-17f232b97d81","Type":"ContainerStarted","Data":"1c9aab7f14f3dfc6bc9ae40ad727b38c16bd058db7a4f49a41298a530cfaec69"} Mar 13 14:11:10 crc kubenswrapper[4898]: I0313 14:11:10.390010 4898 scope.go:117] "RemoveContainer" containerID="8a662e914e1ef5840c7d25299683174356269f87fbe4400915a9153e947174c7" Mar 13 14:11:10 crc kubenswrapper[4898]: I0313 14:11:10.418785 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/cluster-logging-operator-66689c4bbf-bcn59" podStartSLOduration=1.766979964 podStartE2EDuration="17.418764282s" podCreationTimestamp="2026-03-13 14:10:53 +0000 UTC" firstStartedPulling="2026-03-13 14:10:53.839092447 +0000 UTC m=+888.840680686" lastFinishedPulling="2026-03-13 14:11:09.490876765 +0000 UTC m=+904.492465004" observedRunningTime="2026-03-13 14:11:10.396333796 +0000 UTC m=+905.397922045" 
watchObservedRunningTime="2026-03-13 14:11:10.418764282 +0000 UTC m=+905.420352541" Mar 13 14:11:10 crc kubenswrapper[4898]: I0313 14:11:10.422256 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-22tbp"] Mar 13 14:11:10 crc kubenswrapper[4898]: I0313 14:11:10.440715 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-22tbp"] Mar 13 14:11:10 crc kubenswrapper[4898]: I0313 14:11:10.453218 4898 scope.go:117] "RemoveContainer" containerID="ffa5f5636f6ad5a5d8c0a7e72c6f19e3edb102f03e8614a36057cf6da4b4aead" Mar 13 14:11:10 crc kubenswrapper[4898]: I0313 14:11:10.507182 4898 scope.go:117] "RemoveContainer" containerID="4615e570e10fb8e73d204f621789c7f2db42ba41d4ea79361fccac7790ba3df4" Mar 13 14:11:10 crc kubenswrapper[4898]: E0313 14:11:10.510806 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4615e570e10fb8e73d204f621789c7f2db42ba41d4ea79361fccac7790ba3df4\": container with ID starting with 4615e570e10fb8e73d204f621789c7f2db42ba41d4ea79361fccac7790ba3df4 not found: ID does not exist" containerID="4615e570e10fb8e73d204f621789c7f2db42ba41d4ea79361fccac7790ba3df4" Mar 13 14:11:10 crc kubenswrapper[4898]: I0313 14:11:10.510846 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4615e570e10fb8e73d204f621789c7f2db42ba41d4ea79361fccac7790ba3df4"} err="failed to get container status \"4615e570e10fb8e73d204f621789c7f2db42ba41d4ea79361fccac7790ba3df4\": rpc error: code = NotFound desc = could not find container \"4615e570e10fb8e73d204f621789c7f2db42ba41d4ea79361fccac7790ba3df4\": container with ID starting with 4615e570e10fb8e73d204f621789c7f2db42ba41d4ea79361fccac7790ba3df4 not found: ID does not exist" Mar 13 14:11:10 crc kubenswrapper[4898]: I0313 14:11:10.510867 4898 scope.go:117] "RemoveContainer" 
containerID="8a662e914e1ef5840c7d25299683174356269f87fbe4400915a9153e947174c7" Mar 13 14:11:10 crc kubenswrapper[4898]: E0313 14:11:10.511666 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a662e914e1ef5840c7d25299683174356269f87fbe4400915a9153e947174c7\": container with ID starting with 8a662e914e1ef5840c7d25299683174356269f87fbe4400915a9153e947174c7 not found: ID does not exist" containerID="8a662e914e1ef5840c7d25299683174356269f87fbe4400915a9153e947174c7" Mar 13 14:11:10 crc kubenswrapper[4898]: I0313 14:11:10.511713 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a662e914e1ef5840c7d25299683174356269f87fbe4400915a9153e947174c7"} err="failed to get container status \"8a662e914e1ef5840c7d25299683174356269f87fbe4400915a9153e947174c7\": rpc error: code = NotFound desc = could not find container \"8a662e914e1ef5840c7d25299683174356269f87fbe4400915a9153e947174c7\": container with ID starting with 8a662e914e1ef5840c7d25299683174356269f87fbe4400915a9153e947174c7 not found: ID does not exist" Mar 13 14:11:10 crc kubenswrapper[4898]: I0313 14:11:10.511758 4898 scope.go:117] "RemoveContainer" containerID="ffa5f5636f6ad5a5d8c0a7e72c6f19e3edb102f03e8614a36057cf6da4b4aead" Mar 13 14:11:10 crc kubenswrapper[4898]: E0313 14:11:10.513847 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffa5f5636f6ad5a5d8c0a7e72c6f19e3edb102f03e8614a36057cf6da4b4aead\": container with ID starting with ffa5f5636f6ad5a5d8c0a7e72c6f19e3edb102f03e8614a36057cf6da4b4aead not found: ID does not exist" containerID="ffa5f5636f6ad5a5d8c0a7e72c6f19e3edb102f03e8614a36057cf6da4b4aead" Mar 13 14:11:10 crc kubenswrapper[4898]: I0313 14:11:10.513875 4898 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ffa5f5636f6ad5a5d8c0a7e72c6f19e3edb102f03e8614a36057cf6da4b4aead"} err="failed to get container status \"ffa5f5636f6ad5a5d8c0a7e72c6f19e3edb102f03e8614a36057cf6da4b4aead\": rpc error: code = NotFound desc = could not find container \"ffa5f5636f6ad5a5d8c0a7e72c6f19e3edb102f03e8614a36057cf6da4b4aead\": container with ID starting with ffa5f5636f6ad5a5d8c0a7e72c6f19e3edb102f03e8614a36057cf6da4b4aead not found: ID does not exist" Mar 13 14:11:11 crc kubenswrapper[4898]: I0313 14:11:11.746697 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5de1b020-889c-4fb0-b067-cdeb543f0b64" path="/var/lib/kubelet/pods/5de1b020-889c-4fb0-b067-cdeb543f0b64/volumes" Mar 13 14:11:13 crc kubenswrapper[4898]: I0313 14:11:13.380592 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8" event={"ID":"2cd05b5b-32da-4560-a761-72221b99e2c6","Type":"ContainerStarted","Data":"a617f169f1dfdd2506afa41a7275d981d2a519aa1e27fbcd8c4664f8adc494fa"} Mar 13 14:11:21 crc kubenswrapper[4898]: I0313 14:11:21.437075 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8" event={"ID":"2cd05b5b-32da-4560-a761-72221b99e2c6","Type":"ContainerStarted","Data":"f4b6459bd7650966303989bbf78aca39fcd321d5efb09073863a0f765a6e9309"} Mar 13 14:11:21 crc kubenswrapper[4898]: I0313 14:11:21.439292 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8" Mar 13 14:11:21 crc kubenswrapper[4898]: I0313 14:11:21.441826 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8" Mar 13 14:11:21 crc kubenswrapper[4898]: I0313 14:11:21.495022 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8" podStartSLOduration=2.413564004 podStartE2EDuration="19.49500316s" podCreationTimestamp="2026-03-13 14:11:02 +0000 UTC" firstStartedPulling="2026-03-13 14:11:03.616515314 +0000 UTC m=+898.618103553" lastFinishedPulling="2026-03-13 14:11:20.69795445 +0000 UTC m=+915.699542709" observedRunningTime="2026-03-13 14:11:21.483248547 +0000 UTC m=+916.484836796" watchObservedRunningTime="2026-03-13 14:11:21.49500316 +0000 UTC m=+916.496591399" Mar 13 14:11:26 crc kubenswrapper[4898]: I0313 14:11:26.315953 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Mar 13 14:11:26 crc kubenswrapper[4898]: E0313 14:11:26.317071 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5de1b020-889c-4fb0-b067-cdeb543f0b64" containerName="registry-server" Mar 13 14:11:26 crc kubenswrapper[4898]: I0313 14:11:26.317143 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="5de1b020-889c-4fb0-b067-cdeb543f0b64" containerName="registry-server" Mar 13 14:11:26 crc kubenswrapper[4898]: E0313 14:11:26.317207 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5de1b020-889c-4fb0-b067-cdeb543f0b64" containerName="extract-utilities" Mar 13 14:11:26 crc kubenswrapper[4898]: I0313 14:11:26.317218 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="5de1b020-889c-4fb0-b067-cdeb543f0b64" containerName="extract-utilities" Mar 13 14:11:26 crc kubenswrapper[4898]: E0313 14:11:26.317237 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5de1b020-889c-4fb0-b067-cdeb543f0b64" containerName="extract-content" Mar 13 14:11:26 crc kubenswrapper[4898]: I0313 14:11:26.317248 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="5de1b020-889c-4fb0-b067-cdeb543f0b64" containerName="extract-content" Mar 13 14:11:26 crc kubenswrapper[4898]: I0313 14:11:26.317590 4898 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5de1b020-889c-4fb0-b067-cdeb543f0b64" containerName="registry-server" Mar 13 14:11:26 crc kubenswrapper[4898]: I0313 14:11:26.319456 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Mar 13 14:11:26 crc kubenswrapper[4898]: I0313 14:11:26.322378 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Mar 13 14:11:26 crc kubenswrapper[4898]: I0313 14:11:26.322766 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Mar 13 14:11:26 crc kubenswrapper[4898]: I0313 14:11:26.327986 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Mar 13 14:11:26 crc kubenswrapper[4898]: I0313 14:11:26.494740 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d0d82100-9843-444c-8945-e270e6e1491b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0d82100-9843-444c-8945-e270e6e1491b\") pod \"minio\" (UID: \"9ce14c7d-1e04-483e-9738-ae3c512ef76f\") " pod="minio-dev/minio" Mar 13 14:11:26 crc kubenswrapper[4898]: I0313 14:11:26.494894 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gt9v\" (UniqueName: \"kubernetes.io/projected/9ce14c7d-1e04-483e-9738-ae3c512ef76f-kube-api-access-6gt9v\") pod \"minio\" (UID: \"9ce14c7d-1e04-483e-9738-ae3c512ef76f\") " pod="minio-dev/minio" Mar 13 14:11:26 crc kubenswrapper[4898]: I0313 14:11:26.596748 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d0d82100-9843-444c-8945-e270e6e1491b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0d82100-9843-444c-8945-e270e6e1491b\") pod \"minio\" (UID: \"9ce14c7d-1e04-483e-9738-ae3c512ef76f\") " pod="minio-dev/minio" Mar 13 14:11:26 crc kubenswrapper[4898]: I0313 14:11:26.596851 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6gt9v\" (UniqueName: \"kubernetes.io/projected/9ce14c7d-1e04-483e-9738-ae3c512ef76f-kube-api-access-6gt9v\") pod \"minio\" (UID: \"9ce14c7d-1e04-483e-9738-ae3c512ef76f\") " pod="minio-dev/minio" Mar 13 14:11:26 crc kubenswrapper[4898]: I0313 14:11:26.601809 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 13 14:11:26 crc kubenswrapper[4898]: I0313 14:11:26.601879 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d0d82100-9843-444c-8945-e270e6e1491b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0d82100-9843-444c-8945-e270e6e1491b\") pod \"minio\" (UID: \"9ce14c7d-1e04-483e-9738-ae3c512ef76f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/14e71af032d1929f261b00fb9b63ff890d85884402f51a4b1beb60df0fe69582/globalmount\"" pod="minio-dev/minio" Mar 13 14:11:26 crc kubenswrapper[4898]: I0313 14:11:26.634645 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gt9v\" (UniqueName: \"kubernetes.io/projected/9ce14c7d-1e04-483e-9738-ae3c512ef76f-kube-api-access-6gt9v\") pod \"minio\" (UID: \"9ce14c7d-1e04-483e-9738-ae3c512ef76f\") " pod="minio-dev/minio" Mar 13 14:11:26 crc kubenswrapper[4898]: I0313 14:11:26.648126 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d0d82100-9843-444c-8945-e270e6e1491b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0d82100-9843-444c-8945-e270e6e1491b\") pod \"minio\" (UID: \"9ce14c7d-1e04-483e-9738-ae3c512ef76f\") " pod="minio-dev/minio" Mar 13 14:11:26 crc kubenswrapper[4898]: I0313 14:11:26.939323 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Mar 13 14:11:27 crc kubenswrapper[4898]: I0313 14:11:27.177879 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Mar 13 14:11:27 crc kubenswrapper[4898]: W0313 14:11:27.182099 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ce14c7d_1e04_483e_9738_ae3c512ef76f.slice/crio-103eef127bf515cf8543f30dfb93161b72d04fe01b3826bceca3334bcde0096c WatchSource:0}: Error finding container 103eef127bf515cf8543f30dfb93161b72d04fe01b3826bceca3334bcde0096c: Status 404 returned error can't find the container with id 103eef127bf515cf8543f30dfb93161b72d04fe01b3826bceca3334bcde0096c Mar 13 14:11:27 crc kubenswrapper[4898]: I0313 14:11:27.482340 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"9ce14c7d-1e04-483e-9738-ae3c512ef76f","Type":"ContainerStarted","Data":"103eef127bf515cf8543f30dfb93161b72d04fe01b3826bceca3334bcde0096c"} Mar 13 14:11:30 crc kubenswrapper[4898]: I0313 14:11:30.507784 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"9ce14c7d-1e04-483e-9738-ae3c512ef76f","Type":"ContainerStarted","Data":"0e85cb6a69d039542f475b1ae85442734bc5fe33fa28ac8903dbcf3ab518a878"} Mar 13 14:11:30 crc kubenswrapper[4898]: I0313 14:11:30.528518 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=3.461573106 podStartE2EDuration="6.528500602s" podCreationTimestamp="2026-03-13 14:11:24 +0000 UTC" firstStartedPulling="2026-03-13 14:11:27.183920212 +0000 UTC m=+922.185508451" lastFinishedPulling="2026-03-13 14:11:30.250847708 +0000 UTC m=+925.252435947" observedRunningTime="2026-03-13 14:11:30.522773675 +0000 UTC m=+925.524361914" watchObservedRunningTime="2026-03-13 14:11:30.528500602 +0000 UTC m=+925.530088841" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.275693 4898 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-distributor-9c6b6d984-vvj56"] Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.278710 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.281438 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-grpc" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.281449 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-config" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.281689 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-http" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.281635 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-dockercfg-gqrj6" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.281687 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-ca-bundle" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.291502 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-9c6b6d984-vvj56"] Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.414403 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw"] Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.415285 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.418335 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-http" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.418555 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-grpc" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.427203 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-s3" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.427351 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw"] Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.448636 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/510657b4-32e2-4fa5-9c09-17869a295736-logging-loki-ca-bundle\") pod \"logging-loki-distributor-9c6b6d984-vvj56\" (UID: \"510657b4-32e2-4fa5-9c09-17869a295736\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.448696 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/510657b4-32e2-4fa5-9c09-17869a295736-logging-loki-distributor-http\") pod \"logging-loki-distributor-9c6b6d984-vvj56\" (UID: \"510657b4-32e2-4fa5-9c09-17869a295736\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.448716 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cl5z\" (UniqueName: 
\"kubernetes.io/projected/510657b4-32e2-4fa5-9c09-17869a295736-kube-api-access-9cl5z\") pod \"logging-loki-distributor-9c6b6d984-vvj56\" (UID: \"510657b4-32e2-4fa5-9c09-17869a295736\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.448757 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/510657b4-32e2-4fa5-9c09-17869a295736-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-9c6b6d984-vvj56\" (UID: \"510657b4-32e2-4fa5-9c09-17869a295736\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.448797 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/510657b4-32e2-4fa5-9c09-17869a295736-config\") pod \"logging-loki-distributor-9c6b6d984-vvj56\" (UID: \"510657b4-32e2-4fa5-9c09-17869a295736\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.486925 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz"] Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.489807 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.493233 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-http" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.493688 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-grpc" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.543614 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz"] Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.550073 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/5e81d88f-c63b-4f0c-ba17-f1171350c28d-logging-loki-s3\") pod \"logging-loki-querier-6dcbdf8bb8-qr6bw\" (UID: \"5e81d88f-c63b-4f0c-ba17-f1171350c28d\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.550124 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/510657b4-32e2-4fa5-9c09-17869a295736-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-9c6b6d984-vvj56\" (UID: \"510657b4-32e2-4fa5-9c09-17869a295736\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.550155 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rnk9\" (UniqueName: \"kubernetes.io/projected/5e81d88f-c63b-4f0c-ba17-f1171350c28d-kube-api-access-9rnk9\") pod \"logging-loki-querier-6dcbdf8bb8-qr6bw\" (UID: \"5e81d88f-c63b-4f0c-ba17-f1171350c28d\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:36 crc 
kubenswrapper[4898]: I0313 14:11:36.550387 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/510657b4-32e2-4fa5-9c09-17869a295736-config\") pod \"logging-loki-distributor-9c6b6d984-vvj56\" (UID: \"510657b4-32e2-4fa5-9c09-17869a295736\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.550469 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e81d88f-c63b-4f0c-ba17-f1171350c28d-config\") pod \"logging-loki-querier-6dcbdf8bb8-qr6bw\" (UID: \"5e81d88f-c63b-4f0c-ba17-f1171350c28d\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.550514 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/5e81d88f-c63b-4f0c-ba17-f1171350c28d-logging-loki-querier-http\") pod \"logging-loki-querier-6dcbdf8bb8-qr6bw\" (UID: \"5e81d88f-c63b-4f0c-ba17-f1171350c28d\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.550597 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/510657b4-32e2-4fa5-9c09-17869a295736-logging-loki-ca-bundle\") pod \"logging-loki-distributor-9c6b6d984-vvj56\" (UID: \"510657b4-32e2-4fa5-9c09-17869a295736\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.550654 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/5e81d88f-c63b-4f0c-ba17-f1171350c28d-logging-loki-querier-grpc\") pod 
\"logging-loki-querier-6dcbdf8bb8-qr6bw\" (UID: \"5e81d88f-c63b-4f0c-ba17-f1171350c28d\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.550721 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/510657b4-32e2-4fa5-9c09-17869a295736-logging-loki-distributor-http\") pod \"logging-loki-distributor-9c6b6d984-vvj56\" (UID: \"510657b4-32e2-4fa5-9c09-17869a295736\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.550758 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cl5z\" (UniqueName: \"kubernetes.io/projected/510657b4-32e2-4fa5-9c09-17869a295736-kube-api-access-9cl5z\") pod \"logging-loki-distributor-9c6b6d984-vvj56\" (UID: \"510657b4-32e2-4fa5-9c09-17869a295736\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.550793 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e81d88f-c63b-4f0c-ba17-f1171350c28d-logging-loki-ca-bundle\") pod \"logging-loki-querier-6dcbdf8bb8-qr6bw\" (UID: \"5e81d88f-c63b-4f0c-ba17-f1171350c28d\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.551327 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/510657b4-32e2-4fa5-9c09-17869a295736-logging-loki-ca-bundle\") pod \"logging-loki-distributor-9c6b6d984-vvj56\" (UID: \"510657b4-32e2-4fa5-9c09-17869a295736\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.551539 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/510657b4-32e2-4fa5-9c09-17869a295736-config\") pod \"logging-loki-distributor-9c6b6d984-vvj56\" (UID: \"510657b4-32e2-4fa5-9c09-17869a295736\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.555799 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/510657b4-32e2-4fa5-9c09-17869a295736-logging-loki-distributor-http\") pod \"logging-loki-distributor-9c6b6d984-vvj56\" (UID: \"510657b4-32e2-4fa5-9c09-17869a295736\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.556237 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/510657b4-32e2-4fa5-9c09-17869a295736-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-9c6b6d984-vvj56\" (UID: \"510657b4-32e2-4fa5-9c09-17869a295736\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.579702 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cl5z\" (UniqueName: \"kubernetes.io/projected/510657b4-32e2-4fa5-9c09-17869a295736-kube-api-access-9cl5z\") pod \"logging-loki-distributor-9c6b6d984-vvj56\" (UID: \"510657b4-32e2-4fa5-9c09-17869a295736\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.604185 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r"] Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.606192 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.606226 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.609516 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-http" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.609940 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.610031 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.611082 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway-ca-bundle" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.613994 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-client-http" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.617735 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x"] Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.619123 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.621130 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-dockercfg-56dfj" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.626109 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r"] Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.652313 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbxbn\" (UniqueName: \"kubernetes.io/projected/e519fed6-a687-4a01-a979-598e81122ad1-kube-api-access-dbxbn\") pod \"logging-loki-query-frontend-ff66c4dc9-mwqzz\" (UID: \"e519fed6-a687-4a01-a979-598e81122ad1\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.652369 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e519fed6-a687-4a01-a979-598e81122ad1-config\") pod \"logging-loki-query-frontend-ff66c4dc9-mwqzz\" (UID: \"e519fed6-a687-4a01-a979-598e81122ad1\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.652397 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/5e81d88f-c63b-4f0c-ba17-f1171350c28d-logging-loki-querier-grpc\") pod \"logging-loki-querier-6dcbdf8bb8-qr6bw\" (UID: \"5e81d88f-c63b-4f0c-ba17-f1171350c28d\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.652432 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/5e81d88f-c63b-4f0c-ba17-f1171350c28d-logging-loki-ca-bundle\") pod \"logging-loki-querier-6dcbdf8bb8-qr6bw\" (UID: \"5e81d88f-c63b-4f0c-ba17-f1171350c28d\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.652458 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/e519fed6-a687-4a01-a979-598e81122ad1-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-ff66c4dc9-mwqzz\" (UID: \"e519fed6-a687-4a01-a979-598e81122ad1\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.652488 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/5e81d88f-c63b-4f0c-ba17-f1171350c28d-logging-loki-s3\") pod \"logging-loki-querier-6dcbdf8bb8-qr6bw\" (UID: \"5e81d88f-c63b-4f0c-ba17-f1171350c28d\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.652510 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rnk9\" (UniqueName: \"kubernetes.io/projected/5e81d88f-c63b-4f0c-ba17-f1171350c28d-kube-api-access-9rnk9\") pod \"logging-loki-querier-6dcbdf8bb8-qr6bw\" (UID: \"5e81d88f-c63b-4f0c-ba17-f1171350c28d\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.652540 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/e519fed6-a687-4a01-a979-598e81122ad1-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-ff66c4dc9-mwqzz\" (UID: \"e519fed6-a687-4a01-a979-598e81122ad1\") " 
pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.652557 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e519fed6-a687-4a01-a979-598e81122ad1-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-ff66c4dc9-mwqzz\" (UID: \"e519fed6-a687-4a01-a979-598e81122ad1\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.652581 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e81d88f-c63b-4f0c-ba17-f1171350c28d-config\") pod \"logging-loki-querier-6dcbdf8bb8-qr6bw\" (UID: \"5e81d88f-c63b-4f0c-ba17-f1171350c28d\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.652600 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/5e81d88f-c63b-4f0c-ba17-f1171350c28d-logging-loki-querier-http\") pod \"logging-loki-querier-6dcbdf8bb8-qr6bw\" (UID: \"5e81d88f-c63b-4f0c-ba17-f1171350c28d\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.649890 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x"] Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.657769 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e81d88f-c63b-4f0c-ba17-f1171350c28d-logging-loki-ca-bundle\") pod \"logging-loki-querier-6dcbdf8bb8-qr6bw\" (UID: \"5e81d88f-c63b-4f0c-ba17-f1171350c28d\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:36 crc 
kubenswrapper[4898]: I0313 14:11:36.658285 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e81d88f-c63b-4f0c-ba17-f1171350c28d-config\") pod \"logging-loki-querier-6dcbdf8bb8-qr6bw\" (UID: \"5e81d88f-c63b-4f0c-ba17-f1171350c28d\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.661408 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/5e81d88f-c63b-4f0c-ba17-f1171350c28d-logging-loki-querier-http\") pod \"logging-loki-querier-6dcbdf8bb8-qr6bw\" (UID: \"5e81d88f-c63b-4f0c-ba17-f1171350c28d\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.664369 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/5e81d88f-c63b-4f0c-ba17-f1171350c28d-logging-loki-querier-grpc\") pod \"logging-loki-querier-6dcbdf8bb8-qr6bw\" (UID: \"5e81d88f-c63b-4f0c-ba17-f1171350c28d\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.677764 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rnk9\" (UniqueName: \"kubernetes.io/projected/5e81d88f-c63b-4f0c-ba17-f1171350c28d-kube-api-access-9rnk9\") pod \"logging-loki-querier-6dcbdf8bb8-qr6bw\" (UID: \"5e81d88f-c63b-4f0c-ba17-f1171350c28d\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.679633 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/5e81d88f-c63b-4f0c-ba17-f1171350c28d-logging-loki-s3\") pod \"logging-loki-querier-6dcbdf8bb8-qr6bw\" (UID: \"5e81d88f-c63b-4f0c-ba17-f1171350c28d\") " 
pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.737080 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.753673 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/077fcbe8-c497-44b4-82f9-ff8e317cbe83-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.753720 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e519fed6-a687-4a01-a979-598e81122ad1-config\") pod \"logging-loki-query-frontend-ff66c4dc9-mwqzz\" (UID: \"e519fed6-a687-4a01-a979-598e81122ad1\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.753742 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/077fcbe8-c497-44b4-82f9-ff8e317cbe83-tls-secret\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.753763 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/077fcbe8-c497-44b4-82f9-ff8e317cbe83-tenants\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc 
kubenswrapper[4898]: I0313 14:11:36.753789 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-lokistack-gateway\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.753809 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-logging-loki-ca-bundle\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.753826 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/077fcbe8-c497-44b4-82f9-ff8e317cbe83-lokistack-gateway\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.753840 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-rbac\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.753855 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-tenants\") pod 
\"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.753874 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/077fcbe8-c497-44b4-82f9-ff8e317cbe83-rbac\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.753919 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/e519fed6-a687-4a01-a979-598e81122ad1-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-ff66c4dc9-mwqzz\" (UID: \"e519fed6-a687-4a01-a979-598e81122ad1\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.753954 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-tls-secret\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.753969 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.753984 4898 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9b9d\" (UniqueName: \"kubernetes.io/projected/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-kube-api-access-l9b9d\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.754008 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.754034 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/e519fed6-a687-4a01-a979-598e81122ad1-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-ff66c4dc9-mwqzz\" (UID: \"e519fed6-a687-4a01-a979-598e81122ad1\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.754055 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e519fed6-a687-4a01-a979-598e81122ad1-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-ff66c4dc9-mwqzz\" (UID: \"e519fed6-a687-4a01-a979-598e81122ad1\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.754071 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: 
\"kubernetes.io/secret/077fcbe8-c497-44b4-82f9-ff8e317cbe83-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.754101 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z6th\" (UniqueName: \"kubernetes.io/projected/077fcbe8-c497-44b4-82f9-ff8e317cbe83-kube-api-access-4z6th\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.754118 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/077fcbe8-c497-44b4-82f9-ff8e317cbe83-logging-loki-ca-bundle\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.754135 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbxbn\" (UniqueName: \"kubernetes.io/projected/e519fed6-a687-4a01-a979-598e81122ad1-kube-api-access-dbxbn\") pod \"logging-loki-query-frontend-ff66c4dc9-mwqzz\" (UID: \"e519fed6-a687-4a01-a979-598e81122ad1\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.756812 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e519fed6-a687-4a01-a979-598e81122ad1-config\") pod \"logging-loki-query-frontend-ff66c4dc9-mwqzz\" (UID: \"e519fed6-a687-4a01-a979-598e81122ad1\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" Mar 13 
14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.757638 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e519fed6-a687-4a01-a979-598e81122ad1-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-ff66c4dc9-mwqzz\" (UID: \"e519fed6-a687-4a01-a979-598e81122ad1\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.760077 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/e519fed6-a687-4a01-a979-598e81122ad1-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-ff66c4dc9-mwqzz\" (UID: \"e519fed6-a687-4a01-a979-598e81122ad1\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.778281 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbxbn\" (UniqueName: \"kubernetes.io/projected/e519fed6-a687-4a01-a979-598e81122ad1-kube-api-access-dbxbn\") pod \"logging-loki-query-frontend-ff66c4dc9-mwqzz\" (UID: \"e519fed6-a687-4a01-a979-598e81122ad1\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.790590 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/e519fed6-a687-4a01-a979-598e81122ad1-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-ff66c4dc9-mwqzz\" (UID: \"e519fed6-a687-4a01-a979-598e81122ad1\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.806327 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.855683 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-rbac\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.855715 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-tenants\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.855740 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/077fcbe8-c497-44b4-82f9-ff8e317cbe83-rbac\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.855780 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-tls-secret\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.855799 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-logging-loki-gateway-ca-bundle\") pod 
\"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.855815 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9b9d\" (UniqueName: \"kubernetes.io/projected/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-kube-api-access-l9b9d\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.855846 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.855875 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/077fcbe8-c497-44b4-82f9-ff8e317cbe83-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.855934 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z6th\" (UniqueName: \"kubernetes.io/projected/077fcbe8-c497-44b4-82f9-ff8e317cbe83-kube-api-access-4z6th\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.855963 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/077fcbe8-c497-44b4-82f9-ff8e317cbe83-logging-loki-ca-bundle\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.855990 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/077fcbe8-c497-44b4-82f9-ff8e317cbe83-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.856047 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/077fcbe8-c497-44b4-82f9-ff8e317cbe83-tls-secret\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.856064 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/077fcbe8-c497-44b4-82f9-ff8e317cbe83-tenants\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.856086 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-lokistack-gateway\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" 
Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.856104 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-logging-loki-ca-bundle\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.856122 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/077fcbe8-c497-44b4-82f9-ff8e317cbe83-lokistack-gateway\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.856748 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-rbac\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.857044 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/077fcbe8-c497-44b4-82f9-ff8e317cbe83-lokistack-gateway\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.857953 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/077fcbe8-c497-44b4-82f9-ff8e317cbe83-logging-loki-ca-bundle\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: 
\"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.858461 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/077fcbe8-c497-44b4-82f9-ff8e317cbe83-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: E0313 14:11:36.858530 4898 secret.go:188] Couldn't get secret openshift-logging/logging-loki-gateway-http: secret "logging-loki-gateway-http" not found Mar 13 14:11:36 crc kubenswrapper[4898]: E0313 14:11:36.858576 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/077fcbe8-c497-44b4-82f9-ff8e317cbe83-tls-secret podName:077fcbe8-c497-44b4-82f9-ff8e317cbe83 nodeName:}" failed. No retries permitted until 2026-03-13 14:11:37.358561552 +0000 UTC m=+932.360149791 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-secret" (UniqueName: "kubernetes.io/secret/077fcbe8-c497-44b4-82f9-ff8e317cbe83-tls-secret") pod "logging-loki-gateway-c6d797ccf-9qh4r" (UID: "077fcbe8-c497-44b4-82f9-ff8e317cbe83") : secret "logging-loki-gateway-http" not found Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.859869 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/077fcbe8-c497-44b4-82f9-ff8e317cbe83-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: E0313 14:11:36.860098 4898 secret.go:188] Couldn't get secret openshift-logging/logging-loki-gateway-http: secret "logging-loki-gateway-http" not found Mar 13 14:11:36 crc kubenswrapper[4898]: E0313 14:11:36.860436 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-tls-secret podName:13ee53e6-2549-4dd8-91ac-80e4ef2c9d99 nodeName:}" failed. No retries permitted until 2026-03-13 14:11:37.36041552 +0000 UTC m=+932.362003779 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-secret" (UniqueName: "kubernetes.io/secret/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-tls-secret") pod "logging-loki-gateway-c6d797ccf-8ng9x" (UID: "13ee53e6-2549-4dd8-91ac-80e4ef2c9d99") : secret "logging-loki-gateway-http" not found Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.860725 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.861528 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-logging-loki-ca-bundle\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.862177 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-lokistack-gateway\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.862916 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/077fcbe8-c497-44b4-82f9-ff8e317cbe83-rbac\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.865183 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-tenants\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.866399 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.870169 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/077fcbe8-c497-44b4-82f9-ff8e317cbe83-tenants\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.892185 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z6th\" (UniqueName: \"kubernetes.io/projected/077fcbe8-c497-44b4-82f9-ff8e317cbe83-kube-api-access-4z6th\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:36 crc kubenswrapper[4898]: I0313 14:11:36.892867 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9b9d\" (UniqueName: \"kubernetes.io/projected/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-kube-api-access-l9b9d\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:37 
crc kubenswrapper[4898]: I0313 14:11:37.154659 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-9c6b6d984-vvj56"] Mar 13 14:11:37 crc kubenswrapper[4898]: W0313 14:11:37.159138 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod510657b4_32e2_4fa5_9c09_17869a295736.slice/crio-588ef4ead368ab2e5638371f5bbdf6b639ba8c8fe67b8e61bb03ae9fc3e68a34 WatchSource:0}: Error finding container 588ef4ead368ab2e5638371f5bbdf6b639ba8c8fe67b8e61bb03ae9fc3e68a34: Status 404 returned error can't find the container with id 588ef4ead368ab2e5638371f5bbdf6b639ba8c8fe67b8e61bb03ae9fc3e68a34 Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.222741 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw"] Mar 13 14:11:37 crc kubenswrapper[4898]: W0313 14:11:37.230031 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e81d88f_c63b_4f0c_ba17_f1171350c28d.slice/crio-506b3c6c7a6fe4b77d437078eecd30760f329a4c67a298239690107de86bbf68 WatchSource:0}: Error finding container 506b3c6c7a6fe4b77d437078eecd30760f329a4c67a298239690107de86bbf68: Status 404 returned error can't find the container with id 506b3c6c7a6fe4b77d437078eecd30760f329a4c67a298239690107de86bbf68 Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.271235 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz"] Mar 13 14:11:37 crc kubenswrapper[4898]: W0313 14:11:37.281271 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode519fed6_a687_4a01_a979_598e81122ad1.slice/crio-3ec77fb340706082cc184c315dd056ae13f90d153c10d3f4f8a73f9bcbd8a242 WatchSource:0}: Error finding container 
3ec77fb340706082cc184c315dd056ae13f90d153c10d3f4f8a73f9bcbd8a242: Status 404 returned error can't find the container with id 3ec77fb340706082cc184c315dd056ae13f90d153c10d3f4f8a73f9bcbd8a242 Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.366240 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/077fcbe8-c497-44b4-82f9-ff8e317cbe83-tls-secret\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.366350 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-tls-secret\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.370873 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/13ee53e6-2549-4dd8-91ac-80e4ef2c9d99-tls-secret\") pod \"logging-loki-gateway-c6d797ccf-8ng9x\" (UID: \"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.373647 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/077fcbe8-c497-44b4-82f9-ff8e317cbe83-tls-secret\") pod \"logging-loki-gateway-c6d797ccf-9qh4r\" (UID: \"077fcbe8-c497-44b4-82f9-ff8e317cbe83\") " pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.410272 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 
14:11:37.411681 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.416085 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-grpc" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.416669 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-http" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.418394 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.464714 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.466405 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.468358 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-grpc" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.468457 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-http" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.489343 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.531024 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.560220 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.561797 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.566802 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-http" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.567087 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-grpc" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.569953 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/9c5fee8d-2246-4e34-8ddd-ce710e155d73-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.570002 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-13c35950-5bd9-4551-b262-990db297da3e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13c35950-5bd9-4551-b262-990db297da3e\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.570037 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-49c190c1-aab1-4988-aa72-b6786999be24\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-49c190c1-aab1-4988-aa72-b6786999be24\") pod 
\"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.570078 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2194d847-4858-4f46-ab8b-c2d78cf5677e-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.570117 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/2194d847-4858-4f46-ab8b-c2d78cf5677e-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.570138 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/2194d847-4858-4f46-ab8b-c2d78cf5677e-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.570186 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c5fee8d-2246-4e34-8ddd-ce710e155d73-config\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.570208 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-http\" (UniqueName: 
\"kubernetes.io/secret/9c5fee8d-2246-4e34-8ddd-ce710e155d73-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.570230 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/2194d847-4858-4f46-ab8b-c2d78cf5677e-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.570258 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-15ed4531-bf9c-4d7a-a111-9dbf850db61b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15ed4531-bf9c-4d7a-a111-9dbf850db61b\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.570278 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7xx5\" (UniqueName: \"kubernetes.io/projected/2194d847-4858-4f46-ab8b-c2d78cf5677e-kube-api-access-z7xx5\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.570307 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpqdw\" (UniqueName: \"kubernetes.io/projected/9c5fee8d-2246-4e34-8ddd-ce710e155d73-kube-api-access-zpqdw\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 
14:11:37.570330 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2194d847-4858-4f46-ab8b-c2d78cf5677e-config\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.570357 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c5fee8d-2246-4e34-8ddd-ce710e155d73-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.570391 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/9c5fee8d-2246-4e34-8ddd-ce710e155d73-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.577670 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.582774 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" event={"ID":"5e81d88f-c63b-4f0c-ba17-f1171350c28d","Type":"ContainerStarted","Data":"506b3c6c7a6fe4b77d437078eecd30760f329a4c67a298239690107de86bbf68"} Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.584537 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" 
event={"ID":"510657b4-32e2-4fa5-9c09-17869a295736","Type":"ContainerStarted","Data":"588ef4ead368ab2e5638371f5bbdf6b639ba8c8fe67b8e61bb03ae9fc3e68a34"} Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.592537 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" event={"ID":"e519fed6-a687-4a01-a979-598e81122ad1","Type":"ContainerStarted","Data":"3ec77fb340706082cc184c315dd056ae13f90d153c10d3f4f8a73f9bcbd8a242"} Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.620258 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.673745 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/2194d847-4858-4f46-ab8b-c2d78cf5677e-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.673808 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c5fee8d-2246-4e34-8ddd-ce710e155d73-config\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.673830 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/2194d847-4858-4f46-ab8b-c2d78cf5677e-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.673878 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/6a1df267-1145-4fe1-9455-57df3d043e3a-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.673919 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/9c5fee8d-2246-4e34-8ddd-ce710e155d73-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.673958 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/6a1df267-1145-4fe1-9455-57df3d043e3a-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.673983 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-13c35950-5bd9-4551-b262-990db297da3e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13c35950-5bd9-4551-b262-990db297da3e\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.674010 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-49c190c1-aab1-4988-aa72-b6786999be24\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-49c190c1-aab1-4988-aa72-b6786999be24\") pod \"logging-loki-ingester-0\" (UID: 
\"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.674035 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8606fd04-7f35-4896-8750-940576cee64d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8606fd04-7f35-4896-8750-940576cee64d\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.674060 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2194d847-4858-4f46-ab8b-c2d78cf5677e-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.674086 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a1df267-1145-4fe1-9455-57df3d043e3a-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.674111 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/2194d847-4858-4f46-ab8b-c2d78cf5677e-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.674142 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-http\" (UniqueName: 
\"kubernetes.io/secret/9c5fee8d-2246-4e34-8ddd-ce710e155d73-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.674166 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-15ed4531-bf9c-4d7a-a111-9dbf850db61b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15ed4531-bf9c-4d7a-a111-9dbf850db61b\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.674186 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7xx5\" (UniqueName: \"kubernetes.io/projected/2194d847-4858-4f46-ab8b-c2d78cf5677e-kube-api-access-z7xx5\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.674210 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpqdw\" (UniqueName: \"kubernetes.io/projected/9c5fee8d-2246-4e34-8ddd-ce710e155d73-kube-api-access-zpqdw\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.674230 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2194d847-4858-4f46-ab8b-c2d78cf5677e-config\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.674252 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/9c5fee8d-2246-4e34-8ddd-ce710e155d73-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.674288 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b865\" (UniqueName: \"kubernetes.io/projected/6a1df267-1145-4fe1-9455-57df3d043e3a-kube-api-access-9b865\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.674311 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/6a1df267-1145-4fe1-9455-57df3d043e3a-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.674338 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/9c5fee8d-2246-4e34-8ddd-ce710e155d73-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.674361 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a1df267-1145-4fe1-9455-57df3d043e3a-config\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.676194 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c5fee8d-2246-4e34-8ddd-ce710e155d73-config\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.681422 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c5fee8d-2246-4e34-8ddd-ce710e155d73-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.681556 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2194d847-4858-4f46-ab8b-c2d78cf5677e-config\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.688549 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/9c5fee8d-2246-4e34-8ddd-ce710e155d73-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.689003 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/2194d847-4858-4f46-ab8b-c2d78cf5677e-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.689496 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/2194d847-4858-4f46-ab8b-c2d78cf5677e-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.693721 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/9c5fee8d-2246-4e34-8ddd-ce710e155d73-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.703881 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/9c5fee8d-2246-4e34-8ddd-ce710e155d73-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.704476 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.704504 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-13c35950-5bd9-4551-b262-990db297da3e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13c35950-5bd9-4551-b262-990db297da3e\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/26bedc4c1bfff6b9b760c8762a5570d09416afeb434fd9c5f3285e55a6b76a3f/globalmount\"" pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.705103 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.705127 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-15ed4531-bf9c-4d7a-a111-9dbf850db61b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15ed4531-bf9c-4d7a-a111-9dbf850db61b\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/016fef35710af0d853b02b8986f3c982d52ae12cf5f6a1b2c3955701e9084c35/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.706497 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.706521 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-49c190c1-aab1-4988-aa72-b6786999be24\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-49c190c1-aab1-4988-aa72-b6786999be24\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e38058b9a8e2fce17e58dc2f056548d59a67467bfbb80e5ac42f189464a94097/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.718741 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/2194d847-4858-4f46-ab8b-c2d78cf5677e-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.721560 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpqdw\" (UniqueName: 
\"kubernetes.io/projected/9c5fee8d-2246-4e34-8ddd-ce710e155d73-kube-api-access-zpqdw\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.726090 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7xx5\" (UniqueName: \"kubernetes.io/projected/2194d847-4858-4f46-ab8b-c2d78cf5677e-kube-api-access-z7xx5\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.736880 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/2194d847-4858-4f46-ab8b-c2d78cf5677e-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.778844 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a1df267-1145-4fe1-9455-57df3d043e3a-config\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.778994 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/6a1df267-1145-4fe1-9455-57df3d043e3a-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.779029 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/6a1df267-1145-4fe1-9455-57df3d043e3a-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.779063 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8606fd04-7f35-4896-8750-940576cee64d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8606fd04-7f35-4896-8750-940576cee64d\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.779087 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a1df267-1145-4fe1-9455-57df3d043e3a-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.779163 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9b865\" (UniqueName: \"kubernetes.io/projected/6a1df267-1145-4fe1-9455-57df3d043e3a-kube-api-access-9b865\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.779178 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/6a1df267-1145-4fe1-9455-57df3d043e3a-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 
14:11:37.781935 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a1df267-1145-4fe1-9455-57df3d043e3a-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.789590 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a1df267-1145-4fe1-9455-57df3d043e3a-config\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.793480 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/6a1df267-1145-4fe1-9455-57df3d043e3a-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.832912 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/6a1df267-1145-4fe1-9455-57df3d043e3a-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.833666 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/6a1df267-1145-4fe1-9455-57df3d043e3a-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: 
I0313 14:11:37.838137 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.838180 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8606fd04-7f35-4896-8750-940576cee64d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8606fd04-7f35-4896-8750-940576cee64d\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/cd4ecb341920b75f201d3b77609b4adaa1255e86603e4072d1d6e42d9336ac62/globalmount\"" pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.838594 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b865\" (UniqueName: \"kubernetes.io/projected/6a1df267-1145-4fe1-9455-57df3d043e3a-kube-api-access-9b865\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.874628 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-13c35950-5bd9-4551-b262-990db297da3e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13c35950-5bd9-4551-b262-990db297da3e\") pod \"logging-loki-compactor-0\" (UID: \"9c5fee8d-2246-4e34-8ddd-ce710e155d73\") " pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.879372 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-49c190c1-aab1-4988-aa72-b6786999be24\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-49c190c1-aab1-4988-aa72-b6786999be24\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " 
pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.881864 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8606fd04-7f35-4896-8750-940576cee64d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8606fd04-7f35-4896-8750-940576cee64d\") pod \"logging-loki-index-gateway-0\" (UID: \"6a1df267-1145-4fe1-9455-57df3d043e3a\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.885659 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-15ed4531-bf9c-4d7a-a111-9dbf850db61b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15ed4531-bf9c-4d7a-a111-9dbf850db61b\") pod \"logging-loki-ingester-0\" (UID: \"2194d847-4858-4f46-ab8b-c2d78cf5677e\") " pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:37 crc kubenswrapper[4898]: I0313 14:11:37.941256 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:38 crc kubenswrapper[4898]: I0313 14:11:38.031334 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:38 crc kubenswrapper[4898]: I0313 14:11:38.088612 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:38 crc kubenswrapper[4898]: I0313 14:11:38.195091 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r"] Mar 13 14:11:38 crc kubenswrapper[4898]: W0313 14:11:38.214585 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod077fcbe8_c497_44b4_82f9_ff8e317cbe83.slice/crio-f50e2c5a54d1997f4cda6ef4178ad487333556f8d2cda9de1e6ca463c19b00ac WatchSource:0}: Error finding container f50e2c5a54d1997f4cda6ef4178ad487333556f8d2cda9de1e6ca463c19b00ac: Status 404 returned error can't find the container with id f50e2c5a54d1997f4cda6ef4178ad487333556f8d2cda9de1e6ca463c19b00ac Mar 13 14:11:38 crc kubenswrapper[4898]: I0313 14:11:38.249017 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x"] Mar 13 14:11:38 crc kubenswrapper[4898]: W0313 14:11:38.253010 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13ee53e6_2549_4dd8_91ac_80e4ef2c9d99.slice/crio-d4e6f0db0fb92bf509209da909097bd4a98e043a22823219e25307febd47f969 WatchSource:0}: Error finding container d4e6f0db0fb92bf509209da909097bd4a98e043a22823219e25307febd47f969: Status 404 returned error can't find the container with id d4e6f0db0fb92bf509209da909097bd4a98e043a22823219e25307febd47f969 Mar 13 14:11:38 crc kubenswrapper[4898]: I0313 14:11:38.365045 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Mar 13 14:11:38 crc kubenswrapper[4898]: W0313 14:11:38.372445 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a1df267_1145_4fe1_9455_57df3d043e3a.slice/crio-1083cfac66719ac52de747eb490bc373e33b01f1717b2da53d7b9fd6397a1a10 
WatchSource:0}: Error finding container 1083cfac66719ac52de747eb490bc373e33b01f1717b2da53d7b9fd6397a1a10: Status 404 returned error can't find the container with id 1083cfac66719ac52de747eb490bc373e33b01f1717b2da53d7b9fd6397a1a10 Mar 13 14:11:38 crc kubenswrapper[4898]: I0313 14:11:38.474415 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Mar 13 14:11:38 crc kubenswrapper[4898]: I0313 14:11:38.543025 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Mar 13 14:11:38 crc kubenswrapper[4898]: I0313 14:11:38.601514 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"6a1df267-1145-4fe1-9455-57df3d043e3a","Type":"ContainerStarted","Data":"1083cfac66719ac52de747eb490bc373e33b01f1717b2da53d7b9fd6397a1a10"} Mar 13 14:11:38 crc kubenswrapper[4898]: I0313 14:11:38.602845 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"2194d847-4858-4f46-ab8b-c2d78cf5677e","Type":"ContainerStarted","Data":"ee1f39ee6f8150eb25864a4baa6d2a69f9e4a6977b91fc642c6e3a79f5142dc0"} Mar 13 14:11:38 crc kubenswrapper[4898]: I0313 14:11:38.604058 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" event={"ID":"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99","Type":"ContainerStarted","Data":"d4e6f0db0fb92bf509209da909097bd4a98e043a22823219e25307febd47f969"} Mar 13 14:11:38 crc kubenswrapper[4898]: I0313 14:11:38.605393 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"9c5fee8d-2246-4e34-8ddd-ce710e155d73","Type":"ContainerStarted","Data":"3bf1ce175d38c42a8ebc947669edd6e8fdc8bcff57ba456df7fabf96ebc626fb"} Mar 13 14:11:38 crc kubenswrapper[4898]: I0313 14:11:38.607113 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" event={"ID":"077fcbe8-c497-44b4-82f9-ff8e317cbe83","Type":"ContainerStarted","Data":"f50e2c5a54d1997f4cda6ef4178ad487333556f8d2cda9de1e6ca463c19b00ac"} Mar 13 14:11:40 crc kubenswrapper[4898]: I0313 14:11:40.627048 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"2194d847-4858-4f46-ab8b-c2d78cf5677e","Type":"ContainerStarted","Data":"51a34eecfff5bcd76e412488f2f647a11e51123d8bb67c5a49eb859212690fa7"} Mar 13 14:11:40 crc kubenswrapper[4898]: I0313 14:11:40.627863 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:11:40 crc kubenswrapper[4898]: I0313 14:11:40.632713 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" event={"ID":"5e81d88f-c63b-4f0c-ba17-f1171350c28d","Type":"ContainerStarted","Data":"05de7a1630ecce8bb89db3fddb88fc8886f0a968022649e8dc17f53ced676a33"} Mar 13 14:11:40 crc kubenswrapper[4898]: I0313 14:11:40.632800 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:40 crc kubenswrapper[4898]: I0313 14:11:40.642882 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" event={"ID":"510657b4-32e2-4fa5-9c09-17869a295736","Type":"ContainerStarted","Data":"04c209ea08736554f11756ae1c12fd1213b662bea820893d317d4535954da8a3"} Mar 13 14:11:40 crc kubenswrapper[4898]: I0313 14:11:40.643124 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" Mar 13 14:11:40 crc kubenswrapper[4898]: I0313 14:11:40.649032 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-ingester-0" podStartSLOduration=3.041300067 
podStartE2EDuration="4.649009235s" podCreationTimestamp="2026-03-13 14:11:36 +0000 UTC" firstStartedPulling="2026-03-13 14:11:38.490757501 +0000 UTC m=+933.492345740" lastFinishedPulling="2026-03-13 14:11:40.098466679 +0000 UTC m=+935.100054908" observedRunningTime="2026-03-13 14:11:40.648535903 +0000 UTC m=+935.650124162" watchObservedRunningTime="2026-03-13 14:11:40.649009235 +0000 UTC m=+935.650597474" Mar 13 14:11:40 crc kubenswrapper[4898]: I0313 14:11:40.654313 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"9c5fee8d-2246-4e34-8ddd-ce710e155d73","Type":"ContainerStarted","Data":"b3c9648de0517f31a7ed9f100e721331593c713852257f57a4ccdacd68fa8784"} Mar 13 14:11:40 crc kubenswrapper[4898]: I0313 14:11:40.654400 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:11:40 crc kubenswrapper[4898]: I0313 14:11:40.672673 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" podStartSLOduration=1.7771506179999998 podStartE2EDuration="4.672655223s" podCreationTimestamp="2026-03-13 14:11:36 +0000 UTC" firstStartedPulling="2026-03-13 14:11:37.233092799 +0000 UTC m=+932.234681038" lastFinishedPulling="2026-03-13 14:11:40.128597404 +0000 UTC m=+935.130185643" observedRunningTime="2026-03-13 14:11:40.667019258 +0000 UTC m=+935.668607507" watchObservedRunningTime="2026-03-13 14:11:40.672655223 +0000 UTC m=+935.674243462" Mar 13 14:11:40 crc kubenswrapper[4898]: I0313 14:11:40.678370 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" event={"ID":"e519fed6-a687-4a01-a979-598e81122ad1","Type":"ContainerStarted","Data":"40d45b908d4063d63b06ce169c42a918c60d9804baaaf0dc0fc2eebdb0e61d6f"} Mar 13 14:11:40 crc kubenswrapper[4898]: I0313 14:11:40.678452 4898 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" Mar 13 14:11:40 crc kubenswrapper[4898]: I0313 14:11:40.680809 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"6a1df267-1145-4fe1-9455-57df3d043e3a","Type":"ContainerStarted","Data":"d0a283a5eb4a2ba6a67b1232156e05c6727addae52bb5b5751da0673e217bf3a"} Mar 13 14:11:40 crc kubenswrapper[4898]: I0313 14:11:40.681501 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:40 crc kubenswrapper[4898]: I0313 14:11:40.711009 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" podStartSLOduration=1.782903647 podStartE2EDuration="4.7109897s" podCreationTimestamp="2026-03-13 14:11:36 +0000 UTC" firstStartedPulling="2026-03-13 14:11:37.161131208 +0000 UTC m=+932.162719447" lastFinishedPulling="2026-03-13 14:11:40.089217261 +0000 UTC m=+935.090805500" observedRunningTime="2026-03-13 14:11:40.708033324 +0000 UTC m=+935.709621583" watchObservedRunningTime="2026-03-13 14:11:40.7109897 +0000 UTC m=+935.712577949" Mar 13 14:11:40 crc kubenswrapper[4898]: I0313 14:11:40.738128 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-compactor-0" podStartSLOduration=3.168368207 podStartE2EDuration="4.738107388s" podCreationTimestamp="2026-03-13 14:11:36 +0000 UTC" firstStartedPulling="2026-03-13 14:11:38.560013103 +0000 UTC m=+933.561601342" lastFinishedPulling="2026-03-13 14:11:40.129752284 +0000 UTC m=+935.131340523" observedRunningTime="2026-03-13 14:11:40.72849744 +0000 UTC m=+935.730085689" watchObservedRunningTime="2026-03-13 14:11:40.738107388 +0000 UTC m=+935.739695627" Mar 13 14:11:40 crc kubenswrapper[4898]: I0313 14:11:40.770217 4898 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" podStartSLOduration=1.926409959 podStartE2EDuration="4.770194883s" podCreationTimestamp="2026-03-13 14:11:36 +0000 UTC" firstStartedPulling="2026-03-13 14:11:37.283991819 +0000 UTC m=+932.285580068" lastFinishedPulling="2026-03-13 14:11:40.127776753 +0000 UTC m=+935.129364992" observedRunningTime="2026-03-13 14:11:40.745702053 +0000 UTC m=+935.747290312" watchObservedRunningTime="2026-03-13 14:11:40.770194883 +0000 UTC m=+935.771783122" Mar 13 14:11:40 crc kubenswrapper[4898]: I0313 14:11:40.788641 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-index-gateway-0" podStartSLOduration=3.036833172 podStartE2EDuration="4.788618557s" podCreationTimestamp="2026-03-13 14:11:36 +0000 UTC" firstStartedPulling="2026-03-13 14:11:38.375387142 +0000 UTC m=+933.376975381" lastFinishedPulling="2026-03-13 14:11:40.127172527 +0000 UTC m=+935.128760766" observedRunningTime="2026-03-13 14:11:40.762839434 +0000 UTC m=+935.764427683" watchObservedRunningTime="2026-03-13 14:11:40.788618557 +0000 UTC m=+935.790206796" Mar 13 14:11:42 crc kubenswrapper[4898]: I0313 14:11:42.754661 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" event={"ID":"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99","Type":"ContainerStarted","Data":"fcf8e9a8d3e5e7f7951cb2af14626b6e5da83135311a91da93925d8b17fce58e"} Mar 13 14:11:42 crc kubenswrapper[4898]: I0313 14:11:42.756958 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" event={"ID":"077fcbe8-c497-44b4-82f9-ff8e317cbe83","Type":"ContainerStarted","Data":"ce5d448a85a6fe29bfd4df4184532deeda10f71ada8ffc4d7a6222cbefeeb36f"} Mar 13 14:11:44 crc kubenswrapper[4898]: I0313 14:11:44.772468 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" 
event={"ID":"13ee53e6-2549-4dd8-91ac-80e4ef2c9d99","Type":"ContainerStarted","Data":"149addd5742530e63e0bc98feb66cfd6d0db4bf52fa1f9082406acdbde10302b"} Mar 13 14:11:44 crc kubenswrapper[4898]: I0313 14:11:44.773148 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:44 crc kubenswrapper[4898]: I0313 14:11:44.773168 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:44 crc kubenswrapper[4898]: I0313 14:11:44.775428 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" event={"ID":"077fcbe8-c497-44b4-82f9-ff8e317cbe83","Type":"ContainerStarted","Data":"dfa779b5c6a3bc447019fc31e561acce75042dc2e0deee52bceb23bc194bec24"} Mar 13 14:11:44 crc kubenswrapper[4898]: I0313 14:11:44.776063 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:44 crc kubenswrapper[4898]: I0313 14:11:44.785668 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:44 crc kubenswrapper[4898]: I0313 14:11:44.787475 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:44 crc kubenswrapper[4898]: I0313 14:11:44.793022 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" Mar 13 14:11:44 crc kubenswrapper[4898]: I0313 14:11:44.804316 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" podStartSLOduration=2.743102504 podStartE2EDuration="8.804295786s" podCreationTimestamp="2026-03-13 14:11:36 +0000 UTC" 
firstStartedPulling="2026-03-13 14:11:38.257445947 +0000 UTC m=+933.259034186" lastFinishedPulling="2026-03-13 14:11:44.318639229 +0000 UTC m=+939.320227468" observedRunningTime="2026-03-13 14:11:44.799292137 +0000 UTC m=+939.800880386" watchObservedRunningTime="2026-03-13 14:11:44.804295786 +0000 UTC m=+939.805884045" Mar 13 14:11:44 crc kubenswrapper[4898]: I0313 14:11:44.838678 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" podStartSLOduration=2.744409508 podStartE2EDuration="8.83865849s" podCreationTimestamp="2026-03-13 14:11:36 +0000 UTC" firstStartedPulling="2026-03-13 14:11:38.218364502 +0000 UTC m=+933.219952731" lastFinishedPulling="2026-03-13 14:11:44.312613474 +0000 UTC m=+939.314201713" observedRunningTime="2026-03-13 14:11:44.834469822 +0000 UTC m=+939.836058081" watchObservedRunningTime="2026-03-13 14:11:44.83865849 +0000 UTC m=+939.840246729" Mar 13 14:11:45 crc kubenswrapper[4898]: I0313 14:11:45.784514 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:45 crc kubenswrapper[4898]: I0313 14:11:45.806418 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" Mar 13 14:11:56 crc kubenswrapper[4898]: I0313 14:11:56.611499 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" Mar 13 14:11:56 crc kubenswrapper[4898]: I0313 14:11:56.743468 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" Mar 13 14:11:56 crc kubenswrapper[4898]: I0313 14:11:56.816346 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" Mar 13 14:11:57 crc kubenswrapper[4898]: 
I0313 14:11:57.948912 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-index-gateway-0" Mar 13 14:11:58 crc kubenswrapper[4898]: I0313 14:11:58.041451 4898 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Mar 13 14:11:58 crc kubenswrapper[4898]: I0313 14:11:58.041505 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="2194d847-4858-4f46-ab8b-c2d78cf5677e" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 13 14:11:58 crc kubenswrapper[4898]: I0313 14:11:58.095964 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-compactor-0" Mar 13 14:12:00 crc kubenswrapper[4898]: I0313 14:12:00.137434 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556852-z8l5q"] Mar 13 14:12:00 crc kubenswrapper[4898]: I0313 14:12:00.139392 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556852-z8l5q" Mar 13 14:12:00 crc kubenswrapper[4898]: I0313 14:12:00.141824 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:12:00 crc kubenswrapper[4898]: I0313 14:12:00.142106 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:12:00 crc kubenswrapper[4898]: I0313 14:12:00.142628 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:12:00 crc kubenswrapper[4898]: I0313 14:12:00.144865 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556852-z8l5q"] Mar 13 14:12:00 crc kubenswrapper[4898]: I0313 14:12:00.261072 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r75v\" (UniqueName: \"kubernetes.io/projected/da054881-deef-4491-9685-5f35ee9fc45f-kube-api-access-6r75v\") pod \"auto-csr-approver-29556852-z8l5q\" (UID: \"da054881-deef-4491-9685-5f35ee9fc45f\") " pod="openshift-infra/auto-csr-approver-29556852-z8l5q" Mar 13 14:12:00 crc kubenswrapper[4898]: I0313 14:12:00.362940 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r75v\" (UniqueName: \"kubernetes.io/projected/da054881-deef-4491-9685-5f35ee9fc45f-kube-api-access-6r75v\") pod \"auto-csr-approver-29556852-z8l5q\" (UID: \"da054881-deef-4491-9685-5f35ee9fc45f\") " pod="openshift-infra/auto-csr-approver-29556852-z8l5q" Mar 13 14:12:00 crc kubenswrapper[4898]: I0313 14:12:00.394131 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r75v\" (UniqueName: \"kubernetes.io/projected/da054881-deef-4491-9685-5f35ee9fc45f-kube-api-access-6r75v\") pod \"auto-csr-approver-29556852-z8l5q\" (UID: \"da054881-deef-4491-9685-5f35ee9fc45f\") " 
pod="openshift-infra/auto-csr-approver-29556852-z8l5q" Mar 13 14:12:00 crc kubenswrapper[4898]: I0313 14:12:00.463597 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556852-z8l5q" Mar 13 14:12:00 crc kubenswrapper[4898]: I0313 14:12:00.727930 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556852-z8l5q"] Mar 13 14:12:00 crc kubenswrapper[4898]: W0313 14:12:00.731059 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda054881_deef_4491_9685_5f35ee9fc45f.slice/crio-f7218a8006a20357a5e70b15e04df0fe0dbc1453185a2df0602bc419da843703 WatchSource:0}: Error finding container f7218a8006a20357a5e70b15e04df0fe0dbc1453185a2df0602bc419da843703: Status 404 returned error can't find the container with id f7218a8006a20357a5e70b15e04df0fe0dbc1453185a2df0602bc419da843703 Mar 13 14:12:00 crc kubenswrapper[4898]: I0313 14:12:00.938481 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556852-z8l5q" event={"ID":"da054881-deef-4491-9685-5f35ee9fc45f","Type":"ContainerStarted","Data":"f7218a8006a20357a5e70b15e04df0fe0dbc1453185a2df0602bc419da843703"} Mar 13 14:12:02 crc kubenswrapper[4898]: I0313 14:12:02.957115 4898 generic.go:334] "Generic (PLEG): container finished" podID="da054881-deef-4491-9685-5f35ee9fc45f" containerID="5010d4732869bc8d7e0532b7c193085ea8336efd3a5d0f8cd686f3caf758e4d9" exitCode=0 Mar 13 14:12:02 crc kubenswrapper[4898]: I0313 14:12:02.957167 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556852-z8l5q" event={"ID":"da054881-deef-4491-9685-5f35ee9fc45f","Type":"ContainerDied","Data":"5010d4732869bc8d7e0532b7c193085ea8336efd3a5d0f8cd686f3caf758e4d9"} Mar 13 14:12:04 crc kubenswrapper[4898]: I0313 14:12:04.307017 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556852-z8l5q" Mar 13 14:12:04 crc kubenswrapper[4898]: I0313 14:12:04.434455 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6r75v\" (UniqueName: \"kubernetes.io/projected/da054881-deef-4491-9685-5f35ee9fc45f-kube-api-access-6r75v\") pod \"da054881-deef-4491-9685-5f35ee9fc45f\" (UID: \"da054881-deef-4491-9685-5f35ee9fc45f\") " Mar 13 14:12:04 crc kubenswrapper[4898]: I0313 14:12:04.441630 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da054881-deef-4491-9685-5f35ee9fc45f-kube-api-access-6r75v" (OuterVolumeSpecName: "kube-api-access-6r75v") pod "da054881-deef-4491-9685-5f35ee9fc45f" (UID: "da054881-deef-4491-9685-5f35ee9fc45f"). InnerVolumeSpecName "kube-api-access-6r75v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:12:04 crc kubenswrapper[4898]: I0313 14:12:04.537172 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6r75v\" (UniqueName: \"kubernetes.io/projected/da054881-deef-4491-9685-5f35ee9fc45f-kube-api-access-6r75v\") on node \"crc\" DevicePath \"\"" Mar 13 14:12:04 crc kubenswrapper[4898]: I0313 14:12:04.975395 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556852-z8l5q" event={"ID":"da054881-deef-4491-9685-5f35ee9fc45f","Type":"ContainerDied","Data":"f7218a8006a20357a5e70b15e04df0fe0dbc1453185a2df0602bc419da843703"} Mar 13 14:12:04 crc kubenswrapper[4898]: I0313 14:12:04.975436 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7218a8006a20357a5e70b15e04df0fe0dbc1453185a2df0602bc419da843703" Mar 13 14:12:04 crc kubenswrapper[4898]: I0313 14:12:04.975509 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556852-z8l5q" Mar 13 14:12:05 crc kubenswrapper[4898]: I0313 14:12:05.385686 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556846-d6zpp"] Mar 13 14:12:05 crc kubenswrapper[4898]: I0313 14:12:05.391753 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556846-d6zpp"] Mar 13 14:12:05 crc kubenswrapper[4898]: I0313 14:12:05.757860 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="666e4c5d-e464-4b8a-b167-bc7624fc3e10" path="/var/lib/kubelet/pods/666e4c5d-e464-4b8a-b167-bc7624fc3e10/volumes" Mar 13 14:12:08 crc kubenswrapper[4898]: I0313 14:12:08.040424 4898 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Mar 13 14:12:08 crc kubenswrapper[4898]: I0313 14:12:08.040501 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="2194d847-4858-4f46-ab8b-c2d78cf5677e" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 13 14:12:18 crc kubenswrapper[4898]: I0313 14:12:18.067170 4898 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Mar 13 14:12:18 crc kubenswrapper[4898]: I0313 14:12:18.067855 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="2194d847-4858-4f46-ab8b-c2d78cf5677e" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 13 14:12:18 crc kubenswrapper[4898]: 
I0313 14:12:18.877706 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x8rbg"] Mar 13 14:12:18 crc kubenswrapper[4898]: E0313 14:12:18.878874 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da054881-deef-4491-9685-5f35ee9fc45f" containerName="oc" Mar 13 14:12:18 crc kubenswrapper[4898]: I0313 14:12:18.878930 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="da054881-deef-4491-9685-5f35ee9fc45f" containerName="oc" Mar 13 14:12:18 crc kubenswrapper[4898]: I0313 14:12:18.879167 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="da054881-deef-4491-9685-5f35ee9fc45f" containerName="oc" Mar 13 14:12:18 crc kubenswrapper[4898]: I0313 14:12:18.881167 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x8rbg" Mar 13 14:12:18 crc kubenswrapper[4898]: I0313 14:12:18.885862 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x8rbg"] Mar 13 14:12:19 crc kubenswrapper[4898]: I0313 14:12:19.019532 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/715d729a-a993-4a3a-98a2-58f904ef7f6b-utilities\") pod \"redhat-marketplace-x8rbg\" (UID: \"715d729a-a993-4a3a-98a2-58f904ef7f6b\") " pod="openshift-marketplace/redhat-marketplace-x8rbg" Mar 13 14:12:19 crc kubenswrapper[4898]: I0313 14:12:19.019998 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/715d729a-a993-4a3a-98a2-58f904ef7f6b-catalog-content\") pod \"redhat-marketplace-x8rbg\" (UID: \"715d729a-a993-4a3a-98a2-58f904ef7f6b\") " pod="openshift-marketplace/redhat-marketplace-x8rbg" Mar 13 14:12:19 crc kubenswrapper[4898]: I0313 14:12:19.020103 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvt8c\" (UniqueName: \"kubernetes.io/projected/715d729a-a993-4a3a-98a2-58f904ef7f6b-kube-api-access-rvt8c\") pod \"redhat-marketplace-x8rbg\" (UID: \"715d729a-a993-4a3a-98a2-58f904ef7f6b\") " pod="openshift-marketplace/redhat-marketplace-x8rbg" Mar 13 14:12:19 crc kubenswrapper[4898]: I0313 14:12:19.121890 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/715d729a-a993-4a3a-98a2-58f904ef7f6b-catalog-content\") pod \"redhat-marketplace-x8rbg\" (UID: \"715d729a-a993-4a3a-98a2-58f904ef7f6b\") " pod="openshift-marketplace/redhat-marketplace-x8rbg" Mar 13 14:12:19 crc kubenswrapper[4898]: I0313 14:12:19.122044 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvt8c\" (UniqueName: \"kubernetes.io/projected/715d729a-a993-4a3a-98a2-58f904ef7f6b-kube-api-access-rvt8c\") pod \"redhat-marketplace-x8rbg\" (UID: \"715d729a-a993-4a3a-98a2-58f904ef7f6b\") " pod="openshift-marketplace/redhat-marketplace-x8rbg" Mar 13 14:12:19 crc kubenswrapper[4898]: I0313 14:12:19.122183 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/715d729a-a993-4a3a-98a2-58f904ef7f6b-utilities\") pod \"redhat-marketplace-x8rbg\" (UID: \"715d729a-a993-4a3a-98a2-58f904ef7f6b\") " pod="openshift-marketplace/redhat-marketplace-x8rbg" Mar 13 14:12:19 crc kubenswrapper[4898]: I0313 14:12:19.122802 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/715d729a-a993-4a3a-98a2-58f904ef7f6b-catalog-content\") pod \"redhat-marketplace-x8rbg\" (UID: \"715d729a-a993-4a3a-98a2-58f904ef7f6b\") " pod="openshift-marketplace/redhat-marketplace-x8rbg" Mar 13 14:12:19 crc kubenswrapper[4898]: I0313 14:12:19.122887 4898 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/715d729a-a993-4a3a-98a2-58f904ef7f6b-utilities\") pod \"redhat-marketplace-x8rbg\" (UID: \"715d729a-a993-4a3a-98a2-58f904ef7f6b\") " pod="openshift-marketplace/redhat-marketplace-x8rbg" Mar 13 14:12:19 crc kubenswrapper[4898]: I0313 14:12:19.134487 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:12:19 crc kubenswrapper[4898]: I0313 14:12:19.134577 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:12:19 crc kubenswrapper[4898]: I0313 14:12:19.154276 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvt8c\" (UniqueName: \"kubernetes.io/projected/715d729a-a993-4a3a-98a2-58f904ef7f6b-kube-api-access-rvt8c\") pod \"redhat-marketplace-x8rbg\" (UID: \"715d729a-a993-4a3a-98a2-58f904ef7f6b\") " pod="openshift-marketplace/redhat-marketplace-x8rbg" Mar 13 14:12:19 crc kubenswrapper[4898]: I0313 14:12:19.209217 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x8rbg" Mar 13 14:12:19 crc kubenswrapper[4898]: I0313 14:12:19.637350 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x8rbg"] Mar 13 14:12:20 crc kubenswrapper[4898]: I0313 14:12:20.105418 4898 generic.go:334] "Generic (PLEG): container finished" podID="715d729a-a993-4a3a-98a2-58f904ef7f6b" containerID="03e5d76c2ef48d4d76f4111af3bc7856f5e1b17ed603f9c78a2d13ffcf05edab" exitCode=0 Mar 13 14:12:20 crc kubenswrapper[4898]: I0313 14:12:20.105486 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x8rbg" event={"ID":"715d729a-a993-4a3a-98a2-58f904ef7f6b","Type":"ContainerDied","Data":"03e5d76c2ef48d4d76f4111af3bc7856f5e1b17ed603f9c78a2d13ffcf05edab"} Mar 13 14:12:20 crc kubenswrapper[4898]: I0313 14:12:20.105522 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x8rbg" event={"ID":"715d729a-a993-4a3a-98a2-58f904ef7f6b","Type":"ContainerStarted","Data":"9b9efbda102cfb59931b3b2dfe9347c23bb3d5bb9a19770896f214f7234be802"} Mar 13 14:12:23 crc kubenswrapper[4898]: I0313 14:12:23.133664 4898 generic.go:334] "Generic (PLEG): container finished" podID="715d729a-a993-4a3a-98a2-58f904ef7f6b" containerID="2a25837fc0c0e8cd6200157bc14ddc8f0fab1783c71bf24d20c54a3a2768ca93" exitCode=0 Mar 13 14:12:23 crc kubenswrapper[4898]: I0313 14:12:23.133777 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x8rbg" event={"ID":"715d729a-a993-4a3a-98a2-58f904ef7f6b","Type":"ContainerDied","Data":"2a25837fc0c0e8cd6200157bc14ddc8f0fab1783c71bf24d20c54a3a2768ca93"} Mar 13 14:12:26 crc kubenswrapper[4898]: I0313 14:12:26.158501 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x8rbg" 
event={"ID":"715d729a-a993-4a3a-98a2-58f904ef7f6b","Type":"ContainerStarted","Data":"1d57ef123c695ddf9c7483fec39ed71eba899163e636c0e64229cca48af10fdf"} Mar 13 14:12:28 crc kubenswrapper[4898]: I0313 14:12:28.038120 4898 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Mar 13 14:12:28 crc kubenswrapper[4898]: I0313 14:12:28.038509 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="2194d847-4858-4f46-ab8b-c2d78cf5677e" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 13 14:12:29 crc kubenswrapper[4898]: I0313 14:12:29.210455 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x8rbg" Mar 13 14:12:29 crc kubenswrapper[4898]: I0313 14:12:29.210504 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x8rbg" Mar 13 14:12:29 crc kubenswrapper[4898]: I0313 14:12:29.283656 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x8rbg" Mar 13 14:12:29 crc kubenswrapper[4898]: I0313 14:12:29.320477 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x8rbg" podStartSLOduration=6.4356755 podStartE2EDuration="11.32045733s" podCreationTimestamp="2026-03-13 14:12:18 +0000 UTC" firstStartedPulling="2026-03-13 14:12:20.107315926 +0000 UTC m=+975.108904175" lastFinishedPulling="2026-03-13 14:12:24.992097726 +0000 UTC m=+979.993686005" observedRunningTime="2026-03-13 14:12:26.185429122 +0000 UTC m=+981.187017371" watchObservedRunningTime="2026-03-13 14:12:29.32045733 +0000 UTC m=+984.322045569" Mar 13 
14:12:29 crc kubenswrapper[4898]: I0313 14:12:29.536675 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8vtbq"] Mar 13 14:12:29 crc kubenswrapper[4898]: I0313 14:12:29.538574 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8vtbq" Mar 13 14:12:29 crc kubenswrapper[4898]: I0313 14:12:29.555627 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8vtbq"] Mar 13 14:12:29 crc kubenswrapper[4898]: I0313 14:12:29.703589 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd6314a3-ad6a-48ea-b54a-a2d1415b287e-utilities\") pod \"community-operators-8vtbq\" (UID: \"fd6314a3-ad6a-48ea-b54a-a2d1415b287e\") " pod="openshift-marketplace/community-operators-8vtbq" Mar 13 14:12:29 crc kubenswrapper[4898]: I0313 14:12:29.703708 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd6314a3-ad6a-48ea-b54a-a2d1415b287e-catalog-content\") pod \"community-operators-8vtbq\" (UID: \"fd6314a3-ad6a-48ea-b54a-a2d1415b287e\") " pod="openshift-marketplace/community-operators-8vtbq" Mar 13 14:12:29 crc kubenswrapper[4898]: I0313 14:12:29.703949 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmzfb\" (UniqueName: \"kubernetes.io/projected/fd6314a3-ad6a-48ea-b54a-a2d1415b287e-kube-api-access-xmzfb\") pod \"community-operators-8vtbq\" (UID: \"fd6314a3-ad6a-48ea-b54a-a2d1415b287e\") " pod="openshift-marketplace/community-operators-8vtbq" Mar 13 14:12:29 crc kubenswrapper[4898]: I0313 14:12:29.806301 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/fd6314a3-ad6a-48ea-b54a-a2d1415b287e-utilities\") pod \"community-operators-8vtbq\" (UID: \"fd6314a3-ad6a-48ea-b54a-a2d1415b287e\") " pod="openshift-marketplace/community-operators-8vtbq" Mar 13 14:12:29 crc kubenswrapper[4898]: I0313 14:12:29.806366 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd6314a3-ad6a-48ea-b54a-a2d1415b287e-catalog-content\") pod \"community-operators-8vtbq\" (UID: \"fd6314a3-ad6a-48ea-b54a-a2d1415b287e\") " pod="openshift-marketplace/community-operators-8vtbq" Mar 13 14:12:29 crc kubenswrapper[4898]: I0313 14:12:29.806403 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmzfb\" (UniqueName: \"kubernetes.io/projected/fd6314a3-ad6a-48ea-b54a-a2d1415b287e-kube-api-access-xmzfb\") pod \"community-operators-8vtbq\" (UID: \"fd6314a3-ad6a-48ea-b54a-a2d1415b287e\") " pod="openshift-marketplace/community-operators-8vtbq" Mar 13 14:12:29 crc kubenswrapper[4898]: I0313 14:12:29.807113 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd6314a3-ad6a-48ea-b54a-a2d1415b287e-catalog-content\") pod \"community-operators-8vtbq\" (UID: \"fd6314a3-ad6a-48ea-b54a-a2d1415b287e\") " pod="openshift-marketplace/community-operators-8vtbq" Mar 13 14:12:29 crc kubenswrapper[4898]: I0313 14:12:29.807128 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd6314a3-ad6a-48ea-b54a-a2d1415b287e-utilities\") pod \"community-operators-8vtbq\" (UID: \"fd6314a3-ad6a-48ea-b54a-a2d1415b287e\") " pod="openshift-marketplace/community-operators-8vtbq" Mar 13 14:12:29 crc kubenswrapper[4898]: I0313 14:12:29.830676 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmzfb\" (UniqueName: 
\"kubernetes.io/projected/fd6314a3-ad6a-48ea-b54a-a2d1415b287e-kube-api-access-xmzfb\") pod \"community-operators-8vtbq\" (UID: \"fd6314a3-ad6a-48ea-b54a-a2d1415b287e\") " pod="openshift-marketplace/community-operators-8vtbq" Mar 13 14:12:29 crc kubenswrapper[4898]: I0313 14:12:29.857409 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8vtbq" Mar 13 14:12:30 crc kubenswrapper[4898]: I0313 14:12:30.236511 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x8rbg" Mar 13 14:12:30 crc kubenswrapper[4898]: I0313 14:12:30.389360 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8vtbq"] Mar 13 14:12:31 crc kubenswrapper[4898]: I0313 14:12:31.212380 4898 generic.go:334] "Generic (PLEG): container finished" podID="fd6314a3-ad6a-48ea-b54a-a2d1415b287e" containerID="b5a3fa8a34341c83229a787919abeec42db4b594e78e4b2bca60e4579eec7b84" exitCode=0 Mar 13 14:12:31 crc kubenswrapper[4898]: I0313 14:12:31.212610 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vtbq" event={"ID":"fd6314a3-ad6a-48ea-b54a-a2d1415b287e","Type":"ContainerDied","Data":"b5a3fa8a34341c83229a787919abeec42db4b594e78e4b2bca60e4579eec7b84"} Mar 13 14:12:31 crc kubenswrapper[4898]: I0313 14:12:31.212715 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vtbq" event={"ID":"fd6314a3-ad6a-48ea-b54a-a2d1415b287e","Type":"ContainerStarted","Data":"04b8094af8db214b2ee0dda013e626767a66d4ace84639cedc7f083561522032"} Mar 13 14:12:32 crc kubenswrapper[4898]: I0313 14:12:32.500867 4898 scope.go:117] "RemoveContainer" containerID="9c70e0bed8678da48508773f6b5163cca47cd975b196edd773fb1f955ef9672b" Mar 13 14:12:32 crc kubenswrapper[4898]: I0313 14:12:32.542448 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-x8rbg"] Mar 13 14:12:33 crc kubenswrapper[4898]: I0313 14:12:33.230327 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x8rbg" podUID="715d729a-a993-4a3a-98a2-58f904ef7f6b" containerName="registry-server" containerID="cri-o://1d57ef123c695ddf9c7483fec39ed71eba899163e636c0e64229cca48af10fdf" gracePeriod=2 Mar 13 14:12:33 crc kubenswrapper[4898]: I0313 14:12:33.647020 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x8rbg" Mar 13 14:12:33 crc kubenswrapper[4898]: I0313 14:12:33.775377 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/715d729a-a993-4a3a-98a2-58f904ef7f6b-catalog-content\") pod \"715d729a-a993-4a3a-98a2-58f904ef7f6b\" (UID: \"715d729a-a993-4a3a-98a2-58f904ef7f6b\") " Mar 13 14:12:33 crc kubenswrapper[4898]: I0313 14:12:33.775454 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvt8c\" (UniqueName: \"kubernetes.io/projected/715d729a-a993-4a3a-98a2-58f904ef7f6b-kube-api-access-rvt8c\") pod \"715d729a-a993-4a3a-98a2-58f904ef7f6b\" (UID: \"715d729a-a993-4a3a-98a2-58f904ef7f6b\") " Mar 13 14:12:33 crc kubenswrapper[4898]: I0313 14:12:33.775497 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/715d729a-a993-4a3a-98a2-58f904ef7f6b-utilities\") pod \"715d729a-a993-4a3a-98a2-58f904ef7f6b\" (UID: \"715d729a-a993-4a3a-98a2-58f904ef7f6b\") " Mar 13 14:12:33 crc kubenswrapper[4898]: I0313 14:12:33.777295 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/715d729a-a993-4a3a-98a2-58f904ef7f6b-utilities" (OuterVolumeSpecName: "utilities") pod "715d729a-a993-4a3a-98a2-58f904ef7f6b" (UID: 
"715d729a-a993-4a3a-98a2-58f904ef7f6b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:12:33 crc kubenswrapper[4898]: I0313 14:12:33.782890 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/715d729a-a993-4a3a-98a2-58f904ef7f6b-kube-api-access-rvt8c" (OuterVolumeSpecName: "kube-api-access-rvt8c") pod "715d729a-a993-4a3a-98a2-58f904ef7f6b" (UID: "715d729a-a993-4a3a-98a2-58f904ef7f6b"). InnerVolumeSpecName "kube-api-access-rvt8c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:12:33 crc kubenswrapper[4898]: I0313 14:12:33.805757 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/715d729a-a993-4a3a-98a2-58f904ef7f6b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "715d729a-a993-4a3a-98a2-58f904ef7f6b" (UID: "715d729a-a993-4a3a-98a2-58f904ef7f6b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:12:33 crc kubenswrapper[4898]: I0313 14:12:33.878622 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/715d729a-a993-4a3a-98a2-58f904ef7f6b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 14:12:33 crc kubenswrapper[4898]: I0313 14:12:33.878684 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvt8c\" (UniqueName: \"kubernetes.io/projected/715d729a-a993-4a3a-98a2-58f904ef7f6b-kube-api-access-rvt8c\") on node \"crc\" DevicePath \"\"" Mar 13 14:12:33 crc kubenswrapper[4898]: I0313 14:12:33.878700 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/715d729a-a993-4a3a-98a2-58f904ef7f6b-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 14:12:34 crc kubenswrapper[4898]: I0313 14:12:34.240164 4898 generic.go:334] "Generic (PLEG): container finished" 
podID="715d729a-a993-4a3a-98a2-58f904ef7f6b" containerID="1d57ef123c695ddf9c7483fec39ed71eba899163e636c0e64229cca48af10fdf" exitCode=0 Mar 13 14:12:34 crc kubenswrapper[4898]: I0313 14:12:34.240206 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x8rbg" event={"ID":"715d729a-a993-4a3a-98a2-58f904ef7f6b","Type":"ContainerDied","Data":"1d57ef123c695ddf9c7483fec39ed71eba899163e636c0e64229cca48af10fdf"} Mar 13 14:12:34 crc kubenswrapper[4898]: I0313 14:12:34.240235 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x8rbg" event={"ID":"715d729a-a993-4a3a-98a2-58f904ef7f6b","Type":"ContainerDied","Data":"9b9efbda102cfb59931b3b2dfe9347c23bb3d5bb9a19770896f214f7234be802"} Mar 13 14:12:34 crc kubenswrapper[4898]: I0313 14:12:34.240253 4898 scope.go:117] "RemoveContainer" containerID="1d57ef123c695ddf9c7483fec39ed71eba899163e636c0e64229cca48af10fdf" Mar 13 14:12:34 crc kubenswrapper[4898]: I0313 14:12:34.240844 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x8rbg" Mar 13 14:12:34 crc kubenswrapper[4898]: I0313 14:12:34.272380 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x8rbg"] Mar 13 14:12:34 crc kubenswrapper[4898]: I0313 14:12:34.277497 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x8rbg"] Mar 13 14:12:34 crc kubenswrapper[4898]: I0313 14:12:34.286632 4898 scope.go:117] "RemoveContainer" containerID="2a25837fc0c0e8cd6200157bc14ddc8f0fab1783c71bf24d20c54a3a2768ca93" Mar 13 14:12:34 crc kubenswrapper[4898]: I0313 14:12:34.344316 4898 scope.go:117] "RemoveContainer" containerID="03e5d76c2ef48d4d76f4111af3bc7856f5e1b17ed603f9c78a2d13ffcf05edab" Mar 13 14:12:34 crc kubenswrapper[4898]: I0313 14:12:34.385178 4898 scope.go:117] "RemoveContainer" containerID="1d57ef123c695ddf9c7483fec39ed71eba899163e636c0e64229cca48af10fdf" Mar 13 14:12:34 crc kubenswrapper[4898]: E0313 14:12:34.385736 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d57ef123c695ddf9c7483fec39ed71eba899163e636c0e64229cca48af10fdf\": container with ID starting with 1d57ef123c695ddf9c7483fec39ed71eba899163e636c0e64229cca48af10fdf not found: ID does not exist" containerID="1d57ef123c695ddf9c7483fec39ed71eba899163e636c0e64229cca48af10fdf" Mar 13 14:12:34 crc kubenswrapper[4898]: I0313 14:12:34.385773 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d57ef123c695ddf9c7483fec39ed71eba899163e636c0e64229cca48af10fdf"} err="failed to get container status \"1d57ef123c695ddf9c7483fec39ed71eba899163e636c0e64229cca48af10fdf\": rpc error: code = NotFound desc = could not find container \"1d57ef123c695ddf9c7483fec39ed71eba899163e636c0e64229cca48af10fdf\": container with ID starting with 1d57ef123c695ddf9c7483fec39ed71eba899163e636c0e64229cca48af10fdf not found: 
ID does not exist" Mar 13 14:12:34 crc kubenswrapper[4898]: I0313 14:12:34.385798 4898 scope.go:117] "RemoveContainer" containerID="2a25837fc0c0e8cd6200157bc14ddc8f0fab1783c71bf24d20c54a3a2768ca93" Mar 13 14:12:34 crc kubenswrapper[4898]: E0313 14:12:34.386257 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a25837fc0c0e8cd6200157bc14ddc8f0fab1783c71bf24d20c54a3a2768ca93\": container with ID starting with 2a25837fc0c0e8cd6200157bc14ddc8f0fab1783c71bf24d20c54a3a2768ca93 not found: ID does not exist" containerID="2a25837fc0c0e8cd6200157bc14ddc8f0fab1783c71bf24d20c54a3a2768ca93" Mar 13 14:12:34 crc kubenswrapper[4898]: I0313 14:12:34.386317 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a25837fc0c0e8cd6200157bc14ddc8f0fab1783c71bf24d20c54a3a2768ca93"} err="failed to get container status \"2a25837fc0c0e8cd6200157bc14ddc8f0fab1783c71bf24d20c54a3a2768ca93\": rpc error: code = NotFound desc = could not find container \"2a25837fc0c0e8cd6200157bc14ddc8f0fab1783c71bf24d20c54a3a2768ca93\": container with ID starting with 2a25837fc0c0e8cd6200157bc14ddc8f0fab1783c71bf24d20c54a3a2768ca93 not found: ID does not exist" Mar 13 14:12:34 crc kubenswrapper[4898]: I0313 14:12:34.386368 4898 scope.go:117] "RemoveContainer" containerID="03e5d76c2ef48d4d76f4111af3bc7856f5e1b17ed603f9c78a2d13ffcf05edab" Mar 13 14:12:34 crc kubenswrapper[4898]: E0313 14:12:34.386735 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03e5d76c2ef48d4d76f4111af3bc7856f5e1b17ed603f9c78a2d13ffcf05edab\": container with ID starting with 03e5d76c2ef48d4d76f4111af3bc7856f5e1b17ed603f9c78a2d13ffcf05edab not found: ID does not exist" containerID="03e5d76c2ef48d4d76f4111af3bc7856f5e1b17ed603f9c78a2d13ffcf05edab" Mar 13 14:12:34 crc kubenswrapper[4898]: I0313 14:12:34.386810 4898 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03e5d76c2ef48d4d76f4111af3bc7856f5e1b17ed603f9c78a2d13ffcf05edab"} err="failed to get container status \"03e5d76c2ef48d4d76f4111af3bc7856f5e1b17ed603f9c78a2d13ffcf05edab\": rpc error: code = NotFound desc = could not find container \"03e5d76c2ef48d4d76f4111af3bc7856f5e1b17ed603f9c78a2d13ffcf05edab\": container with ID starting with 03e5d76c2ef48d4d76f4111af3bc7856f5e1b17ed603f9c78a2d13ffcf05edab not found: ID does not exist" Mar 13 14:12:35 crc kubenswrapper[4898]: I0313 14:12:35.251601 4898 generic.go:334] "Generic (PLEG): container finished" podID="fd6314a3-ad6a-48ea-b54a-a2d1415b287e" containerID="55f2353b52acfcba2e9a9d07e6194d7265595a9010a025929275d655e6d8ecf7" exitCode=0 Mar 13 14:12:35 crc kubenswrapper[4898]: I0313 14:12:35.251695 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vtbq" event={"ID":"fd6314a3-ad6a-48ea-b54a-a2d1415b287e","Type":"ContainerDied","Data":"55f2353b52acfcba2e9a9d07e6194d7265595a9010a025929275d655e6d8ecf7"} Mar 13 14:12:35 crc kubenswrapper[4898]: I0313 14:12:35.749961 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="715d729a-a993-4a3a-98a2-58f904ef7f6b" path="/var/lib/kubelet/pods/715d729a-a993-4a3a-98a2-58f904ef7f6b/volumes" Mar 13 14:12:38 crc kubenswrapper[4898]: I0313 14:12:38.043139 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-ingester-0" Mar 13 14:12:39 crc kubenswrapper[4898]: I0313 14:12:39.298840 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vtbq" event={"ID":"fd6314a3-ad6a-48ea-b54a-a2d1415b287e","Type":"ContainerStarted","Data":"a2c7fc3c93c843330a8d6bb55cc7d70838af37cb6b7c3b8a25f9ff794013c615"} Mar 13 14:12:39 crc kubenswrapper[4898]: I0313 14:12:39.324355 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-8vtbq" podStartSLOduration=3.8433358970000002 podStartE2EDuration="10.324330541s" podCreationTimestamp="2026-03-13 14:12:29 +0000 UTC" firstStartedPulling="2026-03-13 14:12:31.215354757 +0000 UTC m=+986.216943026" lastFinishedPulling="2026-03-13 14:12:37.696349391 +0000 UTC m=+992.697937670" observedRunningTime="2026-03-13 14:12:39.316249313 +0000 UTC m=+994.317837562" watchObservedRunningTime="2026-03-13 14:12:39.324330541 +0000 UTC m=+994.325918810" Mar 13 14:12:39 crc kubenswrapper[4898]: I0313 14:12:39.859039 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8vtbq" Mar 13 14:12:39 crc kubenswrapper[4898]: I0313 14:12:39.875368 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8vtbq" Mar 13 14:12:40 crc kubenswrapper[4898]: I0313 14:12:40.915273 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-8vtbq" podUID="fd6314a3-ad6a-48ea-b54a-a2d1415b287e" containerName="registry-server" probeResult="failure" output=< Mar 13 14:12:40 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 14:12:40 crc kubenswrapper[4898]: > Mar 13 14:12:49 crc kubenswrapper[4898]: I0313 14:12:49.134968 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:12:49 crc kubenswrapper[4898]: I0313 14:12:49.135551 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:12:49 crc kubenswrapper[4898]: I0313 14:12:49.921728 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8vtbq" Mar 13 14:12:49 crc kubenswrapper[4898]: I0313 14:12:49.976698 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8vtbq" Mar 13 14:12:50 crc kubenswrapper[4898]: I0313 14:12:50.160570 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8vtbq"] Mar 13 14:12:51 crc kubenswrapper[4898]: I0313 14:12:51.400823 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8vtbq" podUID="fd6314a3-ad6a-48ea-b54a-a2d1415b287e" containerName="registry-server" containerID="cri-o://a2c7fc3c93c843330a8d6bb55cc7d70838af37cb6b7c3b8a25f9ff794013c615" gracePeriod=2 Mar 13 14:12:51 crc kubenswrapper[4898]: I0313 14:12:51.785436 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8vtbq" Mar 13 14:12:51 crc kubenswrapper[4898]: I0313 14:12:51.897647 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd6314a3-ad6a-48ea-b54a-a2d1415b287e-catalog-content\") pod \"fd6314a3-ad6a-48ea-b54a-a2d1415b287e\" (UID: \"fd6314a3-ad6a-48ea-b54a-a2d1415b287e\") " Mar 13 14:12:51 crc kubenswrapper[4898]: I0313 14:12:51.897761 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd6314a3-ad6a-48ea-b54a-a2d1415b287e-utilities\") pod \"fd6314a3-ad6a-48ea-b54a-a2d1415b287e\" (UID: \"fd6314a3-ad6a-48ea-b54a-a2d1415b287e\") " Mar 13 14:12:51 crc kubenswrapper[4898]: I0313 14:12:51.897833 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmzfb\" (UniqueName: \"kubernetes.io/projected/fd6314a3-ad6a-48ea-b54a-a2d1415b287e-kube-api-access-xmzfb\") pod \"fd6314a3-ad6a-48ea-b54a-a2d1415b287e\" (UID: \"fd6314a3-ad6a-48ea-b54a-a2d1415b287e\") " Mar 13 14:12:51 crc kubenswrapper[4898]: I0313 14:12:51.899161 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd6314a3-ad6a-48ea-b54a-a2d1415b287e-utilities" (OuterVolumeSpecName: "utilities") pod "fd6314a3-ad6a-48ea-b54a-a2d1415b287e" (UID: "fd6314a3-ad6a-48ea-b54a-a2d1415b287e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:12:51 crc kubenswrapper[4898]: I0313 14:12:51.902958 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd6314a3-ad6a-48ea-b54a-a2d1415b287e-kube-api-access-xmzfb" (OuterVolumeSpecName: "kube-api-access-xmzfb") pod "fd6314a3-ad6a-48ea-b54a-a2d1415b287e" (UID: "fd6314a3-ad6a-48ea-b54a-a2d1415b287e"). InnerVolumeSpecName "kube-api-access-xmzfb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:12:51 crc kubenswrapper[4898]: I0313 14:12:51.958417 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd6314a3-ad6a-48ea-b54a-a2d1415b287e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fd6314a3-ad6a-48ea-b54a-a2d1415b287e" (UID: "fd6314a3-ad6a-48ea-b54a-a2d1415b287e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:12:52 crc kubenswrapper[4898]: I0313 14:12:52.000011 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd6314a3-ad6a-48ea-b54a-a2d1415b287e-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 14:12:52 crc kubenswrapper[4898]: I0313 14:12:52.000047 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmzfb\" (UniqueName: \"kubernetes.io/projected/fd6314a3-ad6a-48ea-b54a-a2d1415b287e-kube-api-access-xmzfb\") on node \"crc\" DevicePath \"\"" Mar 13 14:12:52 crc kubenswrapper[4898]: I0313 14:12:52.000062 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd6314a3-ad6a-48ea-b54a-a2d1415b287e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 14:12:52 crc kubenswrapper[4898]: I0313 14:12:52.412619 4898 generic.go:334] "Generic (PLEG): container finished" podID="fd6314a3-ad6a-48ea-b54a-a2d1415b287e" containerID="a2c7fc3c93c843330a8d6bb55cc7d70838af37cb6b7c3b8a25f9ff794013c615" exitCode=0 Mar 13 14:12:52 crc kubenswrapper[4898]: I0313 14:12:52.412699 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8vtbq" Mar 13 14:12:52 crc kubenswrapper[4898]: I0313 14:12:52.412729 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vtbq" event={"ID":"fd6314a3-ad6a-48ea-b54a-a2d1415b287e","Type":"ContainerDied","Data":"a2c7fc3c93c843330a8d6bb55cc7d70838af37cb6b7c3b8a25f9ff794013c615"} Mar 13 14:12:52 crc kubenswrapper[4898]: I0313 14:12:52.412924 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vtbq" event={"ID":"fd6314a3-ad6a-48ea-b54a-a2d1415b287e","Type":"ContainerDied","Data":"04b8094af8db214b2ee0dda013e626767a66d4ace84639cedc7f083561522032"} Mar 13 14:12:52 crc kubenswrapper[4898]: I0313 14:12:52.412947 4898 scope.go:117] "RemoveContainer" containerID="a2c7fc3c93c843330a8d6bb55cc7d70838af37cb6b7c3b8a25f9ff794013c615" Mar 13 14:12:52 crc kubenswrapper[4898]: I0313 14:12:52.435865 4898 scope.go:117] "RemoveContainer" containerID="55f2353b52acfcba2e9a9d07e6194d7265595a9010a025929275d655e6d8ecf7" Mar 13 14:12:52 crc kubenswrapper[4898]: I0313 14:12:52.470316 4898 scope.go:117] "RemoveContainer" containerID="b5a3fa8a34341c83229a787919abeec42db4b594e78e4b2bca60e4579eec7b84" Mar 13 14:12:52 crc kubenswrapper[4898]: I0313 14:12:52.485431 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8vtbq"] Mar 13 14:12:52 crc kubenswrapper[4898]: I0313 14:12:52.498966 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8vtbq"] Mar 13 14:12:52 crc kubenswrapper[4898]: I0313 14:12:52.501648 4898 scope.go:117] "RemoveContainer" containerID="a2c7fc3c93c843330a8d6bb55cc7d70838af37cb6b7c3b8a25f9ff794013c615" Mar 13 14:12:52 crc kubenswrapper[4898]: E0313 14:12:52.502171 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a2c7fc3c93c843330a8d6bb55cc7d70838af37cb6b7c3b8a25f9ff794013c615\": container with ID starting with a2c7fc3c93c843330a8d6bb55cc7d70838af37cb6b7c3b8a25f9ff794013c615 not found: ID does not exist" containerID="a2c7fc3c93c843330a8d6bb55cc7d70838af37cb6b7c3b8a25f9ff794013c615" Mar 13 14:12:52 crc kubenswrapper[4898]: I0313 14:12:52.502243 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2c7fc3c93c843330a8d6bb55cc7d70838af37cb6b7c3b8a25f9ff794013c615"} err="failed to get container status \"a2c7fc3c93c843330a8d6bb55cc7d70838af37cb6b7c3b8a25f9ff794013c615\": rpc error: code = NotFound desc = could not find container \"a2c7fc3c93c843330a8d6bb55cc7d70838af37cb6b7c3b8a25f9ff794013c615\": container with ID starting with a2c7fc3c93c843330a8d6bb55cc7d70838af37cb6b7c3b8a25f9ff794013c615 not found: ID does not exist" Mar 13 14:12:52 crc kubenswrapper[4898]: I0313 14:12:52.502284 4898 scope.go:117] "RemoveContainer" containerID="55f2353b52acfcba2e9a9d07e6194d7265595a9010a025929275d655e6d8ecf7" Mar 13 14:12:52 crc kubenswrapper[4898]: E0313 14:12:52.502730 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55f2353b52acfcba2e9a9d07e6194d7265595a9010a025929275d655e6d8ecf7\": container with ID starting with 55f2353b52acfcba2e9a9d07e6194d7265595a9010a025929275d655e6d8ecf7 not found: ID does not exist" containerID="55f2353b52acfcba2e9a9d07e6194d7265595a9010a025929275d655e6d8ecf7" Mar 13 14:12:52 crc kubenswrapper[4898]: I0313 14:12:52.502806 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55f2353b52acfcba2e9a9d07e6194d7265595a9010a025929275d655e6d8ecf7"} err="failed to get container status \"55f2353b52acfcba2e9a9d07e6194d7265595a9010a025929275d655e6d8ecf7\": rpc error: code = NotFound desc = could not find container \"55f2353b52acfcba2e9a9d07e6194d7265595a9010a025929275d655e6d8ecf7\": container with ID 
starting with 55f2353b52acfcba2e9a9d07e6194d7265595a9010a025929275d655e6d8ecf7 not found: ID does not exist" Mar 13 14:12:52 crc kubenswrapper[4898]: I0313 14:12:52.502869 4898 scope.go:117] "RemoveContainer" containerID="b5a3fa8a34341c83229a787919abeec42db4b594e78e4b2bca60e4579eec7b84" Mar 13 14:12:52 crc kubenswrapper[4898]: E0313 14:12:52.503417 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5a3fa8a34341c83229a787919abeec42db4b594e78e4b2bca60e4579eec7b84\": container with ID starting with b5a3fa8a34341c83229a787919abeec42db4b594e78e4b2bca60e4579eec7b84 not found: ID does not exist" containerID="b5a3fa8a34341c83229a787919abeec42db4b594e78e4b2bca60e4579eec7b84" Mar 13 14:12:52 crc kubenswrapper[4898]: I0313 14:12:52.503448 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5a3fa8a34341c83229a787919abeec42db4b594e78e4b2bca60e4579eec7b84"} err="failed to get container status \"b5a3fa8a34341c83229a787919abeec42db4b594e78e4b2bca60e4579eec7b84\": rpc error: code = NotFound desc = could not find container \"b5a3fa8a34341c83229a787919abeec42db4b594e78e4b2bca60e4579eec7b84\": container with ID starting with b5a3fa8a34341c83229a787919abeec42db4b594e78e4b2bca60e4579eec7b84 not found: ID does not exist" Mar 13 14:12:53 crc kubenswrapper[4898]: I0313 14:12:53.750695 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd6314a3-ad6a-48ea-b54a-a2d1415b287e" path="/var/lib/kubelet/pods/fd6314a3-ad6a-48ea-b54a-a2d1415b287e/volumes" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.331195 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-5nmqm"] Mar 13 14:12:55 crc kubenswrapper[4898]: E0313 14:12:55.331456 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="715d729a-a993-4a3a-98a2-58f904ef7f6b" containerName="extract-utilities" Mar 13 14:12:55 crc kubenswrapper[4898]: 
I0313 14:12:55.331467 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="715d729a-a993-4a3a-98a2-58f904ef7f6b" containerName="extract-utilities" Mar 13 14:12:55 crc kubenswrapper[4898]: E0313 14:12:55.331482 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="715d729a-a993-4a3a-98a2-58f904ef7f6b" containerName="extract-content" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.331488 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="715d729a-a993-4a3a-98a2-58f904ef7f6b" containerName="extract-content" Mar 13 14:12:55 crc kubenswrapper[4898]: E0313 14:12:55.331496 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="715d729a-a993-4a3a-98a2-58f904ef7f6b" containerName="registry-server" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.331502 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="715d729a-a993-4a3a-98a2-58f904ef7f6b" containerName="registry-server" Mar 13 14:12:55 crc kubenswrapper[4898]: E0313 14:12:55.331511 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd6314a3-ad6a-48ea-b54a-a2d1415b287e" containerName="extract-content" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.331518 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd6314a3-ad6a-48ea-b54a-a2d1415b287e" containerName="extract-content" Mar 13 14:12:55 crc kubenswrapper[4898]: E0313 14:12:55.331528 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd6314a3-ad6a-48ea-b54a-a2d1415b287e" containerName="registry-server" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.331534 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd6314a3-ad6a-48ea-b54a-a2d1415b287e" containerName="registry-server" Mar 13 14:12:55 crc kubenswrapper[4898]: E0313 14:12:55.331551 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd6314a3-ad6a-48ea-b54a-a2d1415b287e" containerName="extract-utilities" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 
14:12:55.331556 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd6314a3-ad6a-48ea-b54a-a2d1415b287e" containerName="extract-utilities" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.331690 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd6314a3-ad6a-48ea-b54a-a2d1415b287e" containerName="registry-server" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.331707 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="715d729a-a993-4a3a-98a2-58f904ef7f6b" containerName="registry-server" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.332238 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.336697 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.340721 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.340805 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.342017 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-6h2rk" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.342176 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.353728 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.365354 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-5nmqm"] Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.437031 
4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-49w6l"] Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.445197 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-49w6l" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.466165 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-49w6l"] Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.478711 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-collector-token\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.478756 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-trusted-ca\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.478782 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-datadir\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.478837 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-sa-token\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" 
Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.478872 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-tmp\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.478924 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-collector-syslog-receiver\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.478958 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-metrics\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.479002 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-config-openshift-service-cacrt\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.479041 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-config\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 
14:12:55.479062 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmldq\" (UniqueName: \"kubernetes.io/projected/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-kube-api-access-rmldq\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.479102 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-entrypoint\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.511247 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-5nmqm"] Mar 13 14:12:55 crc kubenswrapper[4898]: E0313 14:12:55.511791 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[collector-syslog-receiver collector-token config config-openshift-service-cacrt datadir entrypoint kube-api-access-rmldq metrics sa-token tmp trusted-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-logging/collector-5nmqm" podUID="45b41ab9-a5cd-41ec-8714-9d13c0ca0550" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.580398 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-tmp\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.580497 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-collector-syslog-receiver\") pod 
\"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: E0313 14:12:55.580667 4898 secret.go:188] Couldn't get secret openshift-logging/collector-syslog-receiver: secret "collector-syslog-receiver" not found Mar 13 14:12:55 crc kubenswrapper[4898]: E0313 14:12:55.580739 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-collector-syslog-receiver podName:45b41ab9-a5cd-41ec-8714-9d13c0ca0550 nodeName:}" failed. No retries permitted until 2026-03-13 14:12:56.080717577 +0000 UTC m=+1011.082305816 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "collector-syslog-receiver" (UniqueName: "kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-collector-syslog-receiver") pod "collector-5nmqm" (UID: "45b41ab9-a5cd-41ec-8714-9d13c0ca0550") : secret "collector-syslog-receiver" not found Mar 13 14:12:55 crc kubenswrapper[4898]: E0313 14:12:55.580760 4898 secret.go:188] Couldn't get secret openshift-logging/collector-metrics: secret "collector-metrics" not found Mar 13 14:12:55 crc kubenswrapper[4898]: E0313 14:12:55.580854 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-metrics podName:45b41ab9-a5cd-41ec-8714-9d13c0ca0550 nodeName:}" failed. No retries permitted until 2026-03-13 14:12:56.08081698 +0000 UTC m=+1011.082405219 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics" (UniqueName: "kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-metrics") pod "collector-5nmqm" (UID: "45b41ab9-a5cd-41ec-8714-9d13c0ca0550") : secret "collector-metrics" not found Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.581034 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-metrics\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.581932 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-config-openshift-service-cacrt\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.581999 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-config-openshift-service-cacrt\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.582096 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdrmk\" (UniqueName: \"kubernetes.io/projected/dabf24b2-a9e2-4f67-91fd-1625e8ab3196-kube-api-access-xdrmk\") pod \"certified-operators-49w6l\" (UID: \"dabf24b2-a9e2-4f67-91fd-1625e8ab3196\") " pod="openshift-marketplace/certified-operators-49w6l" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.582163 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-config\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.582343 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmldq\" (UniqueName: \"kubernetes.io/projected/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-kube-api-access-rmldq\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.582376 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-entrypoint\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.582404 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-collector-token\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.582435 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dabf24b2-a9e2-4f67-91fd-1625e8ab3196-utilities\") pod \"certified-operators-49w6l\" (UID: \"dabf24b2-a9e2-4f67-91fd-1625e8ab3196\") " pod="openshift-marketplace/certified-operators-49w6l" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.582464 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-trusted-ca\") pod \"collector-5nmqm\" (UID: 
\"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.582496 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-datadir\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.582522 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dabf24b2-a9e2-4f67-91fd-1625e8ab3196-catalog-content\") pod \"certified-operators-49w6l\" (UID: \"dabf24b2-a9e2-4f67-91fd-1625e8ab3196\") " pod="openshift-marketplace/certified-operators-49w6l" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.582572 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-datadir\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.582645 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-sa-token\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.583239 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-entrypoint\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.583968 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-config\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.584135 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-trusted-ca\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.585936 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-tmp\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.587361 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-collector-token\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.601698 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-sa-token\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.605885 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmldq\" (UniqueName: \"kubernetes.io/projected/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-kube-api-access-rmldq\") pod \"collector-5nmqm\" (UID: 
\"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.684442 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdrmk\" (UniqueName: \"kubernetes.io/projected/dabf24b2-a9e2-4f67-91fd-1625e8ab3196-kube-api-access-xdrmk\") pod \"certified-operators-49w6l\" (UID: \"dabf24b2-a9e2-4f67-91fd-1625e8ab3196\") " pod="openshift-marketplace/certified-operators-49w6l" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.684527 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dabf24b2-a9e2-4f67-91fd-1625e8ab3196-utilities\") pod \"certified-operators-49w6l\" (UID: \"dabf24b2-a9e2-4f67-91fd-1625e8ab3196\") " pod="openshift-marketplace/certified-operators-49w6l" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.684568 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dabf24b2-a9e2-4f67-91fd-1625e8ab3196-catalog-content\") pod \"certified-operators-49w6l\" (UID: \"dabf24b2-a9e2-4f67-91fd-1625e8ab3196\") " pod="openshift-marketplace/certified-operators-49w6l" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.685218 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dabf24b2-a9e2-4f67-91fd-1625e8ab3196-utilities\") pod \"certified-operators-49w6l\" (UID: \"dabf24b2-a9e2-4f67-91fd-1625e8ab3196\") " pod="openshift-marketplace/certified-operators-49w6l" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.685270 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dabf24b2-a9e2-4f67-91fd-1625e8ab3196-catalog-content\") pod \"certified-operators-49w6l\" (UID: \"dabf24b2-a9e2-4f67-91fd-1625e8ab3196\") " 
pod="openshift-marketplace/certified-operators-49w6l" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.701542 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdrmk\" (UniqueName: \"kubernetes.io/projected/dabf24b2-a9e2-4f67-91fd-1625e8ab3196-kube-api-access-xdrmk\") pod \"certified-operators-49w6l\" (UID: \"dabf24b2-a9e2-4f67-91fd-1625e8ab3196\") " pod="openshift-marketplace/certified-operators-49w6l" Mar 13 14:12:55 crc kubenswrapper[4898]: I0313 14:12:55.778340 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-49w6l" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.097735 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-metrics\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.098176 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-collector-syslog-receiver\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.102515 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-metrics\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.102537 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: 
\"kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-collector-syslog-receiver\") pod \"collector-5nmqm\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " pod="openshift-logging/collector-5nmqm" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.279054 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-49w6l"] Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.444425 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-5nmqm" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.445013 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49w6l" event={"ID":"dabf24b2-a9e2-4f67-91fd-1625e8ab3196","Type":"ContainerStarted","Data":"eec0586c952301178a801e3a4bd7bdef5aee6ad7078781088b52abdeffb5eb46"} Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.445068 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49w6l" event={"ID":"dabf24b2-a9e2-4f67-91fd-1625e8ab3196","Type":"ContainerStarted","Data":"cfd1bebb0f1b2036070fc4a91966055ef2a5218eac13ce03c1dd361aa46a7a97"} Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.509471 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-5nmqm" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.604506 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-tmp\") pod \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.604559 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-entrypoint\") pod \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.604624 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-sa-token\") pod \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.604651 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-datadir\") pod \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.604679 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-config-openshift-service-cacrt\") pod \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.604705 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmldq\" (UniqueName: 
\"kubernetes.io/projected/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-kube-api-access-rmldq\") pod \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.604751 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-metrics\") pod \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.604782 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-collector-syslog-receiver\") pod \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.604825 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-config\") pod \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.604925 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-trusted-ca\") pod \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.604950 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-collector-token\") pod \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\" (UID: \"45b41ab9-a5cd-41ec-8714-9d13c0ca0550\") " Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.605582 4898 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-datadir" (OuterVolumeSpecName: "datadir") pod "45b41ab9-a5cd-41ec-8714-9d13c0ca0550" (UID: "45b41ab9-a5cd-41ec-8714-9d13c0ca0550"). InnerVolumeSpecName "datadir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.605747 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-entrypoint" (OuterVolumeSpecName: "entrypoint") pod "45b41ab9-a5cd-41ec-8714-9d13c0ca0550" (UID: "45b41ab9-a5cd-41ec-8714-9d13c0ca0550"). InnerVolumeSpecName "entrypoint". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.606329 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "45b41ab9-a5cd-41ec-8714-9d13c0ca0550" (UID: "45b41ab9-a5cd-41ec-8714-9d13c0ca0550"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.606936 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-config" (OuterVolumeSpecName: "config") pod "45b41ab9-a5cd-41ec-8714-9d13c0ca0550" (UID: "45b41ab9-a5cd-41ec-8714-9d13c0ca0550"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.607630 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-config-openshift-service-cacrt" (OuterVolumeSpecName: "config-openshift-service-cacrt") pod "45b41ab9-a5cd-41ec-8714-9d13c0ca0550" (UID: "45b41ab9-a5cd-41ec-8714-9d13c0ca0550"). InnerVolumeSpecName "config-openshift-service-cacrt". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.612582 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-collector-token" (OuterVolumeSpecName: "collector-token") pod "45b41ab9-a5cd-41ec-8714-9d13c0ca0550" (UID: "45b41ab9-a5cd-41ec-8714-9d13c0ca0550"). InnerVolumeSpecName "collector-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.612625 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-metrics" (OuterVolumeSpecName: "metrics") pod "45b41ab9-a5cd-41ec-8714-9d13c0ca0550" (UID: "45b41ab9-a5cd-41ec-8714-9d13c0ca0550"). InnerVolumeSpecName "metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.612646 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-tmp" (OuterVolumeSpecName: "tmp") pod "45b41ab9-a5cd-41ec-8714-9d13c0ca0550" (UID: "45b41ab9-a5cd-41ec-8714-9d13c0ca0550"). InnerVolumeSpecName "tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.612693 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-sa-token" (OuterVolumeSpecName: "sa-token") pod "45b41ab9-a5cd-41ec-8714-9d13c0ca0550" (UID: "45b41ab9-a5cd-41ec-8714-9d13c0ca0550"). InnerVolumeSpecName "sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.612892 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-kube-api-access-rmldq" (OuterVolumeSpecName: "kube-api-access-rmldq") pod "45b41ab9-a5cd-41ec-8714-9d13c0ca0550" (UID: "45b41ab9-a5cd-41ec-8714-9d13c0ca0550"). InnerVolumeSpecName "kube-api-access-rmldq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.613338 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-collector-syslog-receiver" (OuterVolumeSpecName: "collector-syslog-receiver") pod "45b41ab9-a5cd-41ec-8714-9d13c0ca0550" (UID: "45b41ab9-a5cd-41ec-8714-9d13c0ca0550"). InnerVolumeSpecName "collector-syslog-receiver". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.706420 4898 reconciler_common.go:293] "Volume detached for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-datadir\") on node \"crc\" DevicePath \"\"" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.706452 4898 reconciler_common.go:293] "Volume detached for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-config-openshift-service-cacrt\") on node \"crc\" DevicePath \"\"" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.706462 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmldq\" (UniqueName: \"kubernetes.io/projected/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-kube-api-access-rmldq\") on node \"crc\" DevicePath \"\"" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.706471 4898 reconciler_common.go:293] "Volume detached for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-metrics\") on node \"crc\" DevicePath \"\"" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.706479 4898 reconciler_common.go:293] "Volume detached for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-collector-syslog-receiver\") on node \"crc\" DevicePath \"\"" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.706486 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.706495 4898 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 
14:12:56.706505 4898 reconciler_common.go:293] "Volume detached for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-collector-token\") on node \"crc\" DevicePath \"\"" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.706513 4898 reconciler_common.go:293] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-tmp\") on node \"crc\" DevicePath \"\"" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.706521 4898 reconciler_common.go:293] "Volume detached for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-entrypoint\") on node \"crc\" DevicePath \"\"" Mar 13 14:12:56 crc kubenswrapper[4898]: I0313 14:12:56.706528 4898 reconciler_common.go:293] "Volume detached for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/45b41ab9-a5cd-41ec-8714-9d13c0ca0550-sa-token\") on node \"crc\" DevicePath \"\"" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.452988 4898 generic.go:334] "Generic (PLEG): container finished" podID="dabf24b2-a9e2-4f67-91fd-1625e8ab3196" containerID="eec0586c952301178a801e3a4bd7bdef5aee6ad7078781088b52abdeffb5eb46" exitCode=0 Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.453063 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-5nmqm" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.453037 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49w6l" event={"ID":"dabf24b2-a9e2-4f67-91fd-1625e8ab3196","Type":"ContainerDied","Data":"eec0586c952301178a801e3a4bd7bdef5aee6ad7078781088b52abdeffb5eb46"} Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.540800 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-5nmqm"] Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.551026 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-logging/collector-5nmqm"] Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.556312 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-xcq52"] Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.557637 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.562631 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.562651 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.562717 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-6h2rk" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.562835 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.563162 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.564617 4898 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-logging/collector-xcq52"] Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.573398 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.730814 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/824d10e9-5cdc-4dc5-b9a8-b151c779b900-tmp\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.730944 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/824d10e9-5cdc-4dc5-b9a8-b151c779b900-metrics\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.731000 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/824d10e9-5cdc-4dc5-b9a8-b151c779b900-collector-token\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.731075 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/824d10e9-5cdc-4dc5-b9a8-b151c779b900-config-openshift-service-cacrt\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.731197 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm26g\" (UniqueName: 
\"kubernetes.io/projected/824d10e9-5cdc-4dc5-b9a8-b151c779b900-kube-api-access-rm26g\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.731296 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/824d10e9-5cdc-4dc5-b9a8-b151c779b900-datadir\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.731339 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/824d10e9-5cdc-4dc5-b9a8-b151c779b900-config\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.731401 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/824d10e9-5cdc-4dc5-b9a8-b151c779b900-entrypoint\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.731457 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/824d10e9-5cdc-4dc5-b9a8-b151c779b900-sa-token\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.731516 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/824d10e9-5cdc-4dc5-b9a8-b151c779b900-trusted-ca\") pod 
\"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.731693 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/824d10e9-5cdc-4dc5-b9a8-b151c779b900-collector-syslog-receiver\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.750094 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45b41ab9-a5cd-41ec-8714-9d13c0ca0550" path="/var/lib/kubelet/pods/45b41ab9-a5cd-41ec-8714-9d13c0ca0550/volumes" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.833326 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/824d10e9-5cdc-4dc5-b9a8-b151c779b900-entrypoint\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.833373 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/824d10e9-5cdc-4dc5-b9a8-b151c779b900-sa-token\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.833401 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/824d10e9-5cdc-4dc5-b9a8-b151c779b900-trusted-ca\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.833442 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/824d10e9-5cdc-4dc5-b9a8-b151c779b900-collector-syslog-receiver\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.833504 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/824d10e9-5cdc-4dc5-b9a8-b151c779b900-tmp\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.833520 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/824d10e9-5cdc-4dc5-b9a8-b151c779b900-metrics\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.833539 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/824d10e9-5cdc-4dc5-b9a8-b151c779b900-collector-token\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.833554 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/824d10e9-5cdc-4dc5-b9a8-b151c779b900-config-openshift-service-cacrt\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.833583 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm26g\" (UniqueName: \"kubernetes.io/projected/824d10e9-5cdc-4dc5-b9a8-b151c779b900-kube-api-access-rm26g\") pod 
\"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.833600 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/824d10e9-5cdc-4dc5-b9a8-b151c779b900-datadir\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.833615 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/824d10e9-5cdc-4dc5-b9a8-b151c779b900-config\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.833849 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/824d10e9-5cdc-4dc5-b9a8-b151c779b900-datadir\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.834717 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/824d10e9-5cdc-4dc5-b9a8-b151c779b900-config\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.834841 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/824d10e9-5cdc-4dc5-b9a8-b151c779b900-trusted-ca\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.835118 4898 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/824d10e9-5cdc-4dc5-b9a8-b151c779b900-config-openshift-service-cacrt\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.835406 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/824d10e9-5cdc-4dc5-b9a8-b151c779b900-entrypoint\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.840858 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/824d10e9-5cdc-4dc5-b9a8-b151c779b900-metrics\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.841174 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/824d10e9-5cdc-4dc5-b9a8-b151c779b900-collector-syslog-receiver\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.841257 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/824d10e9-5cdc-4dc5-b9a8-b151c779b900-tmp\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.842716 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/824d10e9-5cdc-4dc5-b9a8-b151c779b900-collector-token\") pod \"collector-xcq52\" (UID: 
\"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.853563 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/824d10e9-5cdc-4dc5-b9a8-b151c779b900-sa-token\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.856578 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm26g\" (UniqueName: \"kubernetes.io/projected/824d10e9-5cdc-4dc5-b9a8-b151c779b900-kube-api-access-rm26g\") pod \"collector-xcq52\" (UID: \"824d10e9-5cdc-4dc5-b9a8-b151c779b900\") " pod="openshift-logging/collector-xcq52" Mar 13 14:12:57 crc kubenswrapper[4898]: I0313 14:12:57.880628 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-xcq52" Mar 13 14:12:58 crc kubenswrapper[4898]: I0313 14:12:58.341400 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-xcq52"] Mar 13 14:12:58 crc kubenswrapper[4898]: W0313 14:12:58.348463 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod824d10e9_5cdc_4dc5_b9a8_b151c779b900.slice/crio-f807e9ba698af05d63ce18a3e24d716918f9ef4a1af45c0c0ecafbc443a73586 WatchSource:0}: Error finding container f807e9ba698af05d63ce18a3e24d716918f9ef4a1af45c0c0ecafbc443a73586: Status 404 returned error can't find the container with id f807e9ba698af05d63ce18a3e24d716918f9ef4a1af45c0c0ecafbc443a73586 Mar 13 14:12:58 crc kubenswrapper[4898]: I0313 14:12:58.464399 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-xcq52" event={"ID":"824d10e9-5cdc-4dc5-b9a8-b151c779b900","Type":"ContainerStarted","Data":"f807e9ba698af05d63ce18a3e24d716918f9ef4a1af45c0c0ecafbc443a73586"} 
Mar 13 14:12:58 crc kubenswrapper[4898]: I0313 14:12:58.467014 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49w6l" event={"ID":"dabf24b2-a9e2-4f67-91fd-1625e8ab3196","Type":"ContainerStarted","Data":"92189cd615cdfc6491f05f4f4284efe8b33b7bdc8049b452c130597e8d121477"} Mar 13 14:12:59 crc kubenswrapper[4898]: I0313 14:12:59.478274 4898 generic.go:334] "Generic (PLEG): container finished" podID="dabf24b2-a9e2-4f67-91fd-1625e8ab3196" containerID="92189cd615cdfc6491f05f4f4284efe8b33b7bdc8049b452c130597e8d121477" exitCode=0 Mar 13 14:12:59 crc kubenswrapper[4898]: I0313 14:12:59.478317 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49w6l" event={"ID":"dabf24b2-a9e2-4f67-91fd-1625e8ab3196","Type":"ContainerDied","Data":"92189cd615cdfc6491f05f4f4284efe8b33b7bdc8049b452c130597e8d121477"} Mar 13 14:13:02 crc kubenswrapper[4898]: I0313 14:13:02.510476 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-xcq52" event={"ID":"824d10e9-5cdc-4dc5-b9a8-b151c779b900","Type":"ContainerStarted","Data":"adc0a5c829ef5909ddc3032fabfbdbc6824e25e07fdf12185f17c84a2adb5373"} Mar 13 14:13:02 crc kubenswrapper[4898]: I0313 14:13:02.550527 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/collector-xcq52" podStartSLOduration=2.063796941 podStartE2EDuration="5.550495389s" podCreationTimestamp="2026-03-13 14:12:57 +0000 UTC" firstStartedPulling="2026-03-13 14:12:58.351112913 +0000 UTC m=+1013.352701162" lastFinishedPulling="2026-03-13 14:13:01.837811371 +0000 UTC m=+1016.839399610" observedRunningTime="2026-03-13 14:13:02.532263199 +0000 UTC m=+1017.533851438" watchObservedRunningTime="2026-03-13 14:13:02.550495389 +0000 UTC m=+1017.552083688" Mar 13 14:13:04 crc kubenswrapper[4898]: I0313 14:13:04.528279 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-49w6l" event={"ID":"dabf24b2-a9e2-4f67-91fd-1625e8ab3196","Type":"ContainerStarted","Data":"8159c15216bb6b05395c9b113a6419beaf598e1519e0565295ce1194c8cfdd8e"} Mar 13 14:13:04 crc kubenswrapper[4898]: I0313 14:13:04.566743 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-49w6l" podStartSLOduration=3.048838095 podStartE2EDuration="9.566705849s" podCreationTimestamp="2026-03-13 14:12:55 +0000 UTC" firstStartedPulling="2026-03-13 14:12:57.455393655 +0000 UTC m=+1012.456981934" lastFinishedPulling="2026-03-13 14:13:03.973261439 +0000 UTC m=+1018.974849688" observedRunningTime="2026-03-13 14:13:04.556830435 +0000 UTC m=+1019.558418714" watchObservedRunningTime="2026-03-13 14:13:04.566705849 +0000 UTC m=+1019.568294128" Mar 13 14:13:05 crc kubenswrapper[4898]: I0313 14:13:05.778857 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-49w6l" Mar 13 14:13:05 crc kubenswrapper[4898]: I0313 14:13:05.779330 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-49w6l" Mar 13 14:13:06 crc kubenswrapper[4898]: I0313 14:13:06.822975 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-49w6l" podUID="dabf24b2-a9e2-4f67-91fd-1625e8ab3196" containerName="registry-server" probeResult="failure" output=< Mar 13 14:13:06 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 14:13:06 crc kubenswrapper[4898]: > Mar 13 14:13:15 crc kubenswrapper[4898]: I0313 14:13:15.835082 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-49w6l" Mar 13 14:13:15 crc kubenswrapper[4898]: I0313 14:13:15.894730 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-49w6l" Mar 13 14:13:16 crc kubenswrapper[4898]: I0313 14:13:16.069042 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-49w6l"] Mar 13 14:13:17 crc kubenswrapper[4898]: I0313 14:13:17.687768 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-49w6l" podUID="dabf24b2-a9e2-4f67-91fd-1625e8ab3196" containerName="registry-server" containerID="cri-o://8159c15216bb6b05395c9b113a6419beaf598e1519e0565295ce1194c8cfdd8e" gracePeriod=2 Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.105783 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-49w6l" Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.175610 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dabf24b2-a9e2-4f67-91fd-1625e8ab3196-catalog-content\") pod \"dabf24b2-a9e2-4f67-91fd-1625e8ab3196\" (UID: \"dabf24b2-a9e2-4f67-91fd-1625e8ab3196\") " Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.175722 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dabf24b2-a9e2-4f67-91fd-1625e8ab3196-utilities\") pod \"dabf24b2-a9e2-4f67-91fd-1625e8ab3196\" (UID: \"dabf24b2-a9e2-4f67-91fd-1625e8ab3196\") " Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.175923 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdrmk\" (UniqueName: \"kubernetes.io/projected/dabf24b2-a9e2-4f67-91fd-1625e8ab3196-kube-api-access-xdrmk\") pod \"dabf24b2-a9e2-4f67-91fd-1625e8ab3196\" (UID: \"dabf24b2-a9e2-4f67-91fd-1625e8ab3196\") " Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.177655 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/dabf24b2-a9e2-4f67-91fd-1625e8ab3196-utilities" (OuterVolumeSpecName: "utilities") pod "dabf24b2-a9e2-4f67-91fd-1625e8ab3196" (UID: "dabf24b2-a9e2-4f67-91fd-1625e8ab3196"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.181687 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dabf24b2-a9e2-4f67-91fd-1625e8ab3196-kube-api-access-xdrmk" (OuterVolumeSpecName: "kube-api-access-xdrmk") pod "dabf24b2-a9e2-4f67-91fd-1625e8ab3196" (UID: "dabf24b2-a9e2-4f67-91fd-1625e8ab3196"). InnerVolumeSpecName "kube-api-access-xdrmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.233218 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dabf24b2-a9e2-4f67-91fd-1625e8ab3196-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dabf24b2-a9e2-4f67-91fd-1625e8ab3196" (UID: "dabf24b2-a9e2-4f67-91fd-1625e8ab3196"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.277227 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dabf24b2-a9e2-4f67-91fd-1625e8ab3196-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.277270 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdrmk\" (UniqueName: \"kubernetes.io/projected/dabf24b2-a9e2-4f67-91fd-1625e8ab3196-kube-api-access-xdrmk\") on node \"crc\" DevicePath \"\"" Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.277280 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dabf24b2-a9e2-4f67-91fd-1625e8ab3196-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.699744 4898 generic.go:334] "Generic (PLEG): container finished" podID="dabf24b2-a9e2-4f67-91fd-1625e8ab3196" containerID="8159c15216bb6b05395c9b113a6419beaf598e1519e0565295ce1194c8cfdd8e" exitCode=0 Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.699833 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-49w6l" Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.699818 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49w6l" event={"ID":"dabf24b2-a9e2-4f67-91fd-1625e8ab3196","Type":"ContainerDied","Data":"8159c15216bb6b05395c9b113a6419beaf598e1519e0565295ce1194c8cfdd8e"} Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.700261 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49w6l" event={"ID":"dabf24b2-a9e2-4f67-91fd-1625e8ab3196","Type":"ContainerDied","Data":"cfd1bebb0f1b2036070fc4a91966055ef2a5218eac13ce03c1dd361aa46a7a97"} Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.700300 4898 scope.go:117] "RemoveContainer" containerID="8159c15216bb6b05395c9b113a6419beaf598e1519e0565295ce1194c8cfdd8e" Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.739538 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-49w6l"] Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.740813 4898 scope.go:117] "RemoveContainer" containerID="92189cd615cdfc6491f05f4f4284efe8b33b7bdc8049b452c130597e8d121477" Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.747830 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-49w6l"] Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.761250 4898 scope.go:117] "RemoveContainer" containerID="eec0586c952301178a801e3a4bd7bdef5aee6ad7078781088b52abdeffb5eb46" Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.783044 4898 scope.go:117] "RemoveContainer" containerID="8159c15216bb6b05395c9b113a6419beaf598e1519e0565295ce1194c8cfdd8e" Mar 13 14:13:18 crc kubenswrapper[4898]: E0313 14:13:18.783662 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8159c15216bb6b05395c9b113a6419beaf598e1519e0565295ce1194c8cfdd8e\": container with ID starting with 8159c15216bb6b05395c9b113a6419beaf598e1519e0565295ce1194c8cfdd8e not found: ID does not exist" containerID="8159c15216bb6b05395c9b113a6419beaf598e1519e0565295ce1194c8cfdd8e" Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.783717 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8159c15216bb6b05395c9b113a6419beaf598e1519e0565295ce1194c8cfdd8e"} err="failed to get container status \"8159c15216bb6b05395c9b113a6419beaf598e1519e0565295ce1194c8cfdd8e\": rpc error: code = NotFound desc = could not find container \"8159c15216bb6b05395c9b113a6419beaf598e1519e0565295ce1194c8cfdd8e\": container with ID starting with 8159c15216bb6b05395c9b113a6419beaf598e1519e0565295ce1194c8cfdd8e not found: ID does not exist" Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.783743 4898 scope.go:117] "RemoveContainer" containerID="92189cd615cdfc6491f05f4f4284efe8b33b7bdc8049b452c130597e8d121477" Mar 13 14:13:18 crc kubenswrapper[4898]: E0313 14:13:18.784079 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92189cd615cdfc6491f05f4f4284efe8b33b7bdc8049b452c130597e8d121477\": container with ID starting with 92189cd615cdfc6491f05f4f4284efe8b33b7bdc8049b452c130597e8d121477 not found: ID does not exist" containerID="92189cd615cdfc6491f05f4f4284efe8b33b7bdc8049b452c130597e8d121477" Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.784100 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92189cd615cdfc6491f05f4f4284efe8b33b7bdc8049b452c130597e8d121477"} err="failed to get container status \"92189cd615cdfc6491f05f4f4284efe8b33b7bdc8049b452c130597e8d121477\": rpc error: code = NotFound desc = could not find container \"92189cd615cdfc6491f05f4f4284efe8b33b7bdc8049b452c130597e8d121477\": container with ID 
starting with 92189cd615cdfc6491f05f4f4284efe8b33b7bdc8049b452c130597e8d121477 not found: ID does not exist" Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.784150 4898 scope.go:117] "RemoveContainer" containerID="eec0586c952301178a801e3a4bd7bdef5aee6ad7078781088b52abdeffb5eb46" Mar 13 14:13:18 crc kubenswrapper[4898]: E0313 14:13:18.784389 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eec0586c952301178a801e3a4bd7bdef5aee6ad7078781088b52abdeffb5eb46\": container with ID starting with eec0586c952301178a801e3a4bd7bdef5aee6ad7078781088b52abdeffb5eb46 not found: ID does not exist" containerID="eec0586c952301178a801e3a4bd7bdef5aee6ad7078781088b52abdeffb5eb46" Mar 13 14:13:18 crc kubenswrapper[4898]: I0313 14:13:18.784420 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eec0586c952301178a801e3a4bd7bdef5aee6ad7078781088b52abdeffb5eb46"} err="failed to get container status \"eec0586c952301178a801e3a4bd7bdef5aee6ad7078781088b52abdeffb5eb46\": rpc error: code = NotFound desc = could not find container \"eec0586c952301178a801e3a4bd7bdef5aee6ad7078781088b52abdeffb5eb46\": container with ID starting with eec0586c952301178a801e3a4bd7bdef5aee6ad7078781088b52abdeffb5eb46 not found: ID does not exist" Mar 13 14:13:19 crc kubenswrapper[4898]: I0313 14:13:19.134731 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:13:19 crc kubenswrapper[4898]: I0313 14:13:19.134831 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:13:19 crc kubenswrapper[4898]: I0313 14:13:19.134946 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 14:13:19 crc kubenswrapper[4898]: I0313 14:13:19.136472 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b58828da596890620679d1e69bfdfd0b7cd0cf06254eed4031e215964351d8c6"} pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 14:13:19 crc kubenswrapper[4898]: I0313 14:13:19.136613 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" containerID="cri-o://b58828da596890620679d1e69bfdfd0b7cd0cf06254eed4031e215964351d8c6" gracePeriod=600 Mar 13 14:13:19 crc kubenswrapper[4898]: I0313 14:13:19.709856 4898 generic.go:334] "Generic (PLEG): container finished" podID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerID="b58828da596890620679d1e69bfdfd0b7cd0cf06254eed4031e215964351d8c6" exitCode=0 Mar 13 14:13:19 crc kubenswrapper[4898]: I0313 14:13:19.709944 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerDied","Data":"b58828da596890620679d1e69bfdfd0b7cd0cf06254eed4031e215964351d8c6"} Mar 13 14:13:19 crc kubenswrapper[4898]: I0313 14:13:19.710712 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" 
event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"7b5d3972dfd92a1b971338153ac5467cf67b2057ca35cfb382b56be42ddca2ed"} Mar 13 14:13:19 crc kubenswrapper[4898]: I0313 14:13:19.710796 4898 scope.go:117] "RemoveContainer" containerID="5a348cbe99f8e01e53545f65e722853afafc6c3cafe54ec4136fd0f288299e87" Mar 13 14:13:19 crc kubenswrapper[4898]: I0313 14:13:19.753868 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dabf24b2-a9e2-4f67-91fd-1625e8ab3196" path="/var/lib/kubelet/pods/dabf24b2-a9e2-4f67-91fd-1625e8ab3196/volumes" Mar 13 14:13:36 crc kubenswrapper[4898]: I0313 14:13:36.219078 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh"] Mar 13 14:13:36 crc kubenswrapper[4898]: E0313 14:13:36.219956 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dabf24b2-a9e2-4f67-91fd-1625e8ab3196" containerName="extract-content" Mar 13 14:13:36 crc kubenswrapper[4898]: I0313 14:13:36.219971 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="dabf24b2-a9e2-4f67-91fd-1625e8ab3196" containerName="extract-content" Mar 13 14:13:36 crc kubenswrapper[4898]: E0313 14:13:36.219986 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dabf24b2-a9e2-4f67-91fd-1625e8ab3196" containerName="registry-server" Mar 13 14:13:36 crc kubenswrapper[4898]: I0313 14:13:36.219995 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="dabf24b2-a9e2-4f67-91fd-1625e8ab3196" containerName="registry-server" Mar 13 14:13:36 crc kubenswrapper[4898]: E0313 14:13:36.220005 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dabf24b2-a9e2-4f67-91fd-1625e8ab3196" containerName="extract-utilities" Mar 13 14:13:36 crc kubenswrapper[4898]: I0313 14:13:36.220014 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="dabf24b2-a9e2-4f67-91fd-1625e8ab3196" containerName="extract-utilities" Mar 13 14:13:36 
crc kubenswrapper[4898]: I0313 14:13:36.220171 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="dabf24b2-a9e2-4f67-91fd-1625e8ab3196" containerName="registry-server" Mar 13 14:13:36 crc kubenswrapper[4898]: I0313 14:13:36.221406 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh" Mar 13 14:13:36 crc kubenswrapper[4898]: I0313 14:13:36.223997 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 13 14:13:36 crc kubenswrapper[4898]: I0313 14:13:36.230676 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh"] Mar 13 14:13:36 crc kubenswrapper[4898]: I0313 14:13:36.280230 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrwdv\" (UniqueName: \"kubernetes.io/projected/53800f20-93f5-4ab5-9feb-eb325fa0f945-kube-api-access-zrwdv\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh\" (UID: \"53800f20-93f5-4ab5-9feb-eb325fa0f945\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh" Mar 13 14:13:36 crc kubenswrapper[4898]: I0313 14:13:36.280293 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53800f20-93f5-4ab5-9feb-eb325fa0f945-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh\" (UID: \"53800f20-93f5-4ab5-9feb-eb325fa0f945\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh" Mar 13 14:13:36 crc kubenswrapper[4898]: I0313 14:13:36.280320 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/53800f20-93f5-4ab5-9feb-eb325fa0f945-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh\" (UID: \"53800f20-93f5-4ab5-9feb-eb325fa0f945\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh" Mar 13 14:13:36 crc kubenswrapper[4898]: I0313 14:13:36.382040 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrwdv\" (UniqueName: \"kubernetes.io/projected/53800f20-93f5-4ab5-9feb-eb325fa0f945-kube-api-access-zrwdv\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh\" (UID: \"53800f20-93f5-4ab5-9feb-eb325fa0f945\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh" Mar 13 14:13:36 crc kubenswrapper[4898]: I0313 14:13:36.382313 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53800f20-93f5-4ab5-9feb-eb325fa0f945-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh\" (UID: \"53800f20-93f5-4ab5-9feb-eb325fa0f945\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh" Mar 13 14:13:36 crc kubenswrapper[4898]: I0313 14:13:36.382346 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53800f20-93f5-4ab5-9feb-eb325fa0f945-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh\" (UID: \"53800f20-93f5-4ab5-9feb-eb325fa0f945\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh" Mar 13 14:13:36 crc kubenswrapper[4898]: I0313 14:13:36.383058 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53800f20-93f5-4ab5-9feb-eb325fa0f945-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh\" (UID: 
\"53800f20-93f5-4ab5-9feb-eb325fa0f945\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh" Mar 13 14:13:36 crc kubenswrapper[4898]: I0313 14:13:36.383840 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53800f20-93f5-4ab5-9feb-eb325fa0f945-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh\" (UID: \"53800f20-93f5-4ab5-9feb-eb325fa0f945\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh" Mar 13 14:13:36 crc kubenswrapper[4898]: I0313 14:13:36.404862 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrwdv\" (UniqueName: \"kubernetes.io/projected/53800f20-93f5-4ab5-9feb-eb325fa0f945-kube-api-access-zrwdv\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh\" (UID: \"53800f20-93f5-4ab5-9feb-eb325fa0f945\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh" Mar 13 14:13:36 crc kubenswrapper[4898]: I0313 14:13:36.543600 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh" Mar 13 14:13:37 crc kubenswrapper[4898]: I0313 14:13:37.033856 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh"] Mar 13 14:13:37 crc kubenswrapper[4898]: I0313 14:13:37.860994 4898 generic.go:334] "Generic (PLEG): container finished" podID="53800f20-93f5-4ab5-9feb-eb325fa0f945" containerID="422f90857314e3fc7868e74b78a746d6e5e6560e9451136c0b0897c2dd1d6ab7" exitCode=0 Mar 13 14:13:37 crc kubenswrapper[4898]: I0313 14:13:37.861054 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh" event={"ID":"53800f20-93f5-4ab5-9feb-eb325fa0f945","Type":"ContainerDied","Data":"422f90857314e3fc7868e74b78a746d6e5e6560e9451136c0b0897c2dd1d6ab7"} Mar 13 14:13:37 crc kubenswrapper[4898]: I0313 14:13:37.861089 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh" event={"ID":"53800f20-93f5-4ab5-9feb-eb325fa0f945","Type":"ContainerStarted","Data":"527fe8e97992861f8440939a2bfd5b861f20af1baa416c51694239c5c3dd1778"} Mar 13 14:13:39 crc kubenswrapper[4898]: I0313 14:13:39.875477 4898 generic.go:334] "Generic (PLEG): container finished" podID="53800f20-93f5-4ab5-9feb-eb325fa0f945" containerID="002e8c3dd41fac16b9a3c92a467e8e12ac12ce2cbb5f7c5ab5b3b50c26ff7e4a" exitCode=0 Mar 13 14:13:39 crc kubenswrapper[4898]: I0313 14:13:39.875554 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh" event={"ID":"53800f20-93f5-4ab5-9feb-eb325fa0f945","Type":"ContainerDied","Data":"002e8c3dd41fac16b9a3c92a467e8e12ac12ce2cbb5f7c5ab5b3b50c26ff7e4a"} Mar 13 14:13:40 crc kubenswrapper[4898]: I0313 14:13:40.884797 4898 
generic.go:334] "Generic (PLEG): container finished" podID="53800f20-93f5-4ab5-9feb-eb325fa0f945" containerID="2d43404568d4928eb590843474ccba68a6b5932b3bb07dac9c8d1ba42b0f996e" exitCode=0 Mar 13 14:13:40 crc kubenswrapper[4898]: I0313 14:13:40.884841 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh" event={"ID":"53800f20-93f5-4ab5-9feb-eb325fa0f945","Type":"ContainerDied","Data":"2d43404568d4928eb590843474ccba68a6b5932b3bb07dac9c8d1ba42b0f996e"} Mar 13 14:13:42 crc kubenswrapper[4898]: I0313 14:13:42.211178 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh" Mar 13 14:13:42 crc kubenswrapper[4898]: I0313 14:13:42.392597 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53800f20-93f5-4ab5-9feb-eb325fa0f945-util\") pod \"53800f20-93f5-4ab5-9feb-eb325fa0f945\" (UID: \"53800f20-93f5-4ab5-9feb-eb325fa0f945\") " Mar 13 14:13:42 crc kubenswrapper[4898]: I0313 14:13:42.392718 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrwdv\" (UniqueName: \"kubernetes.io/projected/53800f20-93f5-4ab5-9feb-eb325fa0f945-kube-api-access-zrwdv\") pod \"53800f20-93f5-4ab5-9feb-eb325fa0f945\" (UID: \"53800f20-93f5-4ab5-9feb-eb325fa0f945\") " Mar 13 14:13:42 crc kubenswrapper[4898]: I0313 14:13:42.392812 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53800f20-93f5-4ab5-9feb-eb325fa0f945-bundle\") pod \"53800f20-93f5-4ab5-9feb-eb325fa0f945\" (UID: \"53800f20-93f5-4ab5-9feb-eb325fa0f945\") " Mar 13 14:13:42 crc kubenswrapper[4898]: I0313 14:13:42.393359 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/53800f20-93f5-4ab5-9feb-eb325fa0f945-bundle" (OuterVolumeSpecName: "bundle") pod "53800f20-93f5-4ab5-9feb-eb325fa0f945" (UID: "53800f20-93f5-4ab5-9feb-eb325fa0f945"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:13:42 crc kubenswrapper[4898]: I0313 14:13:42.398333 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53800f20-93f5-4ab5-9feb-eb325fa0f945-kube-api-access-zrwdv" (OuterVolumeSpecName: "kube-api-access-zrwdv") pod "53800f20-93f5-4ab5-9feb-eb325fa0f945" (UID: "53800f20-93f5-4ab5-9feb-eb325fa0f945"). InnerVolumeSpecName "kube-api-access-zrwdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:13:42 crc kubenswrapper[4898]: I0313 14:13:42.406249 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53800f20-93f5-4ab5-9feb-eb325fa0f945-util" (OuterVolumeSpecName: "util") pod "53800f20-93f5-4ab5-9feb-eb325fa0f945" (UID: "53800f20-93f5-4ab5-9feb-eb325fa0f945"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:13:42 crc kubenswrapper[4898]: I0313 14:13:42.494439 4898 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53800f20-93f5-4ab5-9feb-eb325fa0f945-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:13:42 crc kubenswrapper[4898]: I0313 14:13:42.494679 4898 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53800f20-93f5-4ab5-9feb-eb325fa0f945-util\") on node \"crc\" DevicePath \"\"" Mar 13 14:13:42 crc kubenswrapper[4898]: I0313 14:13:42.494691 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrwdv\" (UniqueName: \"kubernetes.io/projected/53800f20-93f5-4ab5-9feb-eb325fa0f945-kube-api-access-zrwdv\") on node \"crc\" DevicePath \"\"" Mar 13 14:13:42 crc kubenswrapper[4898]: I0313 14:13:42.900487 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh" event={"ID":"53800f20-93f5-4ab5-9feb-eb325fa0f945","Type":"ContainerDied","Data":"527fe8e97992861f8440939a2bfd5b861f20af1baa416c51694239c5c3dd1778"} Mar 13 14:13:42 crc kubenswrapper[4898]: I0313 14:13:42.900535 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="527fe8e97992861f8440939a2bfd5b861f20af1baa416c51694239c5c3dd1778" Mar 13 14:13:42 crc kubenswrapper[4898]: I0313 14:13:42.900609 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh" Mar 13 14:13:45 crc kubenswrapper[4898]: I0313 14:13:45.343730 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-hmwt2"] Mar 13 14:13:45 crc kubenswrapper[4898]: E0313 14:13:45.344017 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53800f20-93f5-4ab5-9feb-eb325fa0f945" containerName="extract" Mar 13 14:13:45 crc kubenswrapper[4898]: I0313 14:13:45.344029 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="53800f20-93f5-4ab5-9feb-eb325fa0f945" containerName="extract" Mar 13 14:13:45 crc kubenswrapper[4898]: E0313 14:13:45.344036 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53800f20-93f5-4ab5-9feb-eb325fa0f945" containerName="pull" Mar 13 14:13:45 crc kubenswrapper[4898]: I0313 14:13:45.344043 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="53800f20-93f5-4ab5-9feb-eb325fa0f945" containerName="pull" Mar 13 14:13:45 crc kubenswrapper[4898]: E0313 14:13:45.344071 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53800f20-93f5-4ab5-9feb-eb325fa0f945" containerName="util" Mar 13 14:13:45 crc kubenswrapper[4898]: I0313 14:13:45.344078 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="53800f20-93f5-4ab5-9feb-eb325fa0f945" containerName="util" Mar 13 14:13:45 crc kubenswrapper[4898]: I0313 14:13:45.344222 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="53800f20-93f5-4ab5-9feb-eb325fa0f945" containerName="extract" Mar 13 14:13:45 crc kubenswrapper[4898]: I0313 14:13:45.344814 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-hmwt2" Mar 13 14:13:45 crc kubenswrapper[4898]: I0313 14:13:45.347619 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 13 14:13:45 crc kubenswrapper[4898]: I0313 14:13:45.347618 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 13 14:13:45 crc kubenswrapper[4898]: I0313 14:13:45.348435 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-b8t8b" Mar 13 14:13:45 crc kubenswrapper[4898]: I0313 14:13:45.360978 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-hmwt2"] Mar 13 14:13:45 crc kubenswrapper[4898]: I0313 14:13:45.442001 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg64p\" (UniqueName: \"kubernetes.io/projected/84d4e279-f74c-48fd-9514-1a697341ac6a-kube-api-access-mg64p\") pod \"nmstate-operator-796d4cfff4-hmwt2\" (UID: \"84d4e279-f74c-48fd-9514-1a697341ac6a\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-hmwt2" Mar 13 14:13:45 crc kubenswrapper[4898]: I0313 14:13:45.543457 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg64p\" (UniqueName: \"kubernetes.io/projected/84d4e279-f74c-48fd-9514-1a697341ac6a-kube-api-access-mg64p\") pod \"nmstate-operator-796d4cfff4-hmwt2\" (UID: \"84d4e279-f74c-48fd-9514-1a697341ac6a\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-hmwt2" Mar 13 14:13:45 crc kubenswrapper[4898]: I0313 14:13:45.560451 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg64p\" (UniqueName: \"kubernetes.io/projected/84d4e279-f74c-48fd-9514-1a697341ac6a-kube-api-access-mg64p\") pod \"nmstate-operator-796d4cfff4-hmwt2\" (UID: 
\"84d4e279-f74c-48fd-9514-1a697341ac6a\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-hmwt2" Mar 13 14:13:45 crc kubenswrapper[4898]: I0313 14:13:45.691070 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-hmwt2" Mar 13 14:13:46 crc kubenswrapper[4898]: I0313 14:13:46.267795 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-hmwt2"] Mar 13 14:13:46 crc kubenswrapper[4898]: W0313 14:13:46.271412 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84d4e279_f74c_48fd_9514_1a697341ac6a.slice/crio-a7db08a5effaf4f91c8d554abc0403da376e8417dc4b0c1f63250f72d50e88d5 WatchSource:0}: Error finding container a7db08a5effaf4f91c8d554abc0403da376e8417dc4b0c1f63250f72d50e88d5: Status 404 returned error can't find the container with id a7db08a5effaf4f91c8d554abc0403da376e8417dc4b0c1f63250f72d50e88d5 Mar 13 14:13:46 crc kubenswrapper[4898]: I0313 14:13:46.930248 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-hmwt2" event={"ID":"84d4e279-f74c-48fd-9514-1a697341ac6a","Type":"ContainerStarted","Data":"a7db08a5effaf4f91c8d554abc0403da376e8417dc4b0c1f63250f72d50e88d5"} Mar 13 14:13:49 crc kubenswrapper[4898]: I0313 14:13:49.955616 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-hmwt2" event={"ID":"84d4e279-f74c-48fd-9514-1a697341ac6a","Type":"ContainerStarted","Data":"ffab93626eb911b918b7ea9fec209fc64c303e83994c82fcd6ec8b826f9cc21f"} Mar 13 14:13:49 crc kubenswrapper[4898]: I0313 14:13:49.980377 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-hmwt2" podStartSLOduration=1.966667137 podStartE2EDuration="4.9803488s" podCreationTimestamp="2026-03-13 14:13:45 +0000 UTC" 
firstStartedPulling="2026-03-13 14:13:46.274486832 +0000 UTC m=+1061.276075071" lastFinishedPulling="2026-03-13 14:13:49.288168495 +0000 UTC m=+1064.289756734" observedRunningTime="2026-03-13 14:13:49.97196283 +0000 UTC m=+1064.973551139" watchObservedRunningTime="2026-03-13 14:13:49.9803488 +0000 UTC m=+1064.981937049" Mar 13 14:13:50 crc kubenswrapper[4898]: I0313 14:13:50.975148 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-m8j8d"] Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:50.976317 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-m8j8d" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:50.985206 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-c8fgd"] Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:50.986943 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-c8fgd" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:50.987616 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-zbnmq" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:50.987622 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.002650 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-c8fgd"] Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.012565 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-m8j8d"] Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.012595 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-fpgr7"] Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.013383 
4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-fpgr7"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.123686 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tqk2\" (UniqueName: \"kubernetes.io/projected/a9193e72-6911-4df4-8b26-04b2537f68a9-kube-api-access-5tqk2\") pod \"nmstate-webhook-5f558f5558-m8j8d\" (UID: \"a9193e72-6911-4df4-8b26-04b2537f68a9\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-m8j8d"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.123763 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/e4761153-ed4e-4264-8f21-b4de31a4bbb8-ovs-socket\") pod \"nmstate-handler-fpgr7\" (UID: \"e4761153-ed4e-4264-8f21-b4de31a4bbb8\") " pod="openshift-nmstate/nmstate-handler-fpgr7"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.123797 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zwrb\" (UniqueName: \"kubernetes.io/projected/e4761153-ed4e-4264-8f21-b4de31a4bbb8-kube-api-access-5zwrb\") pod \"nmstate-handler-fpgr7\" (UID: \"e4761153-ed4e-4264-8f21-b4de31a4bbb8\") " pod="openshift-nmstate/nmstate-handler-fpgr7"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.123864 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfp9h\" (UniqueName: \"kubernetes.io/projected/35105fc0-dff0-4480-8635-cbbeec82d124-kube-api-access-hfp9h\") pod \"nmstate-metrics-9b8c8685d-c8fgd\" (UID: \"35105fc0-dff0-4480-8635-cbbeec82d124\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-c8fgd"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.124086 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/e4761153-ed4e-4264-8f21-b4de31a4bbb8-nmstate-lock\") pod \"nmstate-handler-fpgr7\" (UID: \"e4761153-ed4e-4264-8f21-b4de31a4bbb8\") " pod="openshift-nmstate/nmstate-handler-fpgr7"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.124115 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/e4761153-ed4e-4264-8f21-b4de31a4bbb8-dbus-socket\") pod \"nmstate-handler-fpgr7\" (UID: \"e4761153-ed4e-4264-8f21-b4de31a4bbb8\") " pod="openshift-nmstate/nmstate-handler-fpgr7"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.124159 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a9193e72-6911-4df4-8b26-04b2537f68a9-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-m8j8d\" (UID: \"a9193e72-6911-4df4-8b26-04b2537f68a9\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-m8j8d"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.147445 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-9m8s6"]
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.148549 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9m8s6"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.157374 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.157630 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-49nnv"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.157696 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.168425 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-9m8s6"]
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.225581 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/e4761153-ed4e-4264-8f21-b4de31a4bbb8-ovs-socket\") pod \"nmstate-handler-fpgr7\" (UID: \"e4761153-ed4e-4264-8f21-b4de31a4bbb8\") " pod="openshift-nmstate/nmstate-handler-fpgr7"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.225646 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zwrb\" (UniqueName: \"kubernetes.io/projected/e4761153-ed4e-4264-8f21-b4de31a4bbb8-kube-api-access-5zwrb\") pod \"nmstate-handler-fpgr7\" (UID: \"e4761153-ed4e-4264-8f21-b4de31a4bbb8\") " pod="openshift-nmstate/nmstate-handler-fpgr7"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.225717 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfp9h\" (UniqueName: \"kubernetes.io/projected/35105fc0-dff0-4480-8635-cbbeec82d124-kube-api-access-hfp9h\") pod \"nmstate-metrics-9b8c8685d-c8fgd\" (UID: \"35105fc0-dff0-4480-8635-cbbeec82d124\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-c8fgd"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.225748 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/e4761153-ed4e-4264-8f21-b4de31a4bbb8-nmstate-lock\") pod \"nmstate-handler-fpgr7\" (UID: \"e4761153-ed4e-4264-8f21-b4de31a4bbb8\") " pod="openshift-nmstate/nmstate-handler-fpgr7"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.225780 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/e4761153-ed4e-4264-8f21-b4de31a4bbb8-dbus-socket\") pod \"nmstate-handler-fpgr7\" (UID: \"e4761153-ed4e-4264-8f21-b4de31a4bbb8\") " pod="openshift-nmstate/nmstate-handler-fpgr7"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.225825 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a9193e72-6911-4df4-8b26-04b2537f68a9-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-m8j8d\" (UID: \"a9193e72-6911-4df4-8b26-04b2537f68a9\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-m8j8d"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.225886 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tqk2\" (UniqueName: \"kubernetes.io/projected/a9193e72-6911-4df4-8b26-04b2537f68a9-kube-api-access-5tqk2\") pod \"nmstate-webhook-5f558f5558-m8j8d\" (UID: \"a9193e72-6911-4df4-8b26-04b2537f68a9\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-m8j8d"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.226466 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/e4761153-ed4e-4264-8f21-b4de31a4bbb8-ovs-socket\") pod \"nmstate-handler-fpgr7\" (UID: \"e4761153-ed4e-4264-8f21-b4de31a4bbb8\") " pod="openshift-nmstate/nmstate-handler-fpgr7"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.226812 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/e4761153-ed4e-4264-8f21-b4de31a4bbb8-nmstate-lock\") pod \"nmstate-handler-fpgr7\" (UID: \"e4761153-ed4e-4264-8f21-b4de31a4bbb8\") " pod="openshift-nmstate/nmstate-handler-fpgr7"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.227099 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/e4761153-ed4e-4264-8f21-b4de31a4bbb8-dbus-socket\") pod \"nmstate-handler-fpgr7\" (UID: \"e4761153-ed4e-4264-8f21-b4de31a4bbb8\") " pod="openshift-nmstate/nmstate-handler-fpgr7"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.234952 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a9193e72-6911-4df4-8b26-04b2537f68a9-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-m8j8d\" (UID: \"a9193e72-6911-4df4-8b26-04b2537f68a9\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-m8j8d"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.247107 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zwrb\" (UniqueName: \"kubernetes.io/projected/e4761153-ed4e-4264-8f21-b4de31a4bbb8-kube-api-access-5zwrb\") pod \"nmstate-handler-fpgr7\" (UID: \"e4761153-ed4e-4264-8f21-b4de31a4bbb8\") " pod="openshift-nmstate/nmstate-handler-fpgr7"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.249093 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tqk2\" (UniqueName: \"kubernetes.io/projected/a9193e72-6911-4df4-8b26-04b2537f68a9-kube-api-access-5tqk2\") pod \"nmstate-webhook-5f558f5558-m8j8d\" (UID: \"a9193e72-6911-4df4-8b26-04b2537f68a9\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-m8j8d"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.277636 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfp9h\" (UniqueName: \"kubernetes.io/projected/35105fc0-dff0-4480-8635-cbbeec82d124-kube-api-access-hfp9h\") pod \"nmstate-metrics-9b8c8685d-c8fgd\" (UID: \"35105fc0-dff0-4480-8635-cbbeec82d124\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-c8fgd"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.327563 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b707c4ee-39e1-4fc6-812a-f61e722c1079-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-9m8s6\" (UID: \"b707c4ee-39e1-4fc6-812a-f61e722c1079\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9m8s6"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.327681 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjdb9\" (UniqueName: \"kubernetes.io/projected/b707c4ee-39e1-4fc6-812a-f61e722c1079-kube-api-access-pjdb9\") pod \"nmstate-console-plugin-86f58fcf4-9m8s6\" (UID: \"b707c4ee-39e1-4fc6-812a-f61e722c1079\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9m8s6"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.327704 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b707c4ee-39e1-4fc6-812a-f61e722c1079-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-9m8s6\" (UID: \"b707c4ee-39e1-4fc6-812a-f61e722c1079\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9m8s6"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.339288 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-m8j8d"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.356314 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6ddbb5776b-mx8sz"]
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.357420 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6ddbb5776b-mx8sz"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.359479 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-c8fgd"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.368223 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-fpgr7"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.379421 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6ddbb5776b-mx8sz"]
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.429261 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b707c4ee-39e1-4fc6-812a-f61e722c1079-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-9m8s6\" (UID: \"b707c4ee-39e1-4fc6-812a-f61e722c1079\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9m8s6"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.429428 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjdb9\" (UniqueName: \"kubernetes.io/projected/b707c4ee-39e1-4fc6-812a-f61e722c1079-kube-api-access-pjdb9\") pod \"nmstate-console-plugin-86f58fcf4-9m8s6\" (UID: \"b707c4ee-39e1-4fc6-812a-f61e722c1079\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9m8s6"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.429474 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b707c4ee-39e1-4fc6-812a-f61e722c1079-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-9m8s6\" (UID: \"b707c4ee-39e1-4fc6-812a-f61e722c1079\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9m8s6"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.430861 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b707c4ee-39e1-4fc6-812a-f61e722c1079-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-9m8s6\" (UID: \"b707c4ee-39e1-4fc6-812a-f61e722c1079\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9m8s6"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.433773 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/b707c4ee-39e1-4fc6-812a-f61e722c1079-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-9m8s6\" (UID: \"b707c4ee-39e1-4fc6-812a-f61e722c1079\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9m8s6"
Mar 13 14:13:51 crc kubenswrapper[4898]: W0313 14:13:51.440050 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4761153_ed4e_4264_8f21_b4de31a4bbb8.slice/crio-f3cf77f913a23536b768c77e85d72ab96f7623b941d7622636b49e942dcd6386 WatchSource:0}: Error finding container f3cf77f913a23536b768c77e85d72ab96f7623b941d7622636b49e942dcd6386: Status 404 returned error can't find the container with id f3cf77f913a23536b768c77e85d72ab96f7623b941d7622636b49e942dcd6386
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.452806 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjdb9\" (UniqueName: \"kubernetes.io/projected/b707c4ee-39e1-4fc6-812a-f61e722c1079-kube-api-access-pjdb9\") pod \"nmstate-console-plugin-86f58fcf4-9m8s6\" (UID: \"b707c4ee-39e1-4fc6-812a-f61e722c1079\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9m8s6"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.471092 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9m8s6"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.530843 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-trusted-ca-bundle\") pod \"console-6ddbb5776b-mx8sz\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " pod="openshift-console/console-6ddbb5776b-mx8sz"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.530891 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-oauth-serving-cert\") pod \"console-6ddbb5776b-mx8sz\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " pod="openshift-console/console-6ddbb5776b-mx8sz"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.531096 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98jnr\" (UniqueName: \"kubernetes.io/projected/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-kube-api-access-98jnr\") pod \"console-6ddbb5776b-mx8sz\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " pod="openshift-console/console-6ddbb5776b-mx8sz"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.531151 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-console-config\") pod \"console-6ddbb5776b-mx8sz\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " pod="openshift-console/console-6ddbb5776b-mx8sz"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.531173 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-service-ca\") pod \"console-6ddbb5776b-mx8sz\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " pod="openshift-console/console-6ddbb5776b-mx8sz"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.531210 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-console-serving-cert\") pod \"console-6ddbb5776b-mx8sz\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " pod="openshift-console/console-6ddbb5776b-mx8sz"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.531262 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-console-oauth-config\") pod \"console-6ddbb5776b-mx8sz\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " pod="openshift-console/console-6ddbb5776b-mx8sz"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.633016 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98jnr\" (UniqueName: \"kubernetes.io/projected/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-kube-api-access-98jnr\") pod \"console-6ddbb5776b-mx8sz\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " pod="openshift-console/console-6ddbb5776b-mx8sz"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.633112 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-console-config\") pod \"console-6ddbb5776b-mx8sz\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " pod="openshift-console/console-6ddbb5776b-mx8sz"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.633139 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-service-ca\") pod \"console-6ddbb5776b-mx8sz\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " pod="openshift-console/console-6ddbb5776b-mx8sz"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.633187 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-console-serving-cert\") pod \"console-6ddbb5776b-mx8sz\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " pod="openshift-console/console-6ddbb5776b-mx8sz"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.633250 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-console-oauth-config\") pod \"console-6ddbb5776b-mx8sz\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " pod="openshift-console/console-6ddbb5776b-mx8sz"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.633284 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-trusted-ca-bundle\") pod \"console-6ddbb5776b-mx8sz\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " pod="openshift-console/console-6ddbb5776b-mx8sz"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.633305 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-oauth-serving-cert\") pod \"console-6ddbb5776b-mx8sz\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " pod="openshift-console/console-6ddbb5776b-mx8sz"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.634428 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-service-ca\") pod \"console-6ddbb5776b-mx8sz\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " pod="openshift-console/console-6ddbb5776b-mx8sz"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.634727 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-console-config\") pod \"console-6ddbb5776b-mx8sz\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " pod="openshift-console/console-6ddbb5776b-mx8sz"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.635220 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-oauth-serving-cert\") pod \"console-6ddbb5776b-mx8sz\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " pod="openshift-console/console-6ddbb5776b-mx8sz"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.635245 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-trusted-ca-bundle\") pod \"console-6ddbb5776b-mx8sz\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " pod="openshift-console/console-6ddbb5776b-mx8sz"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.649226 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-console-serving-cert\") pod \"console-6ddbb5776b-mx8sz\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " pod="openshift-console/console-6ddbb5776b-mx8sz"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.649412 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-console-oauth-config\") pod \"console-6ddbb5776b-mx8sz\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " pod="openshift-console/console-6ddbb5776b-mx8sz"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.652343 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98jnr\" (UniqueName: \"kubernetes.io/projected/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-kube-api-access-98jnr\") pod \"console-6ddbb5776b-mx8sz\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " pod="openshift-console/console-6ddbb5776b-mx8sz"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.733787 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6ddbb5776b-mx8sz"
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.879067 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-m8j8d"]
Mar 13 14:13:51 crc kubenswrapper[4898]: W0313 14:13:51.889148 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9193e72_6911_4df4_8b26_04b2537f68a9.slice/crio-c11c0b93f3e43194c0a8f9c279c5835e0bb1a5fbe1e76e42b208c898761e9079 WatchSource:0}: Error finding container c11c0b93f3e43194c0a8f9c279c5835e0bb1a5fbe1e76e42b208c898761e9079: Status 404 returned error can't find the container with id c11c0b93f3e43194c0a8f9c279c5835e0bb1a5fbe1e76e42b208c898761e9079
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.926574 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-c8fgd"]
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.980187 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-m8j8d" event={"ID":"a9193e72-6911-4df4-8b26-04b2537f68a9","Type":"ContainerStarted","Data":"c11c0b93f3e43194c0a8f9c279c5835e0bb1a5fbe1e76e42b208c898761e9079"}
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.982370 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-fpgr7" event={"ID":"e4761153-ed4e-4264-8f21-b4de31a4bbb8","Type":"ContainerStarted","Data":"f3cf77f913a23536b768c77e85d72ab96f7623b941d7622636b49e942dcd6386"}
Mar 13 14:13:51 crc kubenswrapper[4898]: I0313 14:13:51.983432 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-c8fgd" event={"ID":"35105fc0-dff0-4480-8635-cbbeec82d124","Type":"ContainerStarted","Data":"d8eb96047c72aec16ebf70fe77d6b642481a3aa674e6c39193660ebe147346e9"}
Mar 13 14:13:52 crc kubenswrapper[4898]: I0313 14:13:52.013824 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-9m8s6"]
Mar 13 14:13:52 crc kubenswrapper[4898]: I0313 14:13:52.180610 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6ddbb5776b-mx8sz"]
Mar 13 14:13:52 crc kubenswrapper[4898]: W0313 14:13:52.184514 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f6fd2de_efa6_4d17_aa5e_4f44ced1f822.slice/crio-25f0f90d86df62bf31d201c4dfd7dca1ef0e3998bd4fd076756d1a9449231afa WatchSource:0}: Error finding container 25f0f90d86df62bf31d201c4dfd7dca1ef0e3998bd4fd076756d1a9449231afa: Status 404 returned error can't find the container with id 25f0f90d86df62bf31d201c4dfd7dca1ef0e3998bd4fd076756d1a9449231afa
Mar 13 14:13:52 crc kubenswrapper[4898]: I0313 14:13:52.994938 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9m8s6" event={"ID":"b707c4ee-39e1-4fc6-812a-f61e722c1079","Type":"ContainerStarted","Data":"4f5b664ce3a49a4a9347e418a7f4e569dd9d364a61493834d164b5cf58547792"}
Mar 13 14:13:52 crc kubenswrapper[4898]: I0313 14:13:52.996219 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6ddbb5776b-mx8sz" event={"ID":"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822","Type":"ContainerStarted","Data":"5fefdf0ffc03648b76b936160e7ddb5fac4056b104b304ca9509ff6217a2c4fc"}
Mar 13 14:13:52 crc kubenswrapper[4898]: I0313 14:13:52.996261 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6ddbb5776b-mx8sz" event={"ID":"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822","Type":"ContainerStarted","Data":"25f0f90d86df62bf31d201c4dfd7dca1ef0e3998bd4fd076756d1a9449231afa"}
Mar 13 14:13:53 crc kubenswrapper[4898]: I0313 14:13:53.023424 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6ddbb5776b-mx8sz" podStartSLOduration=2.023407525 podStartE2EDuration="2.023407525s" podCreationTimestamp="2026-03-13 14:13:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:13:53.021328751 +0000 UTC m=+1068.022917000" watchObservedRunningTime="2026-03-13 14:13:53.023407525 +0000 UTC m=+1068.024995774"
Mar 13 14:13:56 crc kubenswrapper[4898]: I0313 14:13:56.049580 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-fpgr7" event={"ID":"e4761153-ed4e-4264-8f21-b4de31a4bbb8","Type":"ContainerStarted","Data":"65a143f64e3c36ce5848d2ec35e4a19d110bc74a2f71a90fed40c46bcaecaa29"}
Mar 13 14:13:56 crc kubenswrapper[4898]: I0313 14:13:56.049943 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-fpgr7"
Mar 13 14:13:56 crc kubenswrapper[4898]: I0313 14:13:56.051348 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9m8s6" event={"ID":"b707c4ee-39e1-4fc6-812a-f61e722c1079","Type":"ContainerStarted","Data":"67a5158448d0e030c1a30934e7dd23db401ce35e94a196becdb13aa6078a4b98"}
Mar 13 14:13:56 crc kubenswrapper[4898]: I0313 14:13:56.053727 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-c8fgd" event={"ID":"35105fc0-dff0-4480-8635-cbbeec82d124","Type":"ContainerStarted","Data":"3e7a7b37583b1e5e3b8f111f7438b8d95b9b9b3c1c267e26122804512d690869"}
Mar 13 14:13:56 crc kubenswrapper[4898]: I0313 14:13:56.054887 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-m8j8d" event={"ID":"a9193e72-6911-4df4-8b26-04b2537f68a9","Type":"ContainerStarted","Data":"b3f66f08bca8f7bcd172456023ef18bc5bdca02a16923c5dda15fb815f41cda5"}
Mar 13 14:13:56 crc kubenswrapper[4898]: I0313 14:13:56.055069 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-m8j8d"
Mar 13 14:13:56 crc kubenswrapper[4898]: I0313 14:13:56.092379 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-fpgr7" podStartSLOduration=2.292144794 podStartE2EDuration="6.092359761s" podCreationTimestamp="2026-03-13 14:13:50 +0000 UTC" firstStartedPulling="2026-03-13 14:13:51.454966881 +0000 UTC m=+1066.456555120" lastFinishedPulling="2026-03-13 14:13:55.255181848 +0000 UTC m=+1070.256770087" observedRunningTime="2026-03-13 14:13:56.087346779 +0000 UTC m=+1071.088935038" watchObservedRunningTime="2026-03-13 14:13:56.092359761 +0000 UTC m=+1071.093948000"
Mar 13 14:13:56 crc kubenswrapper[4898]: I0313 14:13:56.121494 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-m8j8d" podStartSLOduration=2.788425762 podStartE2EDuration="6.121467025s" podCreationTimestamp="2026-03-13 14:13:50 +0000 UTC" firstStartedPulling="2026-03-13 14:13:51.89305419 +0000 UTC m=+1066.894642429" lastFinishedPulling="2026-03-13 14:13:55.226095453 +0000 UTC m=+1070.227683692" observedRunningTime="2026-03-13 14:13:56.112183211 +0000 UTC m=+1071.113771450" watchObservedRunningTime="2026-03-13 14:13:56.121467025 +0000 UTC m=+1071.123055274"
Mar 13 14:13:56 crc kubenswrapper[4898]: I0313 14:13:56.133935 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9m8s6" podStartSLOduration=1.92650786 podStartE2EDuration="5.133883152s" podCreationTimestamp="2026-03-13 14:13:51 +0000 UTC" firstStartedPulling="2026-03-13 14:13:52.017812968 +0000 UTC m=+1067.019401207" lastFinishedPulling="2026-03-13 14:13:55.22518826 +0000 UTC m=+1070.226776499" observedRunningTime="2026-03-13 14:13:56.127015451 +0000 UTC m=+1071.128603690" watchObservedRunningTime="2026-03-13 14:13:56.133883152 +0000 UTC m=+1071.135471401"
Mar 13 14:13:59 crc kubenswrapper[4898]: I0313 14:13:59.103418 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-c8fgd" event={"ID":"35105fc0-dff0-4480-8635-cbbeec82d124","Type":"ContainerStarted","Data":"b3ef85f675b604ef99b0a9b009e4b0d613504667c6746bbb6830e5276d1293d7"}
Mar 13 14:13:59 crc kubenswrapper[4898]: I0313 14:13:59.130582 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-c8fgd" podStartSLOduration=2.72853263 podStartE2EDuration="9.130553609s" podCreationTimestamp="2026-03-13 14:13:50 +0000 UTC" firstStartedPulling="2026-03-13 14:13:51.936743408 +0000 UTC m=+1066.938331647" lastFinishedPulling="2026-03-13 14:13:58.338764387 +0000 UTC m=+1073.340352626" observedRunningTime="2026-03-13 14:13:59.12717982 +0000 UTC m=+1074.128768109" watchObservedRunningTime="2026-03-13 14:13:59.130553609 +0000 UTC m=+1074.132141898"
Mar 13 14:14:00 crc kubenswrapper[4898]: I0313 14:14:00.139644 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556854-6mxhz"]
Mar 13 14:14:00 crc kubenswrapper[4898]: I0313 14:14:00.141266 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556854-6mxhz"
Mar 13 14:14:00 crc kubenswrapper[4898]: I0313 14:14:00.144175 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 14:14:00 crc kubenswrapper[4898]: I0313 14:14:00.144415 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 14:14:00 crc kubenswrapper[4898]: I0313 14:14:00.145632 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps"
Mar 13 14:14:00 crc kubenswrapper[4898]: I0313 14:14:00.153322 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556854-6mxhz"]
Mar 13 14:14:00 crc kubenswrapper[4898]: I0313 14:14:00.305539 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz5zd\" (UniqueName: \"kubernetes.io/projected/35372caa-772c-434c-8fb2-3b82926c1521-kube-api-access-xz5zd\") pod \"auto-csr-approver-29556854-6mxhz\" (UID: \"35372caa-772c-434c-8fb2-3b82926c1521\") " pod="openshift-infra/auto-csr-approver-29556854-6mxhz"
Mar 13 14:14:00 crc kubenswrapper[4898]: I0313 14:14:00.407501 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz5zd\" (UniqueName: \"kubernetes.io/projected/35372caa-772c-434c-8fb2-3b82926c1521-kube-api-access-xz5zd\") pod \"auto-csr-approver-29556854-6mxhz\" (UID: \"35372caa-772c-434c-8fb2-3b82926c1521\") " pod="openshift-infra/auto-csr-approver-29556854-6mxhz"
Mar 13 14:14:00 crc kubenswrapper[4898]: I0313 14:14:00.429843 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz5zd\" (UniqueName: \"kubernetes.io/projected/35372caa-772c-434c-8fb2-3b82926c1521-kube-api-access-xz5zd\") pod \"auto-csr-approver-29556854-6mxhz\" (UID: \"35372caa-772c-434c-8fb2-3b82926c1521\") " pod="openshift-infra/auto-csr-approver-29556854-6mxhz"
Mar 13 14:14:00 crc kubenswrapper[4898]: I0313 14:14:00.464434 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556854-6mxhz"
Mar 13 14:14:00 crc kubenswrapper[4898]: I0313 14:14:00.941758 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556854-6mxhz"]
Mar 13 14:14:01 crc kubenswrapper[4898]: I0313 14:14:01.122036 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556854-6mxhz" event={"ID":"35372caa-772c-434c-8fb2-3b82926c1521","Type":"ContainerStarted","Data":"12a8614ae5db33ffe748f5edebb3d67a71dd9d1797f096a66e2d95044041c69a"}
Mar 13 14:14:01 crc kubenswrapper[4898]: I0313 14:14:01.404146 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-fpgr7"
Mar 13 14:14:01 crc kubenswrapper[4898]: I0313 14:14:01.735039 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6ddbb5776b-mx8sz"
Mar 13 14:14:01 crc kubenswrapper[4898]: I0313 14:14:01.735141 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6ddbb5776b-mx8sz"
Mar 13 14:14:01 crc kubenswrapper[4898]: I0313 14:14:01.750834 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6ddbb5776b-mx8sz"
Mar 13 14:14:02 crc kubenswrapper[4898]: I0313 14:14:02.129284 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556854-6mxhz" event={"ID":"35372caa-772c-434c-8fb2-3b82926c1521","Type":"ContainerStarted","Data":"ad86fe4efa1fa3496cbed8d6aa93dada393bba49f5fe2d8062f7e0508875ea38"}
Mar 13 14:14:02 crc kubenswrapper[4898]: I0313 14:14:02.134220 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6ddbb5776b-mx8sz"
Mar 13 14:14:02 crc kubenswrapper[4898]: I0313 14:14:02.143963 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556854-6mxhz" podStartSLOduration=1.354426773 podStartE2EDuration="2.143938114s" podCreationTimestamp="2026-03-13 14:14:00 +0000 UTC" firstStartedPulling="2026-03-13 14:14:00.948357255 +0000 UTC m=+1075.949945494" lastFinishedPulling="2026-03-13 14:14:01.737868596 +0000 UTC m=+1076.739456835" observedRunningTime="2026-03-13 14:14:02.14072379 +0000 UTC m=+1077.142312029" watchObservedRunningTime="2026-03-13 14:14:02.143938114 +0000 UTC m=+1077.145526353"
Mar 13 14:14:02 crc kubenswrapper[4898]: I0313 14:14:02.207786 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-758c8fb5b-pxts9"]
Mar 13 14:14:03 crc kubenswrapper[4898]: I0313 14:14:03.136530 4898 generic.go:334] "Generic (PLEG): container finished" podID="35372caa-772c-434c-8fb2-3b82926c1521" containerID="ad86fe4efa1fa3496cbed8d6aa93dada393bba49f5fe2d8062f7e0508875ea38" exitCode=0
Mar 13 14:14:03 crc kubenswrapper[4898]: I0313 14:14:03.136644 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556854-6mxhz" event={"ID":"35372caa-772c-434c-8fb2-3b82926c1521","Type":"ContainerDied","Data":"ad86fe4efa1fa3496cbed8d6aa93dada393bba49f5fe2d8062f7e0508875ea38"}
Mar 13 14:14:04 crc kubenswrapper[4898]: I0313 14:14:04.481340 4898 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556854-6mxhz" Mar 13 14:14:04 crc kubenswrapper[4898]: I0313 14:14:04.584396 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xz5zd\" (UniqueName: \"kubernetes.io/projected/35372caa-772c-434c-8fb2-3b82926c1521-kube-api-access-xz5zd\") pod \"35372caa-772c-434c-8fb2-3b82926c1521\" (UID: \"35372caa-772c-434c-8fb2-3b82926c1521\") " Mar 13 14:14:04 crc kubenswrapper[4898]: I0313 14:14:04.592891 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35372caa-772c-434c-8fb2-3b82926c1521-kube-api-access-xz5zd" (OuterVolumeSpecName: "kube-api-access-xz5zd") pod "35372caa-772c-434c-8fb2-3b82926c1521" (UID: "35372caa-772c-434c-8fb2-3b82926c1521"). InnerVolumeSpecName "kube-api-access-xz5zd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:14:04 crc kubenswrapper[4898]: I0313 14:14:04.686614 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xz5zd\" (UniqueName: \"kubernetes.io/projected/35372caa-772c-434c-8fb2-3b82926c1521-kube-api-access-xz5zd\") on node \"crc\" DevicePath \"\"" Mar 13 14:14:05 crc kubenswrapper[4898]: I0313 14:14:05.156669 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556854-6mxhz" event={"ID":"35372caa-772c-434c-8fb2-3b82926c1521","Type":"ContainerDied","Data":"12a8614ae5db33ffe748f5edebb3d67a71dd9d1797f096a66e2d95044041c69a"} Mar 13 14:14:05 crc kubenswrapper[4898]: I0313 14:14:05.157054 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12a8614ae5db33ffe748f5edebb3d67a71dd9d1797f096a66e2d95044041c69a" Mar 13 14:14:05 crc kubenswrapper[4898]: I0313 14:14:05.156750 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556854-6mxhz" Mar 13 14:14:05 crc kubenswrapper[4898]: I0313 14:14:05.216487 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556848-wlplx"] Mar 13 14:14:05 crc kubenswrapper[4898]: I0313 14:14:05.225686 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556848-wlplx"] Mar 13 14:14:05 crc kubenswrapper[4898]: I0313 14:14:05.754900 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe4a848e-c06e-4205-a1a6-8b14b620096c" path="/var/lib/kubelet/pods/fe4a848e-c06e-4205-a1a6-8b14b620096c/volumes" Mar 13 14:14:11 crc kubenswrapper[4898]: I0313 14:14:11.347926 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-m8j8d" Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.253133 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-758c8fb5b-pxts9" podUID="571e1a76-1585-4c39-887c-d9c3f735a908" containerName="console" containerID="cri-o://4cfca99c86a53c5141c727b6fd37b0d688489277e5d5aa3a145d30faadc4d08d" gracePeriod=15 Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.683607 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-758c8fb5b-pxts9_571e1a76-1585-4c39-887c-d9c3f735a908/console/0.log" Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.683668 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-758c8fb5b-pxts9" Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.840604 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-console-config\") pod \"571e1a76-1585-4c39-887c-d9c3f735a908\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.840843 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twb6p\" (UniqueName: \"kubernetes.io/projected/571e1a76-1585-4c39-887c-d9c3f735a908-kube-api-access-twb6p\") pod \"571e1a76-1585-4c39-887c-d9c3f735a908\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.840889 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-service-ca\") pod \"571e1a76-1585-4c39-887c-d9c3f735a908\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.840969 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-trusted-ca-bundle\") pod \"571e1a76-1585-4c39-887c-d9c3f735a908\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.840986 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-oauth-serving-cert\") pod \"571e1a76-1585-4c39-887c-d9c3f735a908\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.841007 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/571e1a76-1585-4c39-887c-d9c3f735a908-console-oauth-config\") pod \"571e1a76-1585-4c39-887c-d9c3f735a908\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.841030 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/571e1a76-1585-4c39-887c-d9c3f735a908-console-serving-cert\") pod \"571e1a76-1585-4c39-887c-d9c3f735a908\" (UID: \"571e1a76-1585-4c39-887c-d9c3f735a908\") " Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.841428 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "571e1a76-1585-4c39-887c-d9c3f735a908" (UID: "571e1a76-1585-4c39-887c-d9c3f735a908"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.841450 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "571e1a76-1585-4c39-887c-d9c3f735a908" (UID: "571e1a76-1585-4c39-887c-d9c3f735a908"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.841489 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-console-config" (OuterVolumeSpecName: "console-config") pod "571e1a76-1585-4c39-887c-d9c3f735a908" (UID: "571e1a76-1585-4c39-887c-d9c3f735a908"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.841693 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-service-ca" (OuterVolumeSpecName: "service-ca") pod "571e1a76-1585-4c39-887c-d9c3f735a908" (UID: "571e1a76-1585-4c39-887c-d9c3f735a908"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.845398 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/571e1a76-1585-4c39-887c-d9c3f735a908-kube-api-access-twb6p" (OuterVolumeSpecName: "kube-api-access-twb6p") pod "571e1a76-1585-4c39-887c-d9c3f735a908" (UID: "571e1a76-1585-4c39-887c-d9c3f735a908"). InnerVolumeSpecName "kube-api-access-twb6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.852079 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/571e1a76-1585-4c39-887c-d9c3f735a908-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "571e1a76-1585-4c39-887c-d9c3f735a908" (UID: "571e1a76-1585-4c39-887c-d9c3f735a908"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.863189 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/571e1a76-1585-4c39-887c-d9c3f735a908-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "571e1a76-1585-4c39-887c-d9c3f735a908" (UID: "571e1a76-1585-4c39-887c-d9c3f735a908"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.945326 4898 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.945365 4898 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.945379 4898 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.945389 4898 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/571e1a76-1585-4c39-887c-d9c3f735a908-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.945401 4898 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/571e1a76-1585-4c39-887c-d9c3f735a908-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.945414 4898 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/571e1a76-1585-4c39-887c-d9c3f735a908-console-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:14:27 crc kubenswrapper[4898]: I0313 14:14:27.945426 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twb6p\" (UniqueName: \"kubernetes.io/projected/571e1a76-1585-4c39-887c-d9c3f735a908-kube-api-access-twb6p\") on node \"crc\" DevicePath \"\"" Mar 13 14:14:28 crc 
kubenswrapper[4898]: I0313 14:14:28.367066 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-758c8fb5b-pxts9_571e1a76-1585-4c39-887c-d9c3f735a908/console/0.log" Mar 13 14:14:28 crc kubenswrapper[4898]: I0313 14:14:28.367123 4898 generic.go:334] "Generic (PLEG): container finished" podID="571e1a76-1585-4c39-887c-d9c3f735a908" containerID="4cfca99c86a53c5141c727b6fd37b0d688489277e5d5aa3a145d30faadc4d08d" exitCode=2 Mar 13 14:14:28 crc kubenswrapper[4898]: I0313 14:14:28.367166 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-758c8fb5b-pxts9" event={"ID":"571e1a76-1585-4c39-887c-d9c3f735a908","Type":"ContainerDied","Data":"4cfca99c86a53c5141c727b6fd37b0d688489277e5d5aa3a145d30faadc4d08d"} Mar 13 14:14:28 crc kubenswrapper[4898]: I0313 14:14:28.367199 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-758c8fb5b-pxts9" Mar 13 14:14:28 crc kubenswrapper[4898]: I0313 14:14:28.367225 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-758c8fb5b-pxts9" event={"ID":"571e1a76-1585-4c39-887c-d9c3f735a908","Type":"ContainerDied","Data":"4b09f73c7fa831fe94f3a344d5bf8593ff107c618a4ee0a2a0be061afa612208"} Mar 13 14:14:28 crc kubenswrapper[4898]: I0313 14:14:28.367248 4898 scope.go:117] "RemoveContainer" containerID="4cfca99c86a53c5141c727b6fd37b0d688489277e5d5aa3a145d30faadc4d08d" Mar 13 14:14:28 crc kubenswrapper[4898]: I0313 14:14:28.402024 4898 scope.go:117] "RemoveContainer" containerID="4cfca99c86a53c5141c727b6fd37b0d688489277e5d5aa3a145d30faadc4d08d" Mar 13 14:14:28 crc kubenswrapper[4898]: E0313 14:14:28.402442 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cfca99c86a53c5141c727b6fd37b0d688489277e5d5aa3a145d30faadc4d08d\": container with ID starting with 4cfca99c86a53c5141c727b6fd37b0d688489277e5d5aa3a145d30faadc4d08d not 
found: ID does not exist" containerID="4cfca99c86a53c5141c727b6fd37b0d688489277e5d5aa3a145d30faadc4d08d" Mar 13 14:14:28 crc kubenswrapper[4898]: I0313 14:14:28.402491 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cfca99c86a53c5141c727b6fd37b0d688489277e5d5aa3a145d30faadc4d08d"} err="failed to get container status \"4cfca99c86a53c5141c727b6fd37b0d688489277e5d5aa3a145d30faadc4d08d\": rpc error: code = NotFound desc = could not find container \"4cfca99c86a53c5141c727b6fd37b0d688489277e5d5aa3a145d30faadc4d08d\": container with ID starting with 4cfca99c86a53c5141c727b6fd37b0d688489277e5d5aa3a145d30faadc4d08d not found: ID does not exist" Mar 13 14:14:28 crc kubenswrapper[4898]: I0313 14:14:28.414874 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-758c8fb5b-pxts9"] Mar 13 14:14:28 crc kubenswrapper[4898]: I0313 14:14:28.423124 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-758c8fb5b-pxts9"] Mar 13 14:14:29 crc kubenswrapper[4898]: I0313 14:14:29.754103 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="571e1a76-1585-4c39-887c-d9c3f735a908" path="/var/lib/kubelet/pods/571e1a76-1585-4c39-887c-d9c3f735a908/volumes" Mar 13 14:14:29 crc kubenswrapper[4898]: I0313 14:14:29.957306 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8"] Mar 13 14:14:29 crc kubenswrapper[4898]: E0313 14:14:29.957675 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="571e1a76-1585-4c39-887c-d9c3f735a908" containerName="console" Mar 13 14:14:29 crc kubenswrapper[4898]: I0313 14:14:29.957696 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="571e1a76-1585-4c39-887c-d9c3f735a908" containerName="console" Mar 13 14:14:29 crc kubenswrapper[4898]: E0313 14:14:29.957722 4898 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="35372caa-772c-434c-8fb2-3b82926c1521" containerName="oc" Mar 13 14:14:29 crc kubenswrapper[4898]: I0313 14:14:29.957732 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="35372caa-772c-434c-8fb2-3b82926c1521" containerName="oc" Mar 13 14:14:29 crc kubenswrapper[4898]: I0313 14:14:29.957937 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="35372caa-772c-434c-8fb2-3b82926c1521" containerName="oc" Mar 13 14:14:29 crc kubenswrapper[4898]: I0313 14:14:29.957964 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="571e1a76-1585-4c39-887c-d9c3f735a908" containerName="console" Mar 13 14:14:29 crc kubenswrapper[4898]: I0313 14:14:29.959684 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8" Mar 13 14:14:29 crc kubenswrapper[4898]: I0313 14:14:29.963143 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 13 14:14:29 crc kubenswrapper[4898]: I0313 14:14:29.975457 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8"] Mar 13 14:14:29 crc kubenswrapper[4898]: I0313 14:14:29.976512 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53fae31e-a97e-443d-88c2-fa38af842855-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8\" (UID: \"53fae31e-a97e-443d-88c2-fa38af842855\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8" Mar 13 14:14:29 crc kubenswrapper[4898]: I0313 14:14:29.976593 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrlvt\" (UniqueName: 
\"kubernetes.io/projected/53fae31e-a97e-443d-88c2-fa38af842855-kube-api-access-jrlvt\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8\" (UID: \"53fae31e-a97e-443d-88c2-fa38af842855\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8" Mar 13 14:14:29 crc kubenswrapper[4898]: I0313 14:14:29.976780 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53fae31e-a97e-443d-88c2-fa38af842855-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8\" (UID: \"53fae31e-a97e-443d-88c2-fa38af842855\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8" Mar 13 14:14:30 crc kubenswrapper[4898]: I0313 14:14:30.077809 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53fae31e-a97e-443d-88c2-fa38af842855-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8\" (UID: \"53fae31e-a97e-443d-88c2-fa38af842855\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8" Mar 13 14:14:30 crc kubenswrapper[4898]: I0313 14:14:30.077894 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53fae31e-a97e-443d-88c2-fa38af842855-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8\" (UID: \"53fae31e-a97e-443d-88c2-fa38af842855\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8" Mar 13 14:14:30 crc kubenswrapper[4898]: I0313 14:14:30.077940 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrlvt\" (UniqueName: \"kubernetes.io/projected/53fae31e-a97e-443d-88c2-fa38af842855-kube-api-access-jrlvt\") pod 
\"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8\" (UID: \"53fae31e-a97e-443d-88c2-fa38af842855\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8" Mar 13 14:14:30 crc kubenswrapper[4898]: I0313 14:14:30.078556 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53fae31e-a97e-443d-88c2-fa38af842855-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8\" (UID: \"53fae31e-a97e-443d-88c2-fa38af842855\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8" Mar 13 14:14:30 crc kubenswrapper[4898]: I0313 14:14:30.078581 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53fae31e-a97e-443d-88c2-fa38af842855-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8\" (UID: \"53fae31e-a97e-443d-88c2-fa38af842855\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8" Mar 13 14:14:30 crc kubenswrapper[4898]: I0313 14:14:30.100664 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrlvt\" (UniqueName: \"kubernetes.io/projected/53fae31e-a97e-443d-88c2-fa38af842855-kube-api-access-jrlvt\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8\" (UID: \"53fae31e-a97e-443d-88c2-fa38af842855\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8" Mar 13 14:14:30 crc kubenswrapper[4898]: I0313 14:14:30.283830 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8" Mar 13 14:14:30 crc kubenswrapper[4898]: I0313 14:14:30.739223 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8"] Mar 13 14:14:31 crc kubenswrapper[4898]: I0313 14:14:31.396014 4898 generic.go:334] "Generic (PLEG): container finished" podID="53fae31e-a97e-443d-88c2-fa38af842855" containerID="429d1ad392a73aabf01acc7812c562b1be79f59b798457e6e1a695312a7b362d" exitCode=0 Mar 13 14:14:31 crc kubenswrapper[4898]: I0313 14:14:31.396423 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8" event={"ID":"53fae31e-a97e-443d-88c2-fa38af842855","Type":"ContainerDied","Data":"429d1ad392a73aabf01acc7812c562b1be79f59b798457e6e1a695312a7b362d"} Mar 13 14:14:31 crc kubenswrapper[4898]: I0313 14:14:31.396479 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8" event={"ID":"53fae31e-a97e-443d-88c2-fa38af842855","Type":"ContainerStarted","Data":"323d3e074b5c24a04b5c3658c6de9f8d795ecd9605eb61941d76763b2c712df3"} Mar 13 14:14:32 crc kubenswrapper[4898]: I0313 14:14:32.632307 4898 scope.go:117] "RemoveContainer" containerID="e5c3875fd4b0ad4fd5d4afba4c88238837f0d8b510bd53eb8f51d4cd510b00e3" Mar 13 14:14:33 crc kubenswrapper[4898]: I0313 14:14:33.413339 4898 generic.go:334] "Generic (PLEG): container finished" podID="53fae31e-a97e-443d-88c2-fa38af842855" containerID="4012c0833e1dcb8cb1a98eb3cbe1c9311d9227c19a981851d7fa70805d402a65" exitCode=0 Mar 13 14:14:33 crc kubenswrapper[4898]: I0313 14:14:33.413391 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8" 
event={"ID":"53fae31e-a97e-443d-88c2-fa38af842855","Type":"ContainerDied","Data":"4012c0833e1dcb8cb1a98eb3cbe1c9311d9227c19a981851d7fa70805d402a65"} Mar 13 14:14:34 crc kubenswrapper[4898]: I0313 14:14:34.427573 4898 generic.go:334] "Generic (PLEG): container finished" podID="53fae31e-a97e-443d-88c2-fa38af842855" containerID="45d423447e381b81b290026e1eb4ed79374436f22937a6fac96020fc29152f5d" exitCode=0 Mar 13 14:14:34 crc kubenswrapper[4898]: I0313 14:14:34.427663 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8" event={"ID":"53fae31e-a97e-443d-88c2-fa38af842855","Type":"ContainerDied","Data":"45d423447e381b81b290026e1eb4ed79374436f22937a6fac96020fc29152f5d"} Mar 13 14:14:35 crc kubenswrapper[4898]: I0313 14:14:35.763682 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8" Mar 13 14:14:35 crc kubenswrapper[4898]: I0313 14:14:35.881625 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53fae31e-a97e-443d-88c2-fa38af842855-bundle\") pod \"53fae31e-a97e-443d-88c2-fa38af842855\" (UID: \"53fae31e-a97e-443d-88c2-fa38af842855\") " Mar 13 14:14:35 crc kubenswrapper[4898]: I0313 14:14:35.881717 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53fae31e-a97e-443d-88c2-fa38af842855-util\") pod \"53fae31e-a97e-443d-88c2-fa38af842855\" (UID: \"53fae31e-a97e-443d-88c2-fa38af842855\") " Mar 13 14:14:35 crc kubenswrapper[4898]: I0313 14:14:35.881803 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrlvt\" (UniqueName: \"kubernetes.io/projected/53fae31e-a97e-443d-88c2-fa38af842855-kube-api-access-jrlvt\") pod \"53fae31e-a97e-443d-88c2-fa38af842855\" (UID: 
\"53fae31e-a97e-443d-88c2-fa38af842855\") " Mar 13 14:14:35 crc kubenswrapper[4898]: I0313 14:14:35.883088 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53fae31e-a97e-443d-88c2-fa38af842855-bundle" (OuterVolumeSpecName: "bundle") pod "53fae31e-a97e-443d-88c2-fa38af842855" (UID: "53fae31e-a97e-443d-88c2-fa38af842855"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:14:35 crc kubenswrapper[4898]: I0313 14:14:35.887404 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53fae31e-a97e-443d-88c2-fa38af842855-kube-api-access-jrlvt" (OuterVolumeSpecName: "kube-api-access-jrlvt") pod "53fae31e-a97e-443d-88c2-fa38af842855" (UID: "53fae31e-a97e-443d-88c2-fa38af842855"). InnerVolumeSpecName "kube-api-access-jrlvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:14:35 crc kubenswrapper[4898]: I0313 14:14:35.897027 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53fae31e-a97e-443d-88c2-fa38af842855-util" (OuterVolumeSpecName: "util") pod "53fae31e-a97e-443d-88c2-fa38af842855" (UID: "53fae31e-a97e-443d-88c2-fa38af842855"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:14:35 crc kubenswrapper[4898]: I0313 14:14:35.983821 4898 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/53fae31e-a97e-443d-88c2-fa38af842855-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:14:35 crc kubenswrapper[4898]: I0313 14:14:35.983862 4898 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/53fae31e-a97e-443d-88c2-fa38af842855-util\") on node \"crc\" DevicePath \"\"" Mar 13 14:14:35 crc kubenswrapper[4898]: I0313 14:14:35.983876 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrlvt\" (UniqueName: \"kubernetes.io/projected/53fae31e-a97e-443d-88c2-fa38af842855-kube-api-access-jrlvt\") on node \"crc\" DevicePath \"\"" Mar 13 14:14:36 crc kubenswrapper[4898]: I0313 14:14:36.452332 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8" event={"ID":"53fae31e-a97e-443d-88c2-fa38af842855","Type":"ContainerDied","Data":"323d3e074b5c24a04b5c3658c6de9f8d795ecd9605eb61941d76763b2c712df3"} Mar 13 14:14:36 crc kubenswrapper[4898]: I0313 14:14:36.452403 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="323d3e074b5c24a04b5c3658c6de9f8d795ecd9605eb61941d76763b2c712df3" Mar 13 14:14:36 crc kubenswrapper[4898]: I0313 14:14:36.452415 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.477302 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx"] Mar 13 14:14:44 crc kubenswrapper[4898]: E0313 14:14:44.478200 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53fae31e-a97e-443d-88c2-fa38af842855" containerName="util" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.478216 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="53fae31e-a97e-443d-88c2-fa38af842855" containerName="util" Mar 13 14:14:44 crc kubenswrapper[4898]: E0313 14:14:44.478234 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53fae31e-a97e-443d-88c2-fa38af842855" containerName="pull" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.478241 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="53fae31e-a97e-443d-88c2-fa38af842855" containerName="pull" Mar 13 14:14:44 crc kubenswrapper[4898]: E0313 14:14:44.478256 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53fae31e-a97e-443d-88c2-fa38af842855" containerName="extract" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.478266 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="53fae31e-a97e-443d-88c2-fa38af842855" containerName="extract" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.478450 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="53fae31e-a97e-443d-88c2-fa38af842855" containerName="extract" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.479118 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.481800 4898 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.482045 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.482167 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.482715 4898 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-6d6mt" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.485802 4898 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.495419 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx"] Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.617703 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e000d86e-e7a8-49ed-9184-fdd67dfe797d-apiservice-cert\") pod \"metallb-operator-controller-manager-cf7c75c99-qxdbx\" (UID: \"e000d86e-e7a8-49ed-9184-fdd67dfe797d\") " pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.618092 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e000d86e-e7a8-49ed-9184-fdd67dfe797d-webhook-cert\") pod \"metallb-operator-controller-manager-cf7c75c99-qxdbx\" (UID: 
\"e000d86e-e7a8-49ed-9184-fdd67dfe797d\") " pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.618171 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztz9s\" (UniqueName: \"kubernetes.io/projected/e000d86e-e7a8-49ed-9184-fdd67dfe797d-kube-api-access-ztz9s\") pod \"metallb-operator-controller-manager-cf7c75c99-qxdbx\" (UID: \"e000d86e-e7a8-49ed-9184-fdd67dfe797d\") " pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.720120 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e000d86e-e7a8-49ed-9184-fdd67dfe797d-apiservice-cert\") pod \"metallb-operator-controller-manager-cf7c75c99-qxdbx\" (UID: \"e000d86e-e7a8-49ed-9184-fdd67dfe797d\") " pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.720194 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e000d86e-e7a8-49ed-9184-fdd67dfe797d-webhook-cert\") pod \"metallb-operator-controller-manager-cf7c75c99-qxdbx\" (UID: \"e000d86e-e7a8-49ed-9184-fdd67dfe797d\") " pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.720251 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztz9s\" (UniqueName: \"kubernetes.io/projected/e000d86e-e7a8-49ed-9184-fdd67dfe797d-kube-api-access-ztz9s\") pod \"metallb-operator-controller-manager-cf7c75c99-qxdbx\" (UID: \"e000d86e-e7a8-49ed-9184-fdd67dfe797d\") " pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.726157 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e000d86e-e7a8-49ed-9184-fdd67dfe797d-apiservice-cert\") pod \"metallb-operator-controller-manager-cf7c75c99-qxdbx\" (UID: \"e000d86e-e7a8-49ed-9184-fdd67dfe797d\") " pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.726183 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e000d86e-e7a8-49ed-9184-fdd67dfe797d-webhook-cert\") pod \"metallb-operator-controller-manager-cf7c75c99-qxdbx\" (UID: \"e000d86e-e7a8-49ed-9184-fdd67dfe797d\") " pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.756675 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztz9s\" (UniqueName: \"kubernetes.io/projected/e000d86e-e7a8-49ed-9184-fdd67dfe797d-kube-api-access-ztz9s\") pod \"metallb-operator-controller-manager-cf7c75c99-qxdbx\" (UID: \"e000d86e-e7a8-49ed-9184-fdd67dfe797d\") " pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.769409 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw"] Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.770529 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.776406 4898 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.777330 4898 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.777506 4898 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-7mmlg" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.790151 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw"] Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.801496 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.821856 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/34b4f98c-a87c-4a97-9ac4-286afeb9e4bc-apiservice-cert\") pod \"metallb-operator-webhook-server-67c6f6c5cb-d26qw\" (UID: \"34b4f98c-a87c-4a97-9ac4-286afeb9e4bc\") " pod="metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.822014 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgrzc\" (UniqueName: \"kubernetes.io/projected/34b4f98c-a87c-4a97-9ac4-286afeb9e4bc-kube-api-access-tgrzc\") pod \"metallb-operator-webhook-server-67c6f6c5cb-d26qw\" (UID: \"34b4f98c-a87c-4a97-9ac4-286afeb9e4bc\") " pod="metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw" Mar 13 14:14:44 crc kubenswrapper[4898]: 
I0313 14:14:44.822111 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/34b4f98c-a87c-4a97-9ac4-286afeb9e4bc-webhook-cert\") pod \"metallb-operator-webhook-server-67c6f6c5cb-d26qw\" (UID: \"34b4f98c-a87c-4a97-9ac4-286afeb9e4bc\") " pod="metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.923172 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgrzc\" (UniqueName: \"kubernetes.io/projected/34b4f98c-a87c-4a97-9ac4-286afeb9e4bc-kube-api-access-tgrzc\") pod \"metallb-operator-webhook-server-67c6f6c5cb-d26qw\" (UID: \"34b4f98c-a87c-4a97-9ac4-286afeb9e4bc\") " pod="metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.923306 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/34b4f98c-a87c-4a97-9ac4-286afeb9e4bc-webhook-cert\") pod \"metallb-operator-webhook-server-67c6f6c5cb-d26qw\" (UID: \"34b4f98c-a87c-4a97-9ac4-286afeb9e4bc\") " pod="metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.923339 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/34b4f98c-a87c-4a97-9ac4-286afeb9e4bc-apiservice-cert\") pod \"metallb-operator-webhook-server-67c6f6c5cb-d26qw\" (UID: \"34b4f98c-a87c-4a97-9ac4-286afeb9e4bc\") " pod="metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.930618 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/34b4f98c-a87c-4a97-9ac4-286afeb9e4bc-apiservice-cert\") pod 
\"metallb-operator-webhook-server-67c6f6c5cb-d26qw\" (UID: \"34b4f98c-a87c-4a97-9ac4-286afeb9e4bc\") " pod="metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.944010 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/34b4f98c-a87c-4a97-9ac4-286afeb9e4bc-webhook-cert\") pod \"metallb-operator-webhook-server-67c6f6c5cb-d26qw\" (UID: \"34b4f98c-a87c-4a97-9ac4-286afeb9e4bc\") " pod="metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw" Mar 13 14:14:44 crc kubenswrapper[4898]: I0313 14:14:44.949803 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgrzc\" (UniqueName: \"kubernetes.io/projected/34b4f98c-a87c-4a97-9ac4-286afeb9e4bc-kube-api-access-tgrzc\") pod \"metallb-operator-webhook-server-67c6f6c5cb-d26qw\" (UID: \"34b4f98c-a87c-4a97-9ac4-286afeb9e4bc\") " pod="metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw" Mar 13 14:14:45 crc kubenswrapper[4898]: I0313 14:14:45.161663 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw" Mar 13 14:14:45 crc kubenswrapper[4898]: I0313 14:14:45.325053 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx"] Mar 13 14:14:45 crc kubenswrapper[4898]: I0313 14:14:45.567442 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx" event={"ID":"e000d86e-e7a8-49ed-9184-fdd67dfe797d","Type":"ContainerStarted","Data":"df1d17e4129d6fd4edf7571f204b641a8c16e11962457e0be5178efcce112d85"} Mar 13 14:14:45 crc kubenswrapper[4898]: I0313 14:14:45.650593 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw"] Mar 13 14:14:45 crc kubenswrapper[4898]: W0313 14:14:45.659164 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34b4f98c_a87c_4a97_9ac4_286afeb9e4bc.slice/crio-16db7a2ad5ebef1cec02a7e225cd353238ab67016ad829119d268c65e8e9f3f0 WatchSource:0}: Error finding container 16db7a2ad5ebef1cec02a7e225cd353238ab67016ad829119d268c65e8e9f3f0: Status 404 returned error can't find the container with id 16db7a2ad5ebef1cec02a7e225cd353238ab67016ad829119d268c65e8e9f3f0 Mar 13 14:14:46 crc kubenswrapper[4898]: I0313 14:14:46.578767 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw" event={"ID":"34b4f98c-a87c-4a97-9ac4-286afeb9e4bc","Type":"ContainerStarted","Data":"16db7a2ad5ebef1cec02a7e225cd353238ab67016ad829119d268c65e8e9f3f0"} Mar 13 14:14:49 crc kubenswrapper[4898]: I0313 14:14:49.603260 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx" 
event={"ID":"e000d86e-e7a8-49ed-9184-fdd67dfe797d","Type":"ContainerStarted","Data":"b90a231f04c7fa997bdb6c4af06b79391dc9d6f78d2d6a60c3d05ec12274e55f"} Mar 13 14:14:49 crc kubenswrapper[4898]: I0313 14:14:49.604073 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx" Mar 13 14:14:49 crc kubenswrapper[4898]: I0313 14:14:49.636969 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx" podStartSLOduration=2.5032037860000003 podStartE2EDuration="5.636947813s" podCreationTimestamp="2026-03-13 14:14:44 +0000 UTC" firstStartedPulling="2026-03-13 14:14:45.349171288 +0000 UTC m=+1120.350759527" lastFinishedPulling="2026-03-13 14:14:48.482915315 +0000 UTC m=+1123.484503554" observedRunningTime="2026-03-13 14:14:49.623461929 +0000 UTC m=+1124.625050168" watchObservedRunningTime="2026-03-13 14:14:49.636947813 +0000 UTC m=+1124.638536052" Mar 13 14:14:50 crc kubenswrapper[4898]: I0313 14:14:50.612781 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw" event={"ID":"34b4f98c-a87c-4a97-9ac4-286afeb9e4bc","Type":"ContainerStarted","Data":"78965365c8028eb7d18d67270a6ee0613c969f858c38f2b673207ae3e402b3bf"} Mar 13 14:14:50 crc kubenswrapper[4898]: I0313 14:14:50.640508 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw" podStartSLOduration=1.8932910120000002 podStartE2EDuration="6.640487877s" podCreationTimestamp="2026-03-13 14:14:44 +0000 UTC" firstStartedPulling="2026-03-13 14:14:45.661999916 +0000 UTC m=+1120.663588155" lastFinishedPulling="2026-03-13 14:14:50.409196781 +0000 UTC m=+1125.410785020" observedRunningTime="2026-03-13 14:14:50.635012693 +0000 UTC m=+1125.636600962" watchObservedRunningTime="2026-03-13 14:14:50.640487877 +0000 UTC 
m=+1125.642076116" Mar 13 14:14:51 crc kubenswrapper[4898]: I0313 14:14:51.621281 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw" Mar 13 14:15:00 crc kubenswrapper[4898]: I0313 14:15:00.135360 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556855-r5t9l"] Mar 13 14:15:00 crc kubenswrapper[4898]: I0313 14:15:00.137141 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556855-r5t9l" Mar 13 14:15:00 crc kubenswrapper[4898]: I0313 14:15:00.139873 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 14:15:00 crc kubenswrapper[4898]: I0313 14:15:00.141381 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 14:15:00 crc kubenswrapper[4898]: I0313 14:15:00.150397 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556855-r5t9l"] Mar 13 14:15:00 crc kubenswrapper[4898]: I0313 14:15:00.278353 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d9b296c2-5046-40b3-9fca-be350cf5de3e-secret-volume\") pod \"collect-profiles-29556855-r5t9l\" (UID: \"d9b296c2-5046-40b3-9fca-be350cf5de3e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556855-r5t9l" Mar 13 14:15:00 crc kubenswrapper[4898]: I0313 14:15:00.278429 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9b296c2-5046-40b3-9fca-be350cf5de3e-config-volume\") pod \"collect-profiles-29556855-r5t9l\" (UID: 
\"d9b296c2-5046-40b3-9fca-be350cf5de3e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556855-r5t9l" Mar 13 14:15:00 crc kubenswrapper[4898]: I0313 14:15:00.278483 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t2bc\" (UniqueName: \"kubernetes.io/projected/d9b296c2-5046-40b3-9fca-be350cf5de3e-kube-api-access-9t2bc\") pod \"collect-profiles-29556855-r5t9l\" (UID: \"d9b296c2-5046-40b3-9fca-be350cf5de3e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556855-r5t9l" Mar 13 14:15:00 crc kubenswrapper[4898]: I0313 14:15:00.380677 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d9b296c2-5046-40b3-9fca-be350cf5de3e-secret-volume\") pod \"collect-profiles-29556855-r5t9l\" (UID: \"d9b296c2-5046-40b3-9fca-be350cf5de3e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556855-r5t9l" Mar 13 14:15:00 crc kubenswrapper[4898]: I0313 14:15:00.381088 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9b296c2-5046-40b3-9fca-be350cf5de3e-config-volume\") pod \"collect-profiles-29556855-r5t9l\" (UID: \"d9b296c2-5046-40b3-9fca-be350cf5de3e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556855-r5t9l" Mar 13 14:15:00 crc kubenswrapper[4898]: I0313 14:15:00.381155 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t2bc\" (UniqueName: \"kubernetes.io/projected/d9b296c2-5046-40b3-9fca-be350cf5de3e-kube-api-access-9t2bc\") pod \"collect-profiles-29556855-r5t9l\" (UID: \"d9b296c2-5046-40b3-9fca-be350cf5de3e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556855-r5t9l" Mar 13 14:15:00 crc kubenswrapper[4898]: I0313 14:15:00.382249 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9b296c2-5046-40b3-9fca-be350cf5de3e-config-volume\") pod \"collect-profiles-29556855-r5t9l\" (UID: \"d9b296c2-5046-40b3-9fca-be350cf5de3e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556855-r5t9l" Mar 13 14:15:00 crc kubenswrapper[4898]: I0313 14:15:00.387172 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d9b296c2-5046-40b3-9fca-be350cf5de3e-secret-volume\") pod \"collect-profiles-29556855-r5t9l\" (UID: \"d9b296c2-5046-40b3-9fca-be350cf5de3e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556855-r5t9l" Mar 13 14:15:00 crc kubenswrapper[4898]: I0313 14:15:00.400098 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t2bc\" (UniqueName: \"kubernetes.io/projected/d9b296c2-5046-40b3-9fca-be350cf5de3e-kube-api-access-9t2bc\") pod \"collect-profiles-29556855-r5t9l\" (UID: \"d9b296c2-5046-40b3-9fca-be350cf5de3e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556855-r5t9l" Mar 13 14:15:00 crc kubenswrapper[4898]: I0313 14:15:00.456305 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556855-r5t9l" Mar 13 14:15:00 crc kubenswrapper[4898]: I0313 14:15:00.919921 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556855-r5t9l"] Mar 13 14:15:00 crc kubenswrapper[4898]: W0313 14:15:00.923519 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9b296c2_5046_40b3_9fca_be350cf5de3e.slice/crio-d2e883d004cc3649ee14f3853acd916269a70d8ad9d71dfe40bbbccb882dec77 WatchSource:0}: Error finding container d2e883d004cc3649ee14f3853acd916269a70d8ad9d71dfe40bbbccb882dec77: Status 404 returned error can't find the container with id d2e883d004cc3649ee14f3853acd916269a70d8ad9d71dfe40bbbccb882dec77 Mar 13 14:15:01 crc kubenswrapper[4898]: I0313 14:15:01.698493 4898 generic.go:334] "Generic (PLEG): container finished" podID="d9b296c2-5046-40b3-9fca-be350cf5de3e" containerID="183f0c268935ae6820699911fc0be58b4d0e93db5e614c9661b0f4b96dcc6afd" exitCode=0 Mar 13 14:15:01 crc kubenswrapper[4898]: I0313 14:15:01.698606 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556855-r5t9l" event={"ID":"d9b296c2-5046-40b3-9fca-be350cf5de3e","Type":"ContainerDied","Data":"183f0c268935ae6820699911fc0be58b4d0e93db5e614c9661b0f4b96dcc6afd"} Mar 13 14:15:01 crc kubenswrapper[4898]: I0313 14:15:01.698767 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556855-r5t9l" event={"ID":"d9b296c2-5046-40b3-9fca-be350cf5de3e","Type":"ContainerStarted","Data":"d2e883d004cc3649ee14f3853acd916269a70d8ad9d71dfe40bbbccb882dec77"} Mar 13 14:15:03 crc kubenswrapper[4898]: I0313 14:15:03.130318 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556855-r5t9l" Mar 13 14:15:03 crc kubenswrapper[4898]: I0313 14:15:03.230886 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9b296c2-5046-40b3-9fca-be350cf5de3e-config-volume\") pod \"d9b296c2-5046-40b3-9fca-be350cf5de3e\" (UID: \"d9b296c2-5046-40b3-9fca-be350cf5de3e\") " Mar 13 14:15:03 crc kubenswrapper[4898]: I0313 14:15:03.231213 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9t2bc\" (UniqueName: \"kubernetes.io/projected/d9b296c2-5046-40b3-9fca-be350cf5de3e-kube-api-access-9t2bc\") pod \"d9b296c2-5046-40b3-9fca-be350cf5de3e\" (UID: \"d9b296c2-5046-40b3-9fca-be350cf5de3e\") " Mar 13 14:15:03 crc kubenswrapper[4898]: I0313 14:15:03.231345 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d9b296c2-5046-40b3-9fca-be350cf5de3e-secret-volume\") pod \"d9b296c2-5046-40b3-9fca-be350cf5de3e\" (UID: \"d9b296c2-5046-40b3-9fca-be350cf5de3e\") " Mar 13 14:15:03 crc kubenswrapper[4898]: I0313 14:15:03.235426 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9b296c2-5046-40b3-9fca-be350cf5de3e-config-volume" (OuterVolumeSpecName: "config-volume") pod "d9b296c2-5046-40b3-9fca-be350cf5de3e" (UID: "d9b296c2-5046-40b3-9fca-be350cf5de3e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:15:03 crc kubenswrapper[4898]: I0313 14:15:03.239035 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9b296c2-5046-40b3-9fca-be350cf5de3e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d9b296c2-5046-40b3-9fca-be350cf5de3e" (UID: "d9b296c2-5046-40b3-9fca-be350cf5de3e"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:15:03 crc kubenswrapper[4898]: I0313 14:15:03.239110 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9b296c2-5046-40b3-9fca-be350cf5de3e-kube-api-access-9t2bc" (OuterVolumeSpecName: "kube-api-access-9t2bc") pod "d9b296c2-5046-40b3-9fca-be350cf5de3e" (UID: "d9b296c2-5046-40b3-9fca-be350cf5de3e"). InnerVolumeSpecName "kube-api-access-9t2bc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:15:03 crc kubenswrapper[4898]: I0313 14:15:03.333077 4898 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9b296c2-5046-40b3-9fca-be350cf5de3e-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 14:15:03 crc kubenswrapper[4898]: I0313 14:15:03.333112 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9t2bc\" (UniqueName: \"kubernetes.io/projected/d9b296c2-5046-40b3-9fca-be350cf5de3e-kube-api-access-9t2bc\") on node \"crc\" DevicePath \"\"" Mar 13 14:15:03 crc kubenswrapper[4898]: I0313 14:15:03.333124 4898 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d9b296c2-5046-40b3-9fca-be350cf5de3e-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 14:15:03 crc kubenswrapper[4898]: I0313 14:15:03.718833 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556855-r5t9l" event={"ID":"d9b296c2-5046-40b3-9fca-be350cf5de3e","Type":"ContainerDied","Data":"d2e883d004cc3649ee14f3853acd916269a70d8ad9d71dfe40bbbccb882dec77"} Mar 13 14:15:03 crc kubenswrapper[4898]: I0313 14:15:03.718887 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2e883d004cc3649ee14f3853acd916269a70d8ad9d71dfe40bbbccb882dec77" Mar 13 14:15:03 crc kubenswrapper[4898]: I0313 14:15:03.718960 4898 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556855-r5t9l" Mar 13 14:15:05 crc kubenswrapper[4898]: I0313 14:15:05.169556 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw" Mar 13 14:15:19 crc kubenswrapper[4898]: I0313 14:15:19.134501 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:15:19 crc kubenswrapper[4898]: I0313 14:15:19.135184 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:15:24 crc kubenswrapper[4898]: I0313 14:15:24.807572 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.716008 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-bqmxg"] Mar 13 14:15:25 crc kubenswrapper[4898]: E0313 14:15:25.716848 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9b296c2-5046-40b3-9fca-be350cf5de3e" containerName="collect-profiles" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.716868 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9b296c2-5046-40b3-9fca-be350cf5de3e" containerName="collect-profiles" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.717157 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9b296c2-5046-40b3-9fca-be350cf5de3e" 
containerName="collect-profiles" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.723730 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.730868 4898 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-9sgbj" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.731148 4898 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.731396 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.736032 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p4w5"] Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.739254 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p4w5" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.749351 4898 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.772026 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p4w5"] Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.820965 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-g5gqr"] Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.822417 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-g5gqr" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.824715 4898 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-plgcw" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.824768 4898 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.824921 4898 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.825155 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.825872 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-cx422"] Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.827155 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-cx422" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.836731 4898 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.854572 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-cx422"] Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.863854 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1a7fcb96-7168-4049-8c28-d3f740599e48-reloader\") pod \"frr-k8s-bqmxg\" (UID: \"1a7fcb96-7168-4049-8c28-d3f740599e48\") " pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.863936 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1a7fcb96-7168-4049-8c28-d3f740599e48-frr-conf\") pod \"frr-k8s-bqmxg\" (UID: \"1a7fcb96-7168-4049-8c28-d3f740599e48\") " pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.863981 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/edfd91ee-1246-43b2-84a0-95ea069de402-memberlist\") pod \"speaker-g5gqr\" (UID: \"edfd91ee-1246-43b2-84a0-95ea069de402\") " pod="metallb-system/speaker-g5gqr" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.864026 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmcwx\" (UniqueName: \"kubernetes.io/projected/604b9c21-3e85-4c2e-9faf-962f44236911-kube-api-access-pmcwx\") pod \"frr-k8s-webhook-server-bcc4b6f68-5p4w5\" (UID: \"604b9c21-3e85-4c2e-9faf-962f44236911\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p4w5" Mar 13 14:15:25 crc 
kubenswrapper[4898]: I0313 14:15:25.864053 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1a7fcb96-7168-4049-8c28-d3f740599e48-metrics\") pod \"frr-k8s-bqmxg\" (UID: \"1a7fcb96-7168-4049-8c28-d3f740599e48\") " pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.864077 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/edfd91ee-1246-43b2-84a0-95ea069de402-metrics-certs\") pod \"speaker-g5gqr\" (UID: \"edfd91ee-1246-43b2-84a0-95ea069de402\") " pod="metallb-system/speaker-g5gqr" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.864115 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/edfd91ee-1246-43b2-84a0-95ea069de402-metallb-excludel2\") pod \"speaker-g5gqr\" (UID: \"edfd91ee-1246-43b2-84a0-95ea069de402\") " pod="metallb-system/speaker-g5gqr" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.864156 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpg26\" (UniqueName: \"kubernetes.io/projected/edfd91ee-1246-43b2-84a0-95ea069de402-kube-api-access-cpg26\") pod \"speaker-g5gqr\" (UID: \"edfd91ee-1246-43b2-84a0-95ea069de402\") " pod="metallb-system/speaker-g5gqr" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.864195 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1a7fcb96-7168-4049-8c28-d3f740599e48-frr-startup\") pod \"frr-k8s-bqmxg\" (UID: \"1a7fcb96-7168-4049-8c28-d3f740599e48\") " pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.864274 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1a7fcb96-7168-4049-8c28-d3f740599e48-frr-sockets\") pod \"frr-k8s-bqmxg\" (UID: \"1a7fcb96-7168-4049-8c28-d3f740599e48\") " pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.864297 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a7fcb96-7168-4049-8c28-d3f740599e48-metrics-certs\") pod \"frr-k8s-bqmxg\" (UID: \"1a7fcb96-7168-4049-8c28-d3f740599e48\") " pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.864323 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv68p\" (UniqueName: \"kubernetes.io/projected/1a7fcb96-7168-4049-8c28-d3f740599e48-kube-api-access-rv68p\") pod \"frr-k8s-bqmxg\" (UID: \"1a7fcb96-7168-4049-8c28-d3f740599e48\") " pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.864467 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/604b9c21-3e85-4c2e-9faf-962f44236911-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-5p4w5\" (UID: \"604b9c21-3e85-4c2e-9faf-962f44236911\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p4w5" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.965702 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmcwx\" (UniqueName: \"kubernetes.io/projected/604b9c21-3e85-4c2e-9faf-962f44236911-kube-api-access-pmcwx\") pod \"frr-k8s-webhook-server-bcc4b6f68-5p4w5\" (UID: \"604b9c21-3e85-4c2e-9faf-962f44236911\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p4w5" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.965745 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1a7fcb96-7168-4049-8c28-d3f740599e48-metrics\") pod \"frr-k8s-bqmxg\" (UID: \"1a7fcb96-7168-4049-8c28-d3f740599e48\") " pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.965766 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/edfd91ee-1246-43b2-84a0-95ea069de402-metrics-certs\") pod \"speaker-g5gqr\" (UID: \"edfd91ee-1246-43b2-84a0-95ea069de402\") " pod="metallb-system/speaker-g5gqr" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.965795 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/edfd91ee-1246-43b2-84a0-95ea069de402-metallb-excludel2\") pod \"speaker-g5gqr\" (UID: \"edfd91ee-1246-43b2-84a0-95ea069de402\") " pod="metallb-system/speaker-g5gqr" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.965826 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpg26\" (UniqueName: \"kubernetes.io/projected/edfd91ee-1246-43b2-84a0-95ea069de402-kube-api-access-cpg26\") pod \"speaker-g5gqr\" (UID: \"edfd91ee-1246-43b2-84a0-95ea069de402\") " pod="metallb-system/speaker-g5gqr" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.965843 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1a7fcb96-7168-4049-8c28-d3f740599e48-frr-startup\") pod \"frr-k8s-bqmxg\" (UID: \"1a7fcb96-7168-4049-8c28-d3f740599e48\") " pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.965869 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b231c7db-5056-4ec6-a64c-0aa8bdff336b-cert\") pod 
\"controller-7bb4cc7c98-cx422\" (UID: \"b231c7db-5056-4ec6-a64c-0aa8bdff336b\") " pod="metallb-system/controller-7bb4cc7c98-cx422" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.965886 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b231c7db-5056-4ec6-a64c-0aa8bdff336b-metrics-certs\") pod \"controller-7bb4cc7c98-cx422\" (UID: \"b231c7db-5056-4ec6-a64c-0aa8bdff336b\") " pod="metallb-system/controller-7bb4cc7c98-cx422" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.965923 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1a7fcb96-7168-4049-8c28-d3f740599e48-frr-sockets\") pod \"frr-k8s-bqmxg\" (UID: \"1a7fcb96-7168-4049-8c28-d3f740599e48\") " pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.965940 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a7fcb96-7168-4049-8c28-d3f740599e48-metrics-certs\") pod \"frr-k8s-bqmxg\" (UID: \"1a7fcb96-7168-4049-8c28-d3f740599e48\") " pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.965957 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv68p\" (UniqueName: \"kubernetes.io/projected/1a7fcb96-7168-4049-8c28-d3f740599e48-kube-api-access-rv68p\") pod \"frr-k8s-bqmxg\" (UID: \"1a7fcb96-7168-4049-8c28-d3f740599e48\") " pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.965974 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb7vq\" (UniqueName: \"kubernetes.io/projected/b231c7db-5056-4ec6-a64c-0aa8bdff336b-kube-api-access-rb7vq\") pod \"controller-7bb4cc7c98-cx422\" (UID: 
\"b231c7db-5056-4ec6-a64c-0aa8bdff336b\") " pod="metallb-system/controller-7bb4cc7c98-cx422" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.966012 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/604b9c21-3e85-4c2e-9faf-962f44236911-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-5p4w5\" (UID: \"604b9c21-3e85-4c2e-9faf-962f44236911\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p4w5" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.966037 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1a7fcb96-7168-4049-8c28-d3f740599e48-reloader\") pod \"frr-k8s-bqmxg\" (UID: \"1a7fcb96-7168-4049-8c28-d3f740599e48\") " pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.966060 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1a7fcb96-7168-4049-8c28-d3f740599e48-frr-conf\") pod \"frr-k8s-bqmxg\" (UID: \"1a7fcb96-7168-4049-8c28-d3f740599e48\") " pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.966085 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/edfd91ee-1246-43b2-84a0-95ea069de402-memberlist\") pod \"speaker-g5gqr\" (UID: \"edfd91ee-1246-43b2-84a0-95ea069de402\") " pod="metallb-system/speaker-g5gqr" Mar 13 14:15:25 crc kubenswrapper[4898]: E0313 14:15:25.966198 4898 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 13 14:15:25 crc kubenswrapper[4898]: E0313 14:15:25.966237 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edfd91ee-1246-43b2-84a0-95ea069de402-memberlist podName:edfd91ee-1246-43b2-84a0-95ea069de402 nodeName:}" failed. 
No retries permitted until 2026-03-13 14:15:26.466223932 +0000 UTC m=+1161.467812171 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/edfd91ee-1246-43b2-84a0-95ea069de402-memberlist") pod "speaker-g5gqr" (UID: "edfd91ee-1246-43b2-84a0-95ea069de402") : secret "metallb-memberlist" not found Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.966349 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1a7fcb96-7168-4049-8c28-d3f740599e48-metrics\") pod \"frr-k8s-bqmxg\" (UID: \"1a7fcb96-7168-4049-8c28-d3f740599e48\") " pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.966600 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1a7fcb96-7168-4049-8c28-d3f740599e48-reloader\") pod \"frr-k8s-bqmxg\" (UID: \"1a7fcb96-7168-4049-8c28-d3f740599e48\") " pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.966702 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/edfd91ee-1246-43b2-84a0-95ea069de402-metallb-excludel2\") pod \"speaker-g5gqr\" (UID: \"edfd91ee-1246-43b2-84a0-95ea069de402\") " pod="metallb-system/speaker-g5gqr" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.966978 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1a7fcb96-7168-4049-8c28-d3f740599e48-frr-conf\") pod \"frr-k8s-bqmxg\" (UID: \"1a7fcb96-7168-4049-8c28-d3f740599e48\") " pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.967282 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1a7fcb96-7168-4049-8c28-d3f740599e48-frr-startup\") pod 
\"frr-k8s-bqmxg\" (UID: \"1a7fcb96-7168-4049-8c28-d3f740599e48\") " pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.968173 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1a7fcb96-7168-4049-8c28-d3f740599e48-frr-sockets\") pod \"frr-k8s-bqmxg\" (UID: \"1a7fcb96-7168-4049-8c28-d3f740599e48\") " pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.971972 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/edfd91ee-1246-43b2-84a0-95ea069de402-metrics-certs\") pod \"speaker-g5gqr\" (UID: \"edfd91ee-1246-43b2-84a0-95ea069de402\") " pod="metallb-system/speaker-g5gqr" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.974205 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/604b9c21-3e85-4c2e-9faf-962f44236911-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-5p4w5\" (UID: \"604b9c21-3e85-4c2e-9faf-962f44236911\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p4w5" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.977732 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a7fcb96-7168-4049-8c28-d3f740599e48-metrics-certs\") pod \"frr-k8s-bqmxg\" (UID: \"1a7fcb96-7168-4049-8c28-d3f740599e48\") " pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.981867 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv68p\" (UniqueName: \"kubernetes.io/projected/1a7fcb96-7168-4049-8c28-d3f740599e48-kube-api-access-rv68p\") pod \"frr-k8s-bqmxg\" (UID: \"1a7fcb96-7168-4049-8c28-d3f740599e48\") " pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.983361 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpg26\" (UniqueName: \"kubernetes.io/projected/edfd91ee-1246-43b2-84a0-95ea069de402-kube-api-access-cpg26\") pod \"speaker-g5gqr\" (UID: \"edfd91ee-1246-43b2-84a0-95ea069de402\") " pod="metallb-system/speaker-g5gqr" Mar 13 14:15:25 crc kubenswrapper[4898]: I0313 14:15:25.988774 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmcwx\" (UniqueName: \"kubernetes.io/projected/604b9c21-3e85-4c2e-9faf-962f44236911-kube-api-access-pmcwx\") pod \"frr-k8s-webhook-server-bcc4b6f68-5p4w5\" (UID: \"604b9c21-3e85-4c2e-9faf-962f44236911\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p4w5" Mar 13 14:15:26 crc kubenswrapper[4898]: I0313 14:15:26.056431 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:26 crc kubenswrapper[4898]: I0313 14:15:26.067098 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b231c7db-5056-4ec6-a64c-0aa8bdff336b-cert\") pod \"controller-7bb4cc7c98-cx422\" (UID: \"b231c7db-5056-4ec6-a64c-0aa8bdff336b\") " pod="metallb-system/controller-7bb4cc7c98-cx422" Mar 13 14:15:26 crc kubenswrapper[4898]: I0313 14:15:26.067436 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b231c7db-5056-4ec6-a64c-0aa8bdff336b-metrics-certs\") pod \"controller-7bb4cc7c98-cx422\" (UID: \"b231c7db-5056-4ec6-a64c-0aa8bdff336b\") " pod="metallb-system/controller-7bb4cc7c98-cx422" Mar 13 14:15:26 crc kubenswrapper[4898]: I0313 14:15:26.067466 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb7vq\" (UniqueName: \"kubernetes.io/projected/b231c7db-5056-4ec6-a64c-0aa8bdff336b-kube-api-access-rb7vq\") pod \"controller-7bb4cc7c98-cx422\" (UID: 
\"b231c7db-5056-4ec6-a64c-0aa8bdff336b\") " pod="metallb-system/controller-7bb4cc7c98-cx422" Mar 13 14:15:26 crc kubenswrapper[4898]: I0313 14:15:26.068684 4898 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 13 14:15:26 crc kubenswrapper[4898]: I0313 14:15:26.072023 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b231c7db-5056-4ec6-a64c-0aa8bdff336b-metrics-certs\") pod \"controller-7bb4cc7c98-cx422\" (UID: \"b231c7db-5056-4ec6-a64c-0aa8bdff336b\") " pod="metallb-system/controller-7bb4cc7c98-cx422" Mar 13 14:15:26 crc kubenswrapper[4898]: I0313 14:15:26.082746 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b231c7db-5056-4ec6-a64c-0aa8bdff336b-cert\") pod \"controller-7bb4cc7c98-cx422\" (UID: \"b231c7db-5056-4ec6-a64c-0aa8bdff336b\") " pod="metallb-system/controller-7bb4cc7c98-cx422" Mar 13 14:15:26 crc kubenswrapper[4898]: I0313 14:15:26.086319 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb7vq\" (UniqueName: \"kubernetes.io/projected/b231c7db-5056-4ec6-a64c-0aa8bdff336b-kube-api-access-rb7vq\") pod \"controller-7bb4cc7c98-cx422\" (UID: \"b231c7db-5056-4ec6-a64c-0aa8bdff336b\") " pod="metallb-system/controller-7bb4cc7c98-cx422" Mar 13 14:15:26 crc kubenswrapper[4898]: I0313 14:15:26.089344 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p4w5" Mar 13 14:15:26 crc kubenswrapper[4898]: I0313 14:15:26.154964 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-cx422" Mar 13 14:15:26 crc kubenswrapper[4898]: I0313 14:15:26.293425 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 14:15:26 crc kubenswrapper[4898]: I0313 14:15:26.475028 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/edfd91ee-1246-43b2-84a0-95ea069de402-memberlist\") pod \"speaker-g5gqr\" (UID: \"edfd91ee-1246-43b2-84a0-95ea069de402\") " pod="metallb-system/speaker-g5gqr" Mar 13 14:15:26 crc kubenswrapper[4898]: E0313 14:15:26.475242 4898 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 13 14:15:26 crc kubenswrapper[4898]: E0313 14:15:26.475330 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edfd91ee-1246-43b2-84a0-95ea069de402-memberlist podName:edfd91ee-1246-43b2-84a0-95ea069de402 nodeName:}" failed. No retries permitted until 2026-03-13 14:15:27.475312776 +0000 UTC m=+1162.476901015 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/edfd91ee-1246-43b2-84a0-95ea069de402-memberlist") pod "speaker-g5gqr" (UID: "edfd91ee-1246-43b2-84a0-95ea069de402") : secret "metallb-memberlist" not found Mar 13 14:15:26 crc kubenswrapper[4898]: W0313 14:15:26.578631 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod604b9c21_3e85_4c2e_9faf_962f44236911.slice/crio-2b552d780586afb0605f5b69a9f12b031dd06d6cf8adbd55a7e6ebd36992ffa0 WatchSource:0}: Error finding container 2b552d780586afb0605f5b69a9f12b031dd06d6cf8adbd55a7e6ebd36992ffa0: Status 404 returned error can't find the container with id 2b552d780586afb0605f5b69a9f12b031dd06d6cf8adbd55a7e6ebd36992ffa0 Mar 13 14:15:26 crc kubenswrapper[4898]: I0313 14:15:26.584400 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p4w5"] Mar 13 14:15:26 crc kubenswrapper[4898]: I0313 14:15:26.643201 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-cx422"] Mar 13 14:15:26 crc kubenswrapper[4898]: W0313 14:15:26.651603 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb231c7db_5056_4ec6_a64c_0aa8bdff336b.slice/crio-04ffe12997992c8a5b8bb20d4b14e8c303455b2a384628b905b1b8268f449f35 WatchSource:0}: Error finding container 04ffe12997992c8a5b8bb20d4b14e8c303455b2a384628b905b1b8268f449f35: Status 404 returned error can't find the container with id 04ffe12997992c8a5b8bb20d4b14e8c303455b2a384628b905b1b8268f449f35 Mar 13 14:15:26 crc kubenswrapper[4898]: I0313 14:15:26.897912 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p4w5" 
event={"ID":"604b9c21-3e85-4c2e-9faf-962f44236911","Type":"ContainerStarted","Data":"2b552d780586afb0605f5b69a9f12b031dd06d6cf8adbd55a7e6ebd36992ffa0"} Mar 13 14:15:26 crc kubenswrapper[4898]: I0313 14:15:26.898709 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bqmxg" event={"ID":"1a7fcb96-7168-4049-8c28-d3f740599e48","Type":"ContainerStarted","Data":"f6c1181539f490967414705ffcf64a141f8d718f3fbe65ea8c899231bca82f71"} Mar 13 14:15:26 crc kubenswrapper[4898]: I0313 14:15:26.900379 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-cx422" event={"ID":"b231c7db-5056-4ec6-a64c-0aa8bdff336b","Type":"ContainerStarted","Data":"8d7d216fe78f04af98d68f17acb880630f795a36fde0f4603b51086769954d5b"} Mar 13 14:15:26 crc kubenswrapper[4898]: I0313 14:15:26.900435 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-cx422" event={"ID":"b231c7db-5056-4ec6-a64c-0aa8bdff336b","Type":"ContainerStarted","Data":"50c031524bb2bf8d6fa08ff899e9bd2f79f477e7c3fd71037c11b22610fc948c"} Mar 13 14:15:26 crc kubenswrapper[4898]: I0313 14:15:26.900450 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-cx422" event={"ID":"b231c7db-5056-4ec6-a64c-0aa8bdff336b","Type":"ContainerStarted","Data":"04ffe12997992c8a5b8bb20d4b14e8c303455b2a384628b905b1b8268f449f35"} Mar 13 14:15:26 crc kubenswrapper[4898]: I0313 14:15:26.900533 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-cx422" Mar 13 14:15:26 crc kubenswrapper[4898]: I0313 14:15:26.922593 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-cx422" podStartSLOduration=1.9225746959999999 podStartE2EDuration="1.922574696s" podCreationTimestamp="2026-03-13 14:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-03-13 14:15:26.922057713 +0000 UTC m=+1161.923645992" watchObservedRunningTime="2026-03-13 14:15:26.922574696 +0000 UTC m=+1161.924162935" Mar 13 14:15:27 crc kubenswrapper[4898]: I0313 14:15:27.493091 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/edfd91ee-1246-43b2-84a0-95ea069de402-memberlist\") pod \"speaker-g5gqr\" (UID: \"edfd91ee-1246-43b2-84a0-95ea069de402\") " pod="metallb-system/speaker-g5gqr" Mar 13 14:15:27 crc kubenswrapper[4898]: I0313 14:15:27.498381 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/edfd91ee-1246-43b2-84a0-95ea069de402-memberlist\") pod \"speaker-g5gqr\" (UID: \"edfd91ee-1246-43b2-84a0-95ea069de402\") " pod="metallb-system/speaker-g5gqr" Mar 13 14:15:27 crc kubenswrapper[4898]: I0313 14:15:27.641510 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-g5gqr" Mar 13 14:15:27 crc kubenswrapper[4898]: W0313 14:15:27.680113 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedfd91ee_1246_43b2_84a0_95ea069de402.slice/crio-83e4c4b5eb654c2ea9fe0d7dd5c89cc6ac265f4391f16fb24a87c7d392fac38a WatchSource:0}: Error finding container 83e4c4b5eb654c2ea9fe0d7dd5c89cc6ac265f4391f16fb24a87c7d392fac38a: Status 404 returned error can't find the container with id 83e4c4b5eb654c2ea9fe0d7dd5c89cc6ac265f4391f16fb24a87c7d392fac38a Mar 13 14:15:27 crc kubenswrapper[4898]: I0313 14:15:27.909258 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-g5gqr" event={"ID":"edfd91ee-1246-43b2-84a0-95ea069de402","Type":"ContainerStarted","Data":"83e4c4b5eb654c2ea9fe0d7dd5c89cc6ac265f4391f16fb24a87c7d392fac38a"} Mar 13 14:15:28 crc kubenswrapper[4898]: I0313 14:15:28.926252 4898 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="metallb-system/speaker-g5gqr" event={"ID":"edfd91ee-1246-43b2-84a0-95ea069de402","Type":"ContainerStarted","Data":"a39b33f1d3a64233bdc731a83d8fc40daebfa1230066627092def13b04432916"} Mar 13 14:15:28 crc kubenswrapper[4898]: I0313 14:15:28.926300 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-g5gqr" event={"ID":"edfd91ee-1246-43b2-84a0-95ea069de402","Type":"ContainerStarted","Data":"0a6c68b1de1e0d7624616ab17ffb737c14c0199744057f8f2a77ece2db6660a6"} Mar 13 14:15:28 crc kubenswrapper[4898]: I0313 14:15:28.926450 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-g5gqr" Mar 13 14:15:28 crc kubenswrapper[4898]: I0313 14:15:28.946141 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-g5gqr" podStartSLOduration=3.946115828 podStartE2EDuration="3.946115828s" podCreationTimestamp="2026-03-13 14:15:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:15:28.940892331 +0000 UTC m=+1163.942480580" watchObservedRunningTime="2026-03-13 14:15:28.946115828 +0000 UTC m=+1163.947704067" Mar 13 14:15:34 crc kubenswrapper[4898]: I0313 14:15:34.976562 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p4w5" event={"ID":"604b9c21-3e85-4c2e-9faf-962f44236911","Type":"ContainerStarted","Data":"c2e042db91577156269309aca160b7b8767ed8c465b79f1539698cd7a6652d49"} Mar 13 14:15:34 crc kubenswrapper[4898]: I0313 14:15:34.977865 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p4w5" Mar 13 14:15:34 crc kubenswrapper[4898]: I0313 14:15:34.978809 4898 generic.go:334] "Generic (PLEG): container finished" podID="1a7fcb96-7168-4049-8c28-d3f740599e48" 
containerID="475580e194896bcb384f743508d0bda7bb9c9de88dbe8f57a8106e5e13db5a02" exitCode=0 Mar 13 14:15:34 crc kubenswrapper[4898]: I0313 14:15:34.978883 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bqmxg" event={"ID":"1a7fcb96-7168-4049-8c28-d3f740599e48","Type":"ContainerDied","Data":"475580e194896bcb384f743508d0bda7bb9c9de88dbe8f57a8106e5e13db5a02"} Mar 13 14:15:35 crc kubenswrapper[4898]: I0313 14:15:35.005633 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p4w5" podStartSLOduration=2.381998356 podStartE2EDuration="10.005614839s" podCreationTimestamp="2026-03-13 14:15:25 +0000 UTC" firstStartedPulling="2026-03-13 14:15:26.580664654 +0000 UTC m=+1161.582252883" lastFinishedPulling="2026-03-13 14:15:34.204281127 +0000 UTC m=+1169.205869366" observedRunningTime="2026-03-13 14:15:35.003035772 +0000 UTC m=+1170.004624021" watchObservedRunningTime="2026-03-13 14:15:35.005614839 +0000 UTC m=+1170.007203088" Mar 13 14:15:35 crc kubenswrapper[4898]: I0313 14:15:35.993223 4898 generic.go:334] "Generic (PLEG): container finished" podID="1a7fcb96-7168-4049-8c28-d3f740599e48" containerID="479e8d77c1638a8eb14315977e43a32b6023d8fdd6daad9a116915f891125d98" exitCode=0 Mar 13 14:15:35 crc kubenswrapper[4898]: I0313 14:15:35.993290 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bqmxg" event={"ID":"1a7fcb96-7168-4049-8c28-d3f740599e48","Type":"ContainerDied","Data":"479e8d77c1638a8eb14315977e43a32b6023d8fdd6daad9a116915f891125d98"} Mar 13 14:15:36 crc kubenswrapper[4898]: I0313 14:15:36.162403 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-cx422" Mar 13 14:15:37 crc kubenswrapper[4898]: I0313 14:15:37.006050 4898 generic.go:334] "Generic (PLEG): container finished" podID="1a7fcb96-7168-4049-8c28-d3f740599e48" 
containerID="577e02f1140114d41ba083f8d1376f0da51fcfc7633acb7ea7637c8fd8269feb" exitCode=0 Mar 13 14:15:37 crc kubenswrapper[4898]: I0313 14:15:37.006113 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bqmxg" event={"ID":"1a7fcb96-7168-4049-8c28-d3f740599e48","Type":"ContainerDied","Data":"577e02f1140114d41ba083f8d1376f0da51fcfc7633acb7ea7637c8fd8269feb"} Mar 13 14:15:37 crc kubenswrapper[4898]: I0313 14:15:37.645373 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-g5gqr" Mar 13 14:15:38 crc kubenswrapper[4898]: I0313 14:15:38.017869 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bqmxg" event={"ID":"1a7fcb96-7168-4049-8c28-d3f740599e48","Type":"ContainerStarted","Data":"de801e5b6035a3dd9252ae2d1f506f270dfe0c2552670023b6eac179c4aedccb"} Mar 13 14:15:38 crc kubenswrapper[4898]: I0313 14:15:38.017937 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bqmxg" event={"ID":"1a7fcb96-7168-4049-8c28-d3f740599e48","Type":"ContainerStarted","Data":"7ef81ddc4c5349a073203bfca8c071431a5874e9b13e0ebd50b075b4cfc8ae59"} Mar 13 14:15:38 crc kubenswrapper[4898]: I0313 14:15:38.017945 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bqmxg" event={"ID":"1a7fcb96-7168-4049-8c28-d3f740599e48","Type":"ContainerStarted","Data":"7be51b5ddbec2ffddde6f1aa02f465de8d01bcf3f148bc9e5694be9c2d0a3885"} Mar 13 14:15:38 crc kubenswrapper[4898]: I0313 14:15:38.017970 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bqmxg" event={"ID":"1a7fcb96-7168-4049-8c28-d3f740599e48","Type":"ContainerStarted","Data":"bea18a0b4bb5bbad650ae6e8e67ba0820f4376de6521e05324b889bd9cd86809"} Mar 13 14:15:38 crc kubenswrapper[4898]: I0313 14:15:38.017979 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bqmxg" 
event={"ID":"1a7fcb96-7168-4049-8c28-d3f740599e48","Type":"ContainerStarted","Data":"5caa38ab91ff3bb9407ccdc085db3c665cadd77289d8c9655d76db3e95c56dc9"} Mar 13 14:15:39 crc kubenswrapper[4898]: I0313 14:15:39.031642 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bqmxg" event={"ID":"1a7fcb96-7168-4049-8c28-d3f740599e48","Type":"ContainerStarted","Data":"eb8a82178822c3aac08c0234f2ad15766fba6323092ad5172a80486bc675d301"} Mar 13 14:15:39 crc kubenswrapper[4898]: I0313 14:15:39.032107 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:39 crc kubenswrapper[4898]: I0313 14:15:39.066281 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-bqmxg" podStartSLOduration=6.204330854 podStartE2EDuration="14.066258137s" podCreationTimestamp="2026-03-13 14:15:25 +0000 UTC" firstStartedPulling="2026-03-13 14:15:26.29312156 +0000 UTC m=+1161.294709799" lastFinishedPulling="2026-03-13 14:15:34.155048843 +0000 UTC m=+1169.156637082" observedRunningTime="2026-03-13 14:15:39.064412979 +0000 UTC m=+1174.066001248" watchObservedRunningTime="2026-03-13 14:15:39.066258137 +0000 UTC m=+1174.067846396" Mar 13 14:15:40 crc kubenswrapper[4898]: I0313 14:15:40.830640 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-cw75t"] Mar 13 14:15:40 crc kubenswrapper[4898]: I0313 14:15:40.832022 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-cw75t" Mar 13 14:15:40 crc kubenswrapper[4898]: I0313 14:15:40.836410 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 13 14:15:40 crc kubenswrapper[4898]: I0313 14:15:40.837948 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-cw75t"] Mar 13 14:15:40 crc kubenswrapper[4898]: I0313 14:15:40.838948 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-bp8l6" Mar 13 14:15:40 crc kubenswrapper[4898]: I0313 14:15:40.839258 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 13 14:15:40 crc kubenswrapper[4898]: I0313 14:15:40.934000 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znkw5\" (UniqueName: \"kubernetes.io/projected/a3872ed6-e59e-42fe-a774-c457f7118f65-kube-api-access-znkw5\") pod \"openstack-operator-index-cw75t\" (UID: \"a3872ed6-e59e-42fe-a774-c457f7118f65\") " pod="openstack-operators/openstack-operator-index-cw75t" Mar 13 14:15:41 crc kubenswrapper[4898]: I0313 14:15:41.036185 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znkw5\" (UniqueName: \"kubernetes.io/projected/a3872ed6-e59e-42fe-a774-c457f7118f65-kube-api-access-znkw5\") pod \"openstack-operator-index-cw75t\" (UID: \"a3872ed6-e59e-42fe-a774-c457f7118f65\") " pod="openstack-operators/openstack-operator-index-cw75t" Mar 13 14:15:41 crc kubenswrapper[4898]: I0313 14:15:41.057107 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:41 crc kubenswrapper[4898]: I0313 14:15:41.077361 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znkw5\" 
(UniqueName: \"kubernetes.io/projected/a3872ed6-e59e-42fe-a774-c457f7118f65-kube-api-access-znkw5\") pod \"openstack-operator-index-cw75t\" (UID: \"a3872ed6-e59e-42fe-a774-c457f7118f65\") " pod="openstack-operators/openstack-operator-index-cw75t" Mar 13 14:15:41 crc kubenswrapper[4898]: I0313 14:15:41.106589 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:15:41 crc kubenswrapper[4898]: I0313 14:15:41.170345 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-cw75t" Mar 13 14:15:41 crc kubenswrapper[4898]: I0313 14:15:41.612184 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-cw75t"] Mar 13 14:15:42 crc kubenswrapper[4898]: I0313 14:15:42.055799 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-cw75t" event={"ID":"a3872ed6-e59e-42fe-a774-c457f7118f65","Type":"ContainerStarted","Data":"d1eb15a6f8d2097c70c293cd2b58d045d8fb2028cb2810ed0aa67bff167518c8"} Mar 13 14:15:44 crc kubenswrapper[4898]: I0313 14:15:44.198112 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-cw75t"] Mar 13 14:15:44 crc kubenswrapper[4898]: I0313 14:15:44.816466 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-9k7p6"] Mar 13 14:15:44 crc kubenswrapper[4898]: I0313 14:15:44.818979 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-9k7p6" Mar 13 14:15:44 crc kubenswrapper[4898]: I0313 14:15:44.823044 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-9k7p6"] Mar 13 14:15:45 crc kubenswrapper[4898]: I0313 14:15:45.012624 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7pvs\" (UniqueName: \"kubernetes.io/projected/478795f5-c2f6-4e9b-9ed6-e2c743c3f3b8-kube-api-access-v7pvs\") pod \"openstack-operator-index-9k7p6\" (UID: \"478795f5-c2f6-4e9b-9ed6-e2c743c3f3b8\") " pod="openstack-operators/openstack-operator-index-9k7p6" Mar 13 14:15:45 crc kubenswrapper[4898]: I0313 14:15:45.114799 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7pvs\" (UniqueName: \"kubernetes.io/projected/478795f5-c2f6-4e9b-9ed6-e2c743c3f3b8-kube-api-access-v7pvs\") pod \"openstack-operator-index-9k7p6\" (UID: \"478795f5-c2f6-4e9b-9ed6-e2c743c3f3b8\") " pod="openstack-operators/openstack-operator-index-9k7p6" Mar 13 14:15:45 crc kubenswrapper[4898]: I0313 14:15:45.132973 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7pvs\" (UniqueName: \"kubernetes.io/projected/478795f5-c2f6-4e9b-9ed6-e2c743c3f3b8-kube-api-access-v7pvs\") pod \"openstack-operator-index-9k7p6\" (UID: \"478795f5-c2f6-4e9b-9ed6-e2c743c3f3b8\") " pod="openstack-operators/openstack-operator-index-9k7p6" Mar 13 14:15:45 crc kubenswrapper[4898]: I0313 14:15:45.151579 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-9k7p6" Mar 13 14:15:45 crc kubenswrapper[4898]: I0313 14:15:45.599206 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-9k7p6"] Mar 13 14:15:45 crc kubenswrapper[4898]: W0313 14:15:45.792737 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod478795f5_c2f6_4e9b_9ed6_e2c743c3f3b8.slice/crio-563f2abeaf9a8290bd4eaa6a51c29469074d221de340d7876336cf384f67b85c WatchSource:0}: Error finding container 563f2abeaf9a8290bd4eaa6a51c29469074d221de340d7876336cf384f67b85c: Status 404 returned error can't find the container with id 563f2abeaf9a8290bd4eaa6a51c29469074d221de340d7876336cf384f67b85c Mar 13 14:15:46 crc kubenswrapper[4898]: I0313 14:15:46.095008 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9k7p6" event={"ID":"478795f5-c2f6-4e9b-9ed6-e2c743c3f3b8","Type":"ContainerStarted","Data":"563f2abeaf9a8290bd4eaa6a51c29469074d221de340d7876336cf384f67b85c"} Mar 13 14:15:46 crc kubenswrapper[4898]: I0313 14:15:46.095499 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p4w5" Mar 13 14:15:49 crc kubenswrapper[4898]: I0313 14:15:49.134666 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:15:49 crc kubenswrapper[4898]: I0313 14:15:49.135534 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:15:51 crc kubenswrapper[4898]: I0313 14:15:51.134147 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-9k7p6" event={"ID":"478795f5-c2f6-4e9b-9ed6-e2c743c3f3b8","Type":"ContainerStarted","Data":"9d49c37f8ba27635a927bfd693d7763d0844726f817979fc7574c7aec133f0d7"} Mar 13 14:15:51 crc kubenswrapper[4898]: I0313 14:15:51.136090 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-cw75t" event={"ID":"a3872ed6-e59e-42fe-a774-c457f7118f65","Type":"ContainerStarted","Data":"f9a5c15b28601168b14e254be6882840ca06abb1d99a713f2c511c51886a068c"} Mar 13 14:15:51 crc kubenswrapper[4898]: I0313 14:15:51.136169 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-cw75t" podUID="a3872ed6-e59e-42fe-a774-c457f7118f65" containerName="registry-server" containerID="cri-o://f9a5c15b28601168b14e254be6882840ca06abb1d99a713f2c511c51886a068c" gracePeriod=2 Mar 13 14:15:51 crc kubenswrapper[4898]: I0313 14:15:51.152370 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-9k7p6" podStartSLOduration=2.993428846 podStartE2EDuration="7.152353057s" podCreationTimestamp="2026-03-13 14:15:44 +0000 UTC" firstStartedPulling="2026-03-13 14:15:45.795032638 +0000 UTC m=+1180.796620877" lastFinishedPulling="2026-03-13 14:15:49.953956849 +0000 UTC m=+1184.955545088" observedRunningTime="2026-03-13 14:15:51.145751965 +0000 UTC m=+1186.147340224" watchObservedRunningTime="2026-03-13 14:15:51.152353057 +0000 UTC m=+1186.153941296" Mar 13 14:15:51 crc kubenswrapper[4898]: I0313 14:15:51.164607 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-cw75t" podStartSLOduration=2.839843315 
podStartE2EDuration="11.164591756s" podCreationTimestamp="2026-03-13 14:15:40 +0000 UTC" firstStartedPulling="2026-03-13 14:15:41.622598656 +0000 UTC m=+1176.624186895" lastFinishedPulling="2026-03-13 14:15:49.947347077 +0000 UTC m=+1184.948935336" observedRunningTime="2026-03-13 14:15:51.162111842 +0000 UTC m=+1186.163700101" watchObservedRunningTime="2026-03-13 14:15:51.164591756 +0000 UTC m=+1186.166179995" Mar 13 14:15:51 crc kubenswrapper[4898]: I0313 14:15:51.170407 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-cw75t" Mar 13 14:15:51 crc kubenswrapper[4898]: I0313 14:15:51.556841 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-cw75t" Mar 13 14:15:51 crc kubenswrapper[4898]: I0313 14:15:51.651562 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znkw5\" (UniqueName: \"kubernetes.io/projected/a3872ed6-e59e-42fe-a774-c457f7118f65-kube-api-access-znkw5\") pod \"a3872ed6-e59e-42fe-a774-c457f7118f65\" (UID: \"a3872ed6-e59e-42fe-a774-c457f7118f65\") " Mar 13 14:15:51 crc kubenswrapper[4898]: I0313 14:15:51.657547 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3872ed6-e59e-42fe-a774-c457f7118f65-kube-api-access-znkw5" (OuterVolumeSpecName: "kube-api-access-znkw5") pod "a3872ed6-e59e-42fe-a774-c457f7118f65" (UID: "a3872ed6-e59e-42fe-a774-c457f7118f65"). InnerVolumeSpecName "kube-api-access-znkw5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:15:51 crc kubenswrapper[4898]: I0313 14:15:51.753267 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znkw5\" (UniqueName: \"kubernetes.io/projected/a3872ed6-e59e-42fe-a774-c457f7118f65-kube-api-access-znkw5\") on node \"crc\" DevicePath \"\"" Mar 13 14:15:52 crc kubenswrapper[4898]: I0313 14:15:52.153949 4898 generic.go:334] "Generic (PLEG): container finished" podID="a3872ed6-e59e-42fe-a774-c457f7118f65" containerID="f9a5c15b28601168b14e254be6882840ca06abb1d99a713f2c511c51886a068c" exitCode=0 Mar 13 14:15:52 crc kubenswrapper[4898]: I0313 14:15:52.154025 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-cw75t" Mar 13 14:15:52 crc kubenswrapper[4898]: I0313 14:15:52.154039 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-cw75t" event={"ID":"a3872ed6-e59e-42fe-a774-c457f7118f65","Type":"ContainerDied","Data":"f9a5c15b28601168b14e254be6882840ca06abb1d99a713f2c511c51886a068c"} Mar 13 14:15:52 crc kubenswrapper[4898]: I0313 14:15:52.154124 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-cw75t" event={"ID":"a3872ed6-e59e-42fe-a774-c457f7118f65","Type":"ContainerDied","Data":"d1eb15a6f8d2097c70c293cd2b58d045d8fb2028cb2810ed0aa67bff167518c8"} Mar 13 14:15:52 crc kubenswrapper[4898]: I0313 14:15:52.154155 4898 scope.go:117] "RemoveContainer" containerID="f9a5c15b28601168b14e254be6882840ca06abb1d99a713f2c511c51886a068c" Mar 13 14:15:52 crc kubenswrapper[4898]: I0313 14:15:52.177789 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-cw75t"] Mar 13 14:15:52 crc kubenswrapper[4898]: I0313 14:15:52.179767 4898 scope.go:117] "RemoveContainer" containerID="f9a5c15b28601168b14e254be6882840ca06abb1d99a713f2c511c51886a068c" Mar 13 14:15:52 crc 
kubenswrapper[4898]: E0313 14:15:52.180112 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9a5c15b28601168b14e254be6882840ca06abb1d99a713f2c511c51886a068c\": container with ID starting with f9a5c15b28601168b14e254be6882840ca06abb1d99a713f2c511c51886a068c not found: ID does not exist" containerID="f9a5c15b28601168b14e254be6882840ca06abb1d99a713f2c511c51886a068c" Mar 13 14:15:52 crc kubenswrapper[4898]: I0313 14:15:52.180139 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9a5c15b28601168b14e254be6882840ca06abb1d99a713f2c511c51886a068c"} err="failed to get container status \"f9a5c15b28601168b14e254be6882840ca06abb1d99a713f2c511c51886a068c\": rpc error: code = NotFound desc = could not find container \"f9a5c15b28601168b14e254be6882840ca06abb1d99a713f2c511c51886a068c\": container with ID starting with f9a5c15b28601168b14e254be6882840ca06abb1d99a713f2c511c51886a068c not found: ID does not exist" Mar 13 14:15:52 crc kubenswrapper[4898]: I0313 14:15:52.183564 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-cw75t"] Mar 13 14:15:53 crc kubenswrapper[4898]: I0313 14:15:53.756046 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3872ed6-e59e-42fe-a774-c457f7118f65" path="/var/lib/kubelet/pods/a3872ed6-e59e-42fe-a774-c457f7118f65/volumes" Mar 13 14:15:55 crc kubenswrapper[4898]: I0313 14:15:55.152520 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-9k7p6" Mar 13 14:15:55 crc kubenswrapper[4898]: I0313 14:15:55.152596 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-9k7p6" Mar 13 14:15:55 crc kubenswrapper[4898]: I0313 14:15:55.208368 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack-operators/openstack-operator-index-9k7p6" Mar 13 14:15:55 crc kubenswrapper[4898]: I0313 14:15:55.239677 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-9k7p6" Mar 13 14:15:56 crc kubenswrapper[4898]: I0313 14:15:56.060250 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-bqmxg" Mar 13 14:16:00 crc kubenswrapper[4898]: I0313 14:16:00.128331 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556856-z8p4g"] Mar 13 14:16:00 crc kubenswrapper[4898]: E0313 14:16:00.129281 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3872ed6-e59e-42fe-a774-c457f7118f65" containerName="registry-server" Mar 13 14:16:00 crc kubenswrapper[4898]: I0313 14:16:00.129296 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3872ed6-e59e-42fe-a774-c457f7118f65" containerName="registry-server" Mar 13 14:16:00 crc kubenswrapper[4898]: I0313 14:16:00.129469 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3872ed6-e59e-42fe-a774-c457f7118f65" containerName="registry-server" Mar 13 14:16:00 crc kubenswrapper[4898]: I0313 14:16:00.130069 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556856-z8p4g" Mar 13 14:16:00 crc kubenswrapper[4898]: I0313 14:16:00.132314 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:16:00 crc kubenswrapper[4898]: I0313 14:16:00.132317 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:16:00 crc kubenswrapper[4898]: I0313 14:16:00.132439 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:16:00 crc kubenswrapper[4898]: I0313 14:16:00.135858 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556856-z8p4g"] Mar 13 14:16:00 crc kubenswrapper[4898]: I0313 14:16:00.187923 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wclsp\" (UniqueName: \"kubernetes.io/projected/5b81468c-e1ac-4515-837d-993e3c5108c9-kube-api-access-wclsp\") pod \"auto-csr-approver-29556856-z8p4g\" (UID: \"5b81468c-e1ac-4515-837d-993e3c5108c9\") " pod="openshift-infra/auto-csr-approver-29556856-z8p4g" Mar 13 14:16:00 crc kubenswrapper[4898]: I0313 14:16:00.289490 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wclsp\" (UniqueName: \"kubernetes.io/projected/5b81468c-e1ac-4515-837d-993e3c5108c9-kube-api-access-wclsp\") pod \"auto-csr-approver-29556856-z8p4g\" (UID: \"5b81468c-e1ac-4515-837d-993e3c5108c9\") " pod="openshift-infra/auto-csr-approver-29556856-z8p4g" Mar 13 14:16:00 crc kubenswrapper[4898]: I0313 14:16:00.315343 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wclsp\" (UniqueName: \"kubernetes.io/projected/5b81468c-e1ac-4515-837d-993e3c5108c9-kube-api-access-wclsp\") pod \"auto-csr-approver-29556856-z8p4g\" (UID: \"5b81468c-e1ac-4515-837d-993e3c5108c9\") " 
pod="openshift-infra/auto-csr-approver-29556856-z8p4g" Mar 13 14:16:00 crc kubenswrapper[4898]: I0313 14:16:00.462919 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556856-z8p4g" Mar 13 14:16:00 crc kubenswrapper[4898]: I0313 14:16:00.928230 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556856-z8p4g"] Mar 13 14:16:00 crc kubenswrapper[4898]: W0313 14:16:00.931618 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b81468c_e1ac_4515_837d_993e3c5108c9.slice/crio-7e7aa59f8c94fe7e99e47923bf7d8d3fd5b746af74ca704c75a198f4d8a65fd6 WatchSource:0}: Error finding container 7e7aa59f8c94fe7e99e47923bf7d8d3fd5b746af74ca704c75a198f4d8a65fd6: Status 404 returned error can't find the container with id 7e7aa59f8c94fe7e99e47923bf7d8d3fd5b746af74ca704c75a198f4d8a65fd6 Mar 13 14:16:01 crc kubenswrapper[4898]: I0313 14:16:01.220337 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556856-z8p4g" event={"ID":"5b81468c-e1ac-4515-837d-993e3c5108c9","Type":"ContainerStarted","Data":"7e7aa59f8c94fe7e99e47923bf7d8d3fd5b746af74ca704c75a198f4d8a65fd6"} Mar 13 14:16:02 crc kubenswrapper[4898]: I0313 14:16:02.230726 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556856-z8p4g" event={"ID":"5b81468c-e1ac-4515-837d-993e3c5108c9","Type":"ContainerStarted","Data":"309417dd12bdceaad7cc8574de946b3ecc5729e4fa9390a27c026042338454ac"} Mar 13 14:16:02 crc kubenswrapper[4898]: I0313 14:16:02.243128 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556856-z8p4g" podStartSLOduration=1.205896884 podStartE2EDuration="2.243110647s" podCreationTimestamp="2026-03-13 14:16:00 +0000 UTC" firstStartedPulling="2026-03-13 14:16:00.933726654 +0000 UTC 
m=+1195.935314893" lastFinishedPulling="2026-03-13 14:16:01.970940417 +0000 UTC m=+1196.972528656" observedRunningTime="2026-03-13 14:16:02.241633748 +0000 UTC m=+1197.243221987" watchObservedRunningTime="2026-03-13 14:16:02.243110647 +0000 UTC m=+1197.244698896" Mar 13 14:16:03 crc kubenswrapper[4898]: I0313 14:16:03.241297 4898 generic.go:334] "Generic (PLEG): container finished" podID="5b81468c-e1ac-4515-837d-993e3c5108c9" containerID="309417dd12bdceaad7cc8574de946b3ecc5729e4fa9390a27c026042338454ac" exitCode=0 Mar 13 14:16:03 crc kubenswrapper[4898]: I0313 14:16:03.241412 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556856-z8p4g" event={"ID":"5b81468c-e1ac-4515-837d-993e3c5108c9","Type":"ContainerDied","Data":"309417dd12bdceaad7cc8574de946b3ecc5729e4fa9390a27c026042338454ac"} Mar 13 14:16:04 crc kubenswrapper[4898]: I0313 14:16:04.629169 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556856-z8p4g" Mar 13 14:16:04 crc kubenswrapper[4898]: I0313 14:16:04.662392 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wclsp\" (UniqueName: \"kubernetes.io/projected/5b81468c-e1ac-4515-837d-993e3c5108c9-kube-api-access-wclsp\") pod \"5b81468c-e1ac-4515-837d-993e3c5108c9\" (UID: \"5b81468c-e1ac-4515-837d-993e3c5108c9\") " Mar 13 14:16:04 crc kubenswrapper[4898]: I0313 14:16:04.668772 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b81468c-e1ac-4515-837d-993e3c5108c9-kube-api-access-wclsp" (OuterVolumeSpecName: "kube-api-access-wclsp") pod "5b81468c-e1ac-4515-837d-993e3c5108c9" (UID: "5b81468c-e1ac-4515-837d-993e3c5108c9"). InnerVolumeSpecName "kube-api-access-wclsp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:16:04 crc kubenswrapper[4898]: I0313 14:16:04.765501 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wclsp\" (UniqueName: \"kubernetes.io/projected/5b81468c-e1ac-4515-837d-993e3c5108c9-kube-api-access-wclsp\") on node \"crc\" DevicePath \"\"" Mar 13 14:16:05 crc kubenswrapper[4898]: I0313 14:16:05.260773 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556856-z8p4g" event={"ID":"5b81468c-e1ac-4515-837d-993e3c5108c9","Type":"ContainerDied","Data":"7e7aa59f8c94fe7e99e47923bf7d8d3fd5b746af74ca704c75a198f4d8a65fd6"} Mar 13 14:16:05 crc kubenswrapper[4898]: I0313 14:16:05.261149 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e7aa59f8c94fe7e99e47923bf7d8d3fd5b746af74ca704c75a198f4d8a65fd6" Mar 13 14:16:05 crc kubenswrapper[4898]: I0313 14:16:05.260860 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556856-z8p4g" Mar 13 14:16:05 crc kubenswrapper[4898]: I0313 14:16:05.298595 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556850-n2t8z"] Mar 13 14:16:05 crc kubenswrapper[4898]: I0313 14:16:05.304050 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556850-n2t8z"] Mar 13 14:16:05 crc kubenswrapper[4898]: I0313 14:16:05.753453 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a05c0334-f9cf-4640-a763-6d77b983193c" path="/var/lib/kubelet/pods/a05c0334-f9cf-4640-a763-6d77b983193c/volumes" Mar 13 14:16:11 crc kubenswrapper[4898]: I0313 14:16:11.291615 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57"] Mar 13 14:16:11 crc kubenswrapper[4898]: E0313 14:16:11.292630 4898 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="5b81468c-e1ac-4515-837d-993e3c5108c9" containerName="oc" Mar 13 14:16:11 crc kubenswrapper[4898]: I0313 14:16:11.292646 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b81468c-e1ac-4515-837d-993e3c5108c9" containerName="oc" Mar 13 14:16:11 crc kubenswrapper[4898]: I0313 14:16:11.292852 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b81468c-e1ac-4515-837d-993e3c5108c9" containerName="oc" Mar 13 14:16:11 crc kubenswrapper[4898]: I0313 14:16:11.294314 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57" Mar 13 14:16:11 crc kubenswrapper[4898]: I0313 14:16:11.297503 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-6x4vb" Mar 13 14:16:11 crc kubenswrapper[4898]: I0313 14:16:11.320068 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57"] Mar 13 14:16:11 crc kubenswrapper[4898]: I0313 14:16:11.408399 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a57932fc-ce83-4258-95a8-65f29c0cfd5a-util\") pod \"fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57\" (UID: \"a57932fc-ce83-4258-95a8-65f29c0cfd5a\") " pod="openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57" Mar 13 14:16:11 crc kubenswrapper[4898]: I0313 14:16:11.408983 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjqpr\" (UniqueName: \"kubernetes.io/projected/a57932fc-ce83-4258-95a8-65f29c0cfd5a-kube-api-access-xjqpr\") pod \"fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57\" (UID: \"a57932fc-ce83-4258-95a8-65f29c0cfd5a\") " 
pod="openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57" Mar 13 14:16:11 crc kubenswrapper[4898]: I0313 14:16:11.409149 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a57932fc-ce83-4258-95a8-65f29c0cfd5a-bundle\") pod \"fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57\" (UID: \"a57932fc-ce83-4258-95a8-65f29c0cfd5a\") " pod="openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57" Mar 13 14:16:11 crc kubenswrapper[4898]: I0313 14:16:11.510550 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a57932fc-ce83-4258-95a8-65f29c0cfd5a-util\") pod \"fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57\" (UID: \"a57932fc-ce83-4258-95a8-65f29c0cfd5a\") " pod="openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57" Mar 13 14:16:11 crc kubenswrapper[4898]: I0313 14:16:11.510686 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjqpr\" (UniqueName: \"kubernetes.io/projected/a57932fc-ce83-4258-95a8-65f29c0cfd5a-kube-api-access-xjqpr\") pod \"fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57\" (UID: \"a57932fc-ce83-4258-95a8-65f29c0cfd5a\") " pod="openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57" Mar 13 14:16:11 crc kubenswrapper[4898]: I0313 14:16:11.510761 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a57932fc-ce83-4258-95a8-65f29c0cfd5a-bundle\") pod \"fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57\" (UID: \"a57932fc-ce83-4258-95a8-65f29c0cfd5a\") " pod="openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57" Mar 13 14:16:11 crc kubenswrapper[4898]: I0313 
14:16:11.511700 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a57932fc-ce83-4258-95a8-65f29c0cfd5a-util\") pod \"fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57\" (UID: \"a57932fc-ce83-4258-95a8-65f29c0cfd5a\") " pod="openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57" Mar 13 14:16:11 crc kubenswrapper[4898]: I0313 14:16:11.511756 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a57932fc-ce83-4258-95a8-65f29c0cfd5a-bundle\") pod \"fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57\" (UID: \"a57932fc-ce83-4258-95a8-65f29c0cfd5a\") " pod="openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57" Mar 13 14:16:11 crc kubenswrapper[4898]: I0313 14:16:11.538363 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjqpr\" (UniqueName: \"kubernetes.io/projected/a57932fc-ce83-4258-95a8-65f29c0cfd5a-kube-api-access-xjqpr\") pod \"fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57\" (UID: \"a57932fc-ce83-4258-95a8-65f29c0cfd5a\") " pod="openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57" Mar 13 14:16:11 crc kubenswrapper[4898]: I0313 14:16:11.641409 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57" Mar 13 14:16:12 crc kubenswrapper[4898]: I0313 14:16:12.124147 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57"] Mar 13 14:16:12 crc kubenswrapper[4898]: I0313 14:16:12.342226 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57" event={"ID":"a57932fc-ce83-4258-95a8-65f29c0cfd5a","Type":"ContainerStarted","Data":"de0737319c24d387d36033ef3d7fde262ae7e58a7a64d518eb94a500b00ccd7c"} Mar 13 14:16:12 crc kubenswrapper[4898]: I0313 14:16:12.342276 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57" event={"ID":"a57932fc-ce83-4258-95a8-65f29c0cfd5a","Type":"ContainerStarted","Data":"edd3fbaa00c13c87b979dfe05e5d0b77bdb33d430bf8f77a8a32ca9cb4123a2b"} Mar 13 14:16:13 crc kubenswrapper[4898]: I0313 14:16:13.351925 4898 generic.go:334] "Generic (PLEG): container finished" podID="a57932fc-ce83-4258-95a8-65f29c0cfd5a" containerID="de0737319c24d387d36033ef3d7fde262ae7e58a7a64d518eb94a500b00ccd7c" exitCode=0 Mar 13 14:16:13 crc kubenswrapper[4898]: I0313 14:16:13.352105 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57" event={"ID":"a57932fc-ce83-4258-95a8-65f29c0cfd5a","Type":"ContainerDied","Data":"de0737319c24d387d36033ef3d7fde262ae7e58a7a64d518eb94a500b00ccd7c"} Mar 13 14:16:14 crc kubenswrapper[4898]: I0313 14:16:14.360462 4898 generic.go:334] "Generic (PLEG): container finished" podID="a57932fc-ce83-4258-95a8-65f29c0cfd5a" containerID="19e101b600e89585fb9349651c696287b7553b5949128b2de6653d748f58d321" exitCode=0 Mar 13 14:16:14 crc kubenswrapper[4898]: I0313 14:16:14.360519 4898 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57" event={"ID":"a57932fc-ce83-4258-95a8-65f29c0cfd5a","Type":"ContainerDied","Data":"19e101b600e89585fb9349651c696287b7553b5949128b2de6653d748f58d321"} Mar 13 14:16:15 crc kubenswrapper[4898]: I0313 14:16:15.378671 4898 generic.go:334] "Generic (PLEG): container finished" podID="a57932fc-ce83-4258-95a8-65f29c0cfd5a" containerID="9cf0186ece602fa99b331b4c0f59fabd6a8d9ecd8f719509581cd4626f43f447" exitCode=0 Mar 13 14:16:15 crc kubenswrapper[4898]: I0313 14:16:15.378729 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57" event={"ID":"a57932fc-ce83-4258-95a8-65f29c0cfd5a","Type":"ContainerDied","Data":"9cf0186ece602fa99b331b4c0f59fabd6a8d9ecd8f719509581cd4626f43f447"} Mar 13 14:16:16 crc kubenswrapper[4898]: I0313 14:16:16.649120 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57" Mar 13 14:16:16 crc kubenswrapper[4898]: I0313 14:16:16.804436 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a57932fc-ce83-4258-95a8-65f29c0cfd5a-bundle\") pod \"a57932fc-ce83-4258-95a8-65f29c0cfd5a\" (UID: \"a57932fc-ce83-4258-95a8-65f29c0cfd5a\") " Mar 13 14:16:16 crc kubenswrapper[4898]: I0313 14:16:16.804566 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a57932fc-ce83-4258-95a8-65f29c0cfd5a-util\") pod \"a57932fc-ce83-4258-95a8-65f29c0cfd5a\" (UID: \"a57932fc-ce83-4258-95a8-65f29c0cfd5a\") " Mar 13 14:16:16 crc kubenswrapper[4898]: I0313 14:16:16.804795 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjqpr\" (UniqueName: \"kubernetes.io/projected/a57932fc-ce83-4258-95a8-65f29c0cfd5a-kube-api-access-xjqpr\") pod \"a57932fc-ce83-4258-95a8-65f29c0cfd5a\" (UID: \"a57932fc-ce83-4258-95a8-65f29c0cfd5a\") " Mar 13 14:16:16 crc kubenswrapper[4898]: I0313 14:16:16.806006 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a57932fc-ce83-4258-95a8-65f29c0cfd5a-bundle" (OuterVolumeSpecName: "bundle") pod "a57932fc-ce83-4258-95a8-65f29c0cfd5a" (UID: "a57932fc-ce83-4258-95a8-65f29c0cfd5a"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:16:16 crc kubenswrapper[4898]: I0313 14:16:16.807025 4898 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a57932fc-ce83-4258-95a8-65f29c0cfd5a-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:16:16 crc kubenswrapper[4898]: I0313 14:16:16.812649 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a57932fc-ce83-4258-95a8-65f29c0cfd5a-kube-api-access-xjqpr" (OuterVolumeSpecName: "kube-api-access-xjqpr") pod "a57932fc-ce83-4258-95a8-65f29c0cfd5a" (UID: "a57932fc-ce83-4258-95a8-65f29c0cfd5a"). InnerVolumeSpecName "kube-api-access-xjqpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:16:16 crc kubenswrapper[4898]: I0313 14:16:16.821347 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a57932fc-ce83-4258-95a8-65f29c0cfd5a-util" (OuterVolumeSpecName: "util") pod "a57932fc-ce83-4258-95a8-65f29c0cfd5a" (UID: "a57932fc-ce83-4258-95a8-65f29c0cfd5a"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:16:16 crc kubenswrapper[4898]: I0313 14:16:16.908877 4898 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a57932fc-ce83-4258-95a8-65f29c0cfd5a-util\") on node \"crc\" DevicePath \"\"" Mar 13 14:16:16 crc kubenswrapper[4898]: I0313 14:16:16.908936 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjqpr\" (UniqueName: \"kubernetes.io/projected/a57932fc-ce83-4258-95a8-65f29c0cfd5a-kube-api-access-xjqpr\") on node \"crc\" DevicePath \"\"" Mar 13 14:16:17 crc kubenswrapper[4898]: I0313 14:16:17.403796 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57" event={"ID":"a57932fc-ce83-4258-95a8-65f29c0cfd5a","Type":"ContainerDied","Data":"edd3fbaa00c13c87b979dfe05e5d0b77bdb33d430bf8f77a8a32ca9cb4123a2b"} Mar 13 14:16:17 crc kubenswrapper[4898]: I0313 14:16:17.403843 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="edd3fbaa00c13c87b979dfe05e5d0b77bdb33d430bf8f77a8a32ca9cb4123a2b" Mar 13 14:16:17 crc kubenswrapper[4898]: I0313 14:16:17.403993 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57" Mar 13 14:16:19 crc kubenswrapper[4898]: I0313 14:16:19.134974 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:16:19 crc kubenswrapper[4898]: I0313 14:16:19.135344 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:16:19 crc kubenswrapper[4898]: I0313 14:16:19.135406 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 14:16:19 crc kubenswrapper[4898]: I0313 14:16:19.136194 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7b5d3972dfd92a1b971338153ac5467cf67b2057ca35cfb382b56be42ddca2ed"} pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 14:16:19 crc kubenswrapper[4898]: I0313 14:16:19.136266 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" containerID="cri-o://7b5d3972dfd92a1b971338153ac5467cf67b2057ca35cfb382b56be42ddca2ed" gracePeriod=600 Mar 13 14:16:19 crc kubenswrapper[4898]: I0313 14:16:19.423027 4898 generic.go:334] "Generic (PLEG): 
container finished" podID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerID="7b5d3972dfd92a1b971338153ac5467cf67b2057ca35cfb382b56be42ddca2ed" exitCode=0 Mar 13 14:16:19 crc kubenswrapper[4898]: I0313 14:16:19.423096 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerDied","Data":"7b5d3972dfd92a1b971338153ac5467cf67b2057ca35cfb382b56be42ddca2ed"} Mar 13 14:16:19 crc kubenswrapper[4898]: I0313 14:16:19.423137 4898 scope.go:117] "RemoveContainer" containerID="b58828da596890620679d1e69bfdfd0b7cd0cf06254eed4031e215964351d8c6" Mar 13 14:16:20 crc kubenswrapper[4898]: I0313 14:16:20.437763 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"37bdbe6f1a65f1530746827b4e6d1dd1ce95edb9a913051fc8fca9a782787e56"} Mar 13 14:16:24 crc kubenswrapper[4898]: I0313 14:16:24.268186 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6b8c6b5df9-kk2gn"] Mar 13 14:16:24 crc kubenswrapper[4898]: E0313 14:16:24.269084 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a57932fc-ce83-4258-95a8-65f29c0cfd5a" containerName="pull" Mar 13 14:16:24 crc kubenswrapper[4898]: I0313 14:16:24.269098 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a57932fc-ce83-4258-95a8-65f29c0cfd5a" containerName="pull" Mar 13 14:16:24 crc kubenswrapper[4898]: E0313 14:16:24.269121 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a57932fc-ce83-4258-95a8-65f29c0cfd5a" containerName="extract" Mar 13 14:16:24 crc kubenswrapper[4898]: I0313 14:16:24.269127 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a57932fc-ce83-4258-95a8-65f29c0cfd5a" containerName="extract" Mar 13 14:16:24 crc kubenswrapper[4898]: 
E0313 14:16:24.269136 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a57932fc-ce83-4258-95a8-65f29c0cfd5a" containerName="util" Mar 13 14:16:24 crc kubenswrapper[4898]: I0313 14:16:24.269141 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a57932fc-ce83-4258-95a8-65f29c0cfd5a" containerName="util" Mar 13 14:16:24 crc kubenswrapper[4898]: I0313 14:16:24.269309 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="a57932fc-ce83-4258-95a8-65f29c0cfd5a" containerName="extract" Mar 13 14:16:24 crc kubenswrapper[4898]: I0313 14:16:24.269869 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6b8c6b5df9-kk2gn" Mar 13 14:16:24 crc kubenswrapper[4898]: I0313 14:16:24.279320 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-6qc9g" Mar 13 14:16:24 crc kubenswrapper[4898]: I0313 14:16:24.308863 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6b8c6b5df9-kk2gn"] Mar 13 14:16:24 crc kubenswrapper[4898]: I0313 14:16:24.459836 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x57z\" (UniqueName: \"kubernetes.io/projected/7bae49ab-1146-43a2-b436-69838c923f1a-kube-api-access-2x57z\") pod \"openstack-operator-controller-init-6b8c6b5df9-kk2gn\" (UID: \"7bae49ab-1146-43a2-b436-69838c923f1a\") " pod="openstack-operators/openstack-operator-controller-init-6b8c6b5df9-kk2gn" Mar 13 14:16:24 crc kubenswrapper[4898]: I0313 14:16:24.562756 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x57z\" (UniqueName: \"kubernetes.io/projected/7bae49ab-1146-43a2-b436-69838c923f1a-kube-api-access-2x57z\") pod \"openstack-operator-controller-init-6b8c6b5df9-kk2gn\" (UID: 
\"7bae49ab-1146-43a2-b436-69838c923f1a\") " pod="openstack-operators/openstack-operator-controller-init-6b8c6b5df9-kk2gn" Mar 13 14:16:24 crc kubenswrapper[4898]: I0313 14:16:24.583786 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x57z\" (UniqueName: \"kubernetes.io/projected/7bae49ab-1146-43a2-b436-69838c923f1a-kube-api-access-2x57z\") pod \"openstack-operator-controller-init-6b8c6b5df9-kk2gn\" (UID: \"7bae49ab-1146-43a2-b436-69838c923f1a\") " pod="openstack-operators/openstack-operator-controller-init-6b8c6b5df9-kk2gn" Mar 13 14:16:24 crc kubenswrapper[4898]: I0313 14:16:24.604259 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6b8c6b5df9-kk2gn" Mar 13 14:16:25 crc kubenswrapper[4898]: I0313 14:16:25.136763 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6b8c6b5df9-kk2gn"] Mar 13 14:16:25 crc kubenswrapper[4898]: I0313 14:16:25.485703 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6b8c6b5df9-kk2gn" event={"ID":"7bae49ab-1146-43a2-b436-69838c923f1a","Type":"ContainerStarted","Data":"80f3b0f287b2875fcd6c6e0a40bc6e8aa4408ca1f5d0f62be3214f49a50f3e18"} Mar 13 14:16:29 crc kubenswrapper[4898]: I0313 14:16:29.519481 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6b8c6b5df9-kk2gn" event={"ID":"7bae49ab-1146-43a2-b436-69838c923f1a","Type":"ContainerStarted","Data":"d87599acaa104726169cfe37672c0e6b8881bc6eb72714faae25c237e35b25d7"} Mar 13 14:16:29 crc kubenswrapper[4898]: I0313 14:16:29.520981 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6b8c6b5df9-kk2gn" Mar 13 14:16:29 crc kubenswrapper[4898]: I0313 14:16:29.561949 4898 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6b8c6b5df9-kk2gn" podStartSLOduration=1.975188419 podStartE2EDuration="5.561929275s" podCreationTimestamp="2026-03-13 14:16:24 +0000 UTC" firstStartedPulling="2026-03-13 14:16:25.144314836 +0000 UTC m=+1220.145903075" lastFinishedPulling="2026-03-13 14:16:28.731055692 +0000 UTC m=+1223.732643931" observedRunningTime="2026-03-13 14:16:29.560041205 +0000 UTC m=+1224.561629464" watchObservedRunningTime="2026-03-13 14:16:29.561929275 +0000 UTC m=+1224.563517514" Mar 13 14:16:32 crc kubenswrapper[4898]: I0313 14:16:32.735669 4898 scope.go:117] "RemoveContainer" containerID="b0888e4d135b3b37fbe96fe16a03f870ba37d4188b89aa723dcdb2298a0e4ed8" Mar 13 14:16:34 crc kubenswrapper[4898]: I0313 14:16:34.608480 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6b8c6b5df9-kk2gn" Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.648690 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-d47688694-gtlps"] Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.650814 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-d47688694-gtlps" Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.654402 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-vp785" Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.664785 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-p9d5v"] Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.665930 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-p9d5v" Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.667838 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-nrczs" Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.670309 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-d47688694-gtlps"] Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.681414 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-4n5rx"] Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.682446 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-4n5rx" Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.685273 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-zjpd9" Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.701693 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-p9d5v"] Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.737416 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-4n5rx"] Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.765246 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-mf8h6"] Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.766245 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mf8h6" Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.770702 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-9fmzd" Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.813105 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-tqp4b"] Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.814471 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-tqp4b" Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.816159 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l2jg\" (UniqueName: \"kubernetes.io/projected/0d88a5d2-a852-409e-b4bd-939d1c2b9090-kube-api-access-4l2jg\") pod \"cinder-operator-controller-manager-984cd4dcf-p9d5v\" (UID: \"0d88a5d2-a852-409e-b4bd-939d1c2b9090\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-p9d5v" Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.816221 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5rn7\" (UniqueName: \"kubernetes.io/projected/3c955ebc-98fd-4921-9923-6151a50e8eec-kube-api-access-g5rn7\") pod \"designate-operator-controller-manager-66d56f6ff4-4n5rx\" (UID: \"3c955ebc-98fd-4921-9923-6151a50e8eec\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-4n5rx" Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.816358 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm7pb\" (UniqueName: \"kubernetes.io/projected/45efd8ce-26db-4511-bd88-2e7467d02bbb-kube-api-access-xm7pb\") pod 
\"barbican-operator-controller-manager-d47688694-gtlps\" (UID: \"45efd8ce-26db-4511-bd88-2e7467d02bbb\") " pod="openstack-operators/barbican-operator-controller-manager-d47688694-gtlps" Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.817442 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-2zg6t" Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.821207 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-mf8h6"] Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.837834 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-tqp4b"] Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.877307 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl"] Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.878676 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl" Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.886006 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-9kvzj" Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.892153 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl"] Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.930394 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l2jg\" (UniqueName: \"kubernetes.io/projected/0d88a5d2-a852-409e-b4bd-939d1c2b9090-kube-api-access-4l2jg\") pod \"cinder-operator-controller-manager-984cd4dcf-p9d5v\" (UID: \"0d88a5d2-a852-409e-b4bd-939d1c2b9090\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-p9d5v" Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.930538 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5rn7\" (UniqueName: \"kubernetes.io/projected/3c955ebc-98fd-4921-9923-6151a50e8eec-kube-api-access-g5rn7\") pod \"designate-operator-controller-manager-66d56f6ff4-4n5rx\" (UID: \"3c955ebc-98fd-4921-9923-6151a50e8eec\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-4n5rx" Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.930830 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm7pb\" (UniqueName: \"kubernetes.io/projected/45efd8ce-26db-4511-bd88-2e7467d02bbb-kube-api-access-xm7pb\") pod \"barbican-operator-controller-manager-d47688694-gtlps\" (UID: \"45efd8ce-26db-4511-bd88-2e7467d02bbb\") " pod="openstack-operators/barbican-operator-controller-manager-d47688694-gtlps" Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.930962 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r94zq\" (UniqueName: \"kubernetes.io/projected/fb7b2f97-fca8-41d2-9be7-d40fac94c171-kube-api-access-r94zq\") pod \"glance-operator-controller-manager-5964f64c48-mf8h6\" (UID: \"fb7b2f97-fca8-41d2-9be7-d40fac94c171\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mf8h6" Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.931033 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xlhk\" (UniqueName: \"kubernetes.io/projected/ea0ad033-9a48-4e42-a237-f27cacf03adc-kube-api-access-8xlhk\") pod \"heat-operator-controller-manager-77b6666d85-tqp4b\" (UID: \"ea0ad033-9a48-4e42-a237-f27cacf03adc\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-tqp4b" Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.951771 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw"] Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.953431 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw" Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.970481 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-j4b7h" Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.987163 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.988281 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm7pb\" (UniqueName: \"kubernetes.io/projected/45efd8ce-26db-4511-bd88-2e7467d02bbb-kube-api-access-xm7pb\") pod \"barbican-operator-controller-manager-d47688694-gtlps\" (UID: \"45efd8ce-26db-4511-bd88-2e7467d02bbb\") " pod="openstack-operators/barbican-operator-controller-manager-d47688694-gtlps" Mar 13 14:17:02 crc kubenswrapper[4898]: I0313 14:17:02.988758 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5rn7\" (UniqueName: \"kubernetes.io/projected/3c955ebc-98fd-4921-9923-6151a50e8eec-kube-api-access-g5rn7\") pod \"designate-operator-controller-manager-66d56f6ff4-4n5rx\" (UID: \"3c955ebc-98fd-4921-9923-6151a50e8eec\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-4n5rx" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.003324 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l2jg\" (UniqueName: \"kubernetes.io/projected/0d88a5d2-a852-409e-b4bd-939d1c2b9090-kube-api-access-4l2jg\") pod \"cinder-operator-controller-manager-984cd4dcf-p9d5v\" (UID: \"0d88a5d2-a852-409e-b4bd-939d1c2b9090\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-p9d5v" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.032084 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-7hnd7\" (UniqueName: \"kubernetes.io/projected/c35de09d-7f21-47d3-aac5-a26b15b0a496-kube-api-access-7hnd7\") pod \"infra-operator-controller-manager-54dc5b8f8d-8kcsw\" (UID: \"c35de09d-7f21-47d3-aac5-a26b15b0a496\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.032144 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c35de09d-7f21-47d3-aac5-a26b15b0a496-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-8kcsw\" (UID: \"c35de09d-7f21-47d3-aac5-a26b15b0a496\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.032189 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r94zq\" (UniqueName: \"kubernetes.io/projected/fb7b2f97-fca8-41d2-9be7-d40fac94c171-kube-api-access-r94zq\") pod \"glance-operator-controller-manager-5964f64c48-mf8h6\" (UID: \"fb7b2f97-fca8-41d2-9be7-d40fac94c171\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mf8h6" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.032207 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbcwc\" (UniqueName: \"kubernetes.io/projected/a80d01d5-0201-4b2e-974c-ac5b42ac8df4-kube-api-access-jbcwc\") pod \"horizon-operator-controller-manager-6d9d6b584d-jngrl\" (UID: \"a80d01d5-0201-4b2e-974c-ac5b42ac8df4\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.032233 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xlhk\" (UniqueName: \"kubernetes.io/projected/ea0ad033-9a48-4e42-a237-f27cacf03adc-kube-api-access-8xlhk\") pod 
\"heat-operator-controller-manager-77b6666d85-tqp4b\" (UID: \"ea0ad033-9a48-4e42-a237-f27cacf03adc\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-tqp4b" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.032848 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-d47688694-gtlps" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.033175 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-p9d5v" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.039097 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-4n5rx" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.042797 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5bc894d9b-v99bm"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.046799 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-v99bm" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.049283 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-z4khd" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.071461 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.076614 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r94zq\" (UniqueName: \"kubernetes.io/projected/fb7b2f97-fca8-41d2-9be7-d40fac94c171-kube-api-access-r94zq\") pod \"glance-operator-controller-manager-5964f64c48-mf8h6\" (UID: \"fb7b2f97-fca8-41d2-9be7-d40fac94c171\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mf8h6" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.082170 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5bc894d9b-v99bm"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.084454 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xlhk\" (UniqueName: \"kubernetes.io/projected/ea0ad033-9a48-4e42-a237-f27cacf03adc-kube-api-access-8xlhk\") pod \"heat-operator-controller-manager-77b6666d85-tqp4b\" (UID: \"ea0ad033-9a48-4e42-a237-f27cacf03adc\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-tqp4b" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.093229 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-s5zh6"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.094258 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-s5zh6" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.097188 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-s92xm" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.097705 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mf8h6" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.107821 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-57b484b4df-z2gd2"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.109204 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-z2gd2" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.115812 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-lzkkk" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.126650 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.129223 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.131850 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-z2klm" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.137323 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-s5zh6"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.137988 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hnd7\" (UniqueName: \"kubernetes.io/projected/c35de09d-7f21-47d3-aac5-a26b15b0a496-kube-api-access-7hnd7\") pod \"infra-operator-controller-manager-54dc5b8f8d-8kcsw\" (UID: \"c35de09d-7f21-47d3-aac5-a26b15b0a496\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.138592 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c35de09d-7f21-47d3-aac5-a26b15b0a496-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-8kcsw\" (UID: \"c35de09d-7f21-47d3-aac5-a26b15b0a496\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.138649 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbcwc\" (UniqueName: \"kubernetes.io/projected/a80d01d5-0201-4b2e-974c-ac5b42ac8df4-kube-api-access-jbcwc\") pod \"horizon-operator-controller-manager-6d9d6b584d-jngrl\" (UID: \"a80d01d5-0201-4b2e-974c-ac5b42ac8df4\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl" Mar 13 14:17:03 crc kubenswrapper[4898]: E0313 14:17:03.139003 4898 secret.go:188] Couldn't get secret 
openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 14:17:03 crc kubenswrapper[4898]: E0313 14:17:03.139051 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c35de09d-7f21-47d3-aac5-a26b15b0a496-cert podName:c35de09d-7f21-47d3-aac5-a26b15b0a496 nodeName:}" failed. No retries permitted until 2026-03-13 14:17:03.639035094 +0000 UTC m=+1258.640623333 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c35de09d-7f21-47d3-aac5-a26b15b0a496-cert") pod "infra-operator-controller-manager-54dc5b8f8d-8kcsw" (UID: "c35de09d-7f21-47d3-aac5-a26b15b0a496") : secret "infra-operator-webhook-server-cert" not found Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.145202 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-57b484b4df-z2gd2"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.157234 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbcwc\" (UniqueName: \"kubernetes.io/projected/a80d01d5-0201-4b2e-974c-ac5b42ac8df4-kube-api-access-jbcwc\") pod \"horizon-operator-controller-manager-6d9d6b584d-jngrl\" (UID: \"a80d01d5-0201-4b2e-974c-ac5b42ac8df4\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.161828 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hnd7\" (UniqueName: \"kubernetes.io/projected/c35de09d-7f21-47d3-aac5-a26b15b0a496-kube-api-access-7hnd7\") pod \"infra-operator-controller-manager-54dc5b8f8d-8kcsw\" (UID: \"c35de09d-7f21-47d3-aac5-a26b15b0a496\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.165323 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-tqp4b" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.194078 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-ntlw6"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.195547 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-ntlw6" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.199107 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-46qzh" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.207411 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.230173 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-7f84474648-mr4wv"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.231339 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7f84474648-mr4wv" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.239342 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-hh82n" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.241034 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ghts\" (UniqueName: \"kubernetes.io/projected/32b5ebfd-38d9-456e-bb21-7332323239d1-kube-api-access-9ghts\") pod \"ironic-operator-controller-manager-5bc894d9b-v99bm\" (UID: \"32b5ebfd-38d9-456e-bb21-7332323239d1\") " pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-v99bm" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.241129 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c59lv\" (UniqueName: \"kubernetes.io/projected/d24bb749-0b71-456b-80e4-fdf6dd23ba30-kube-api-access-c59lv\") pod \"keystone-operator-controller-manager-684f77d66d-s5zh6\" (UID: \"d24bb749-0b71-456b-80e4-fdf6dd23ba30\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-s5zh6" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.241198 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q57rr\" (UniqueName: \"kubernetes.io/projected/ba56f415-73d5-4301-a25d-0e5d1ba4e3b1-kube-api-access-q57rr\") pod \"mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc\" (UID: \"ba56f415-73d5-4301-a25d-0e5d1ba4e3b1\") " pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.241224 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnvxx\" (UniqueName: 
\"kubernetes.io/projected/1df4a7d6-b0c2-4b00-b591-1a612bd319b6-kube-api-access-wnvxx\") pod \"manila-operator-controller-manager-57b484b4df-z2gd2\" (UID: \"1df4a7d6-b0c2-4b00-b591-1a612bd319b6\") " pod="openstack-operators/manila-operator-controller-manager-57b484b4df-z2gd2" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.263425 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.282045 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-ntlw6"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.287745 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7f84474648-mr4wv"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.336471 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.343282 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c59lv\" (UniqueName: \"kubernetes.io/projected/d24bb749-0b71-456b-80e4-fdf6dd23ba30-kube-api-access-c59lv\") pod \"keystone-operator-controller-manager-684f77d66d-s5zh6\" (UID: \"d24bb749-0b71-456b-80e4-fdf6dd23ba30\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-s5zh6" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.343361 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q57rr\" (UniqueName: \"kubernetes.io/projected/ba56f415-73d5-4301-a25d-0e5d1ba4e3b1-kube-api-access-q57rr\") pod \"mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc\" (UID: \"ba56f415-73d5-4301-a25d-0e5d1ba4e3b1\") " pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc" Mar 13 
14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.343385 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnvxx\" (UniqueName: \"kubernetes.io/projected/1df4a7d6-b0c2-4b00-b591-1a612bd319b6-kube-api-access-wnvxx\") pod \"manila-operator-controller-manager-57b484b4df-z2gd2\" (UID: \"1df4a7d6-b0c2-4b00-b591-1a612bd319b6\") " pod="openstack-operators/manila-operator-controller-manager-57b484b4df-z2gd2" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.343470 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbj2m\" (UniqueName: \"kubernetes.io/projected/52959483-daae-423a-a3bf-8e3fa7810074-kube-api-access-wbj2m\") pod \"nova-operator-controller-manager-7f84474648-mr4wv\" (UID: \"52959483-daae-423a-a3bf-8e3fa7810074\") " pod="openstack-operators/nova-operator-controller-manager-7f84474648-mr4wv" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.343523 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btrh6\" (UniqueName: \"kubernetes.io/projected/d71982c0-a3d0-4da8-84cd-7494301f589f-kube-api-access-btrh6\") pod \"neutron-operator-controller-manager-776c5696bf-ntlw6\" (UID: \"d71982c0-a3d0-4da8-84cd-7494301f589f\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-ntlw6" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.343550 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ghts\" (UniqueName: \"kubernetes.io/projected/32b5ebfd-38d9-456e-bb21-7332323239d1-kube-api-access-9ghts\") pod \"ironic-operator-controller-manager-5bc894d9b-v99bm\" (UID: \"32b5ebfd-38d9-456e-bb21-7332323239d1\") " pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-v99bm" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.344962 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.345641 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.347090 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-ldprr" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.398798 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.401936 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.404669 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-x5ln5" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.418163 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ghts\" (UniqueName: \"kubernetes.io/projected/32b5ebfd-38d9-456e-bb21-7332323239d1-kube-api-access-9ghts\") pod \"ironic-operator-controller-manager-5bc894d9b-v99bm\" (UID: \"32b5ebfd-38d9-456e-bb21-7332323239d1\") " pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-v99bm" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.425029 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.435125 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q57rr\" (UniqueName: 
\"kubernetes.io/projected/ba56f415-73d5-4301-a25d-0e5d1ba4e3b1-kube-api-access-q57rr\") pod \"mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc\" (UID: \"ba56f415-73d5-4301-a25d-0e5d1ba4e3b1\") " pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.435490 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnvxx\" (UniqueName: \"kubernetes.io/projected/1df4a7d6-b0c2-4b00-b591-1a612bd319b6-kube-api-access-wnvxx\") pod \"manila-operator-controller-manager-57b484b4df-z2gd2\" (UID: \"1df4a7d6-b0c2-4b00-b591-1a612bd319b6\") " pod="openstack-operators/manila-operator-controller-manager-57b484b4df-z2gd2" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.436066 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c59lv\" (UniqueName: \"kubernetes.io/projected/d24bb749-0b71-456b-80e4-fdf6dd23ba30-kube-api-access-c59lv\") pod \"keystone-operator-controller-manager-684f77d66d-s5zh6\" (UID: \"d24bb749-0b71-456b-80e4-fdf6dd23ba30\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-s5zh6" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.440209 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wdmrh"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.443968 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wdmrh" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.450877 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-ddhpq" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.456662 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52gjb\" (UniqueName: \"kubernetes.io/projected/d29ce3ee-3d5a-4801-abf9-dfef5b641a74-kube-api-access-52gjb\") pod \"octavia-operator-controller-manager-5f4f55cb5c-s2rdh\" (UID: \"d29ce3ee-3d5a-4801-abf9-dfef5b641a74\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.457008 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbj2m\" (UniqueName: \"kubernetes.io/projected/52959483-daae-423a-a3bf-8e3fa7810074-kube-api-access-wbj2m\") pod \"nova-operator-controller-manager-7f84474648-mr4wv\" (UID: \"52959483-daae-423a-a3bf-8e3fa7810074\") " pod="openstack-operators/nova-operator-controller-manager-7f84474648-mr4wv" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.457241 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btrh6\" (UniqueName: \"kubernetes.io/projected/d71982c0-a3d0-4da8-84cd-7494301f589f-kube-api-access-btrh6\") pod \"neutron-operator-controller-manager-776c5696bf-ntlw6\" (UID: \"d71982c0-a3d0-4da8-84cd-7494301f589f\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-ntlw6" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.458957 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.460527 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-njsvh"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.463726 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-njsvh" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.466347 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-njg9g" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.478216 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-njsvh"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.486932 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-7f9cc5dd44-f2t6t"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.489387 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbj2m\" (UniqueName: \"kubernetes.io/projected/52959483-daae-423a-a3bf-8e3fa7810074-kube-api-access-wbj2m\") pod \"nova-operator-controller-manager-7f84474648-mr4wv\" (UID: \"52959483-daae-423a-a3bf-8e3fa7810074\") " pod="openstack-operators/nova-operator-controller-manager-7f84474648-mr4wv" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.492718 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-f2t6t" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.493740 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btrh6\" (UniqueName: \"kubernetes.io/projected/d71982c0-a3d0-4da8-84cd-7494301f589f-kube-api-access-btrh6\") pod \"neutron-operator-controller-manager-776c5696bf-ntlw6\" (UID: \"d71982c0-a3d0-4da8-84cd-7494301f589f\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-ntlw6" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.494524 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-v99bm" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.500931 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wdmrh"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.501439 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-dn2hz" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.505012 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-ntlw6" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.525071 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-s5zh6" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.527652 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.536166 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-z2gd2" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.542711 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7f84474648-mr4wv" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.545819 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-7f9cc5dd44-f2t6t"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.555846 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-s2k96"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.557392 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-s2k96" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.558989 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb2ck\" (UniqueName: \"kubernetes.io/projected/da3795a7-363f-4637-afe2-77cb77248f9a-kube-api-access-qb2ck\") pod \"ovn-operator-controller-manager-bbc5b68f9-wdmrh\" (UID: \"da3795a7-363f-4637-afe2-77cb77248f9a\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wdmrh" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.559087 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ab852e1-fd26-4f76-b758-77896f8e236b-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv\" (UID: \"0ab852e1-fd26-4f76-b758-77896f8e236b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.559181 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tdqh\" (UniqueName: \"kubernetes.io/projected/0ab852e1-fd26-4f76-b758-77896f8e236b-kube-api-access-7tdqh\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv\" (UID: \"0ab852e1-fd26-4f76-b758-77896f8e236b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.559227 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52gjb\" (UniqueName: \"kubernetes.io/projected/d29ce3ee-3d5a-4801-abf9-dfef5b641a74-kube-api-access-52gjb\") pod \"octavia-operator-controller-manager-5f4f55cb5c-s2rdh\" (UID: \"d29ce3ee-3d5a-4801-abf9-dfef5b641a74\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.559249 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkdwz\" (UniqueName: \"kubernetes.io/projected/0d7c657b-a701-41fe-9b23-d5bba3302c4f-kube-api-access-dkdwz\") pod \"placement-operator-controller-manager-574d45c66c-njsvh\" (UID: \"0d7c657b-a701-41fe-9b23-d5bba3302c4f\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-njsvh" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.565721 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-gvvbw" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.615800 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52gjb\" (UniqueName: \"kubernetes.io/projected/d29ce3ee-3d5a-4801-abf9-dfef5b641a74-kube-api-access-52gjb\") pod \"octavia-operator-controller-manager-5f4f55cb5c-s2rdh\" (UID: \"d29ce3ee-3d5a-4801-abf9-dfef5b641a74\") " 
pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.616654 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-s2k96"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.661590 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggn24\" (UniqueName: \"kubernetes.io/projected/9ff6f89a-7110-42fb-96b9-8611f280bebe-kube-api-access-ggn24\") pod \"telemetry-operator-controller-manager-5b9fbd87f-s2k96\" (UID: \"9ff6f89a-7110-42fb-96b9-8611f280bebe\") " pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-s2k96" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.661642 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbspp\" (UniqueName: \"kubernetes.io/projected/66a86c31-9ff3-439a-a0f8-96c981014b6f-kube-api-access-kbspp\") pod \"swift-operator-controller-manager-7f9cc5dd44-f2t6t\" (UID: \"66a86c31-9ff3-439a-a0f8-96c981014b6f\") " pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-f2t6t" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.661667 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ab852e1-fd26-4f76-b758-77896f8e236b-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv\" (UID: \"0ab852e1-fd26-4f76-b758-77896f8e236b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.662230 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c35de09d-7f21-47d3-aac5-a26b15b0a496-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-8kcsw\" (UID: 
\"c35de09d-7f21-47d3-aac5-a26b15b0a496\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw" Mar 13 14:17:03 crc kubenswrapper[4898]: E0313 14:17:03.662326 4898 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.662437 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tdqh\" (UniqueName: \"kubernetes.io/projected/0ab852e1-fd26-4f76-b758-77896f8e236b-kube-api-access-7tdqh\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv\" (UID: \"0ab852e1-fd26-4f76-b758-77896f8e236b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" Mar 13 14:17:03 crc kubenswrapper[4898]: E0313 14:17:03.662444 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c35de09d-7f21-47d3-aac5-a26b15b0a496-cert podName:c35de09d-7f21-47d3-aac5-a26b15b0a496 nodeName:}" failed. No retries permitted until 2026-03-13 14:17:04.662427386 +0000 UTC m=+1259.664015625 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c35de09d-7f21-47d3-aac5-a26b15b0a496-cert") pod "infra-operator-controller-manager-54dc5b8f8d-8kcsw" (UID: "c35de09d-7f21-47d3-aac5-a26b15b0a496") : secret "infra-operator-webhook-server-cert" not found Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.662539 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkdwz\" (UniqueName: \"kubernetes.io/projected/0d7c657b-a701-41fe-9b23-d5bba3302c4f-kube-api-access-dkdwz\") pod \"placement-operator-controller-manager-574d45c66c-njsvh\" (UID: \"0d7c657b-a701-41fe-9b23-d5bba3302c4f\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-njsvh" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.662639 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb2ck\" (UniqueName: \"kubernetes.io/projected/da3795a7-363f-4637-afe2-77cb77248f9a-kube-api-access-qb2ck\") pod \"ovn-operator-controller-manager-bbc5b68f9-wdmrh\" (UID: \"da3795a7-363f-4637-afe2-77cb77248f9a\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wdmrh" Mar 13 14:17:03 crc kubenswrapper[4898]: E0313 14:17:03.662785 4898 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 14:17:03 crc kubenswrapper[4898]: E0313 14:17:03.662813 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ab852e1-fd26-4f76-b758-77896f8e236b-cert podName:0ab852e1-fd26-4f76-b758-77896f8e236b nodeName:}" failed. No retries permitted until 2026-03-13 14:17:04.162805676 +0000 UTC m=+1259.164393915 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0ab852e1-fd26-4f76-b758-77896f8e236b-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" (UID: "0ab852e1-fd26-4f76-b758-77896f8e236b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.688702 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb2ck\" (UniqueName: \"kubernetes.io/projected/da3795a7-363f-4637-afe2-77cb77248f9a-kube-api-access-qb2ck\") pod \"ovn-operator-controller-manager-bbc5b68f9-wdmrh\" (UID: \"da3795a7-363f-4637-afe2-77cb77248f9a\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wdmrh" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.689798 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tdqh\" (UniqueName: \"kubernetes.io/projected/0ab852e1-fd26-4f76-b758-77896f8e236b-kube-api-access-7tdqh\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv\" (UID: \"0ab852e1-fd26-4f76-b758-77896f8e236b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.694005 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkdwz\" (UniqueName: \"kubernetes.io/projected/0d7c657b-a701-41fe-9b23-d5bba3302c4f-kube-api-access-dkdwz\") pod \"placement-operator-controller-manager-574d45c66c-njsvh\" (UID: \"0d7c657b-a701-41fe-9b23-d5bba3302c4f\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-njsvh" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.694472 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.695484 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.698442 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-hd4hf" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.709667 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.757782 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-jwrd2"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.759659 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-jwrd2" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.764228 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-gnhm7" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.765126 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggn24\" (UniqueName: \"kubernetes.io/projected/9ff6f89a-7110-42fb-96b9-8611f280bebe-kube-api-access-ggn24\") pod \"telemetry-operator-controller-manager-5b9fbd87f-s2k96\" (UID: \"9ff6f89a-7110-42fb-96b9-8611f280bebe\") " pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-s2k96" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.765176 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbspp\" (UniqueName: \"kubernetes.io/projected/66a86c31-9ff3-439a-a0f8-96c981014b6f-kube-api-access-kbspp\") pod \"swift-operator-controller-manager-7f9cc5dd44-f2t6t\" (UID: \"66a86c31-9ff3-439a-a0f8-96c981014b6f\") " 
pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-f2t6t" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.786735 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggn24\" (UniqueName: \"kubernetes.io/projected/9ff6f89a-7110-42fb-96b9-8611f280bebe-kube-api-access-ggn24\") pod \"telemetry-operator-controller-manager-5b9fbd87f-s2k96\" (UID: \"9ff6f89a-7110-42fb-96b9-8611f280bebe\") " pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-s2k96" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.791102 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbspp\" (UniqueName: \"kubernetes.io/projected/66a86c31-9ff3-439a-a0f8-96c981014b6f-kube-api-access-kbspp\") pod \"swift-operator-controller-manager-7f9cc5dd44-f2t6t\" (UID: \"66a86c31-9ff3-439a-a0f8-96c981014b6f\") " pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-f2t6t" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.804019 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-jwrd2"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.830595 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.832192 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.840914 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.843026 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.843254 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-nrmkf" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.859582 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.868198 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz8wp\" (UniqueName: \"kubernetes.io/projected/19a0f4de-5258-4f2b-9587-71293459378e-kube-api-access-rz8wp\") pod \"test-operator-controller-manager-5c5cb9c4d7-smdkt\" (UID: \"19a0f4de-5258-4f2b-9587-71293459378e\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.868498 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvjvw\" (UniqueName: \"kubernetes.io/projected/919747b8-a031-4654-999f-3c3928f981b4-kube-api-access-xvjvw\") pod \"watcher-operator-controller-manager-6c4d75f7f9-jwrd2\" (UID: \"919747b8-a031-4654-999f-3c3928f981b4\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-jwrd2" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.880523 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.902410 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-82gtc"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.904168 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-82gtc" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.909410 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-j47dn" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.916037 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-82gtc"] Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.928971 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wdmrh" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.951298 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-njsvh" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.970547 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-f2t6t" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.974815 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvjvw\" (UniqueName: \"kubernetes.io/projected/919747b8-a031-4654-999f-3c3928f981b4-kube-api-access-xvjvw\") pod \"watcher-operator-controller-manager-6c4d75f7f9-jwrd2\" (UID: \"919747b8-a031-4654-999f-3c3928f981b4\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-jwrd2" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.975009 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdbxx\" (UniqueName: \"kubernetes.io/projected/3a26728d-85c2-465c-bce4-c74045ea9e0d-kube-api-access-jdbxx\") pod \"openstack-operator-controller-manager-5f7dc44db6-9nsrh\" (UID: \"3a26728d-85c2-465c-bce4-c74045ea9e0d\") " pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.975037 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-metrics-certs\") pod \"openstack-operator-controller-manager-5f7dc44db6-9nsrh\" (UID: \"3a26728d-85c2-465c-bce4-c74045ea9e0d\") " pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.976287 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz8wp\" (UniqueName: \"kubernetes.io/projected/19a0f4de-5258-4f2b-9587-71293459378e-kube-api-access-rz8wp\") pod \"test-operator-controller-manager-5c5cb9c4d7-smdkt\" (UID: \"19a0f4de-5258-4f2b-9587-71293459378e\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt" Mar 13 14:17:03 crc 
kubenswrapper[4898]: I0313 14:17:03.976340 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-webhook-certs\") pod \"openstack-operator-controller-manager-5f7dc44db6-9nsrh\" (UID: \"3a26728d-85c2-465c-bce4-c74045ea9e0d\") " pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:03 crc kubenswrapper[4898]: I0313 14:17:03.995484 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-s2k96" Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.008055 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz8wp\" (UniqueName: \"kubernetes.io/projected/19a0f4de-5258-4f2b-9587-71293459378e-kube-api-access-rz8wp\") pod \"test-operator-controller-manager-5c5cb9c4d7-smdkt\" (UID: \"19a0f4de-5258-4f2b-9587-71293459378e\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt" Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.011059 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvjvw\" (UniqueName: \"kubernetes.io/projected/919747b8-a031-4654-999f-3c3928f981b4-kube-api-access-xvjvw\") pod \"watcher-operator-controller-manager-6c4d75f7f9-jwrd2\" (UID: \"919747b8-a031-4654-999f-3c3928f981b4\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-jwrd2" Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.023979 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt" Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.046641 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-p9d5v"] Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.054427 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-4n5rx"] Mar 13 14:17:04 crc kubenswrapper[4898]: W0313 14:17:04.059132 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d88a5d2_a852_409e_b4bd_939d1c2b9090.slice/crio-793122426ae30c4ff049dfb616beca97752fa5b76927a883ab6fcfb0469dcf21 WatchSource:0}: Error finding container 793122426ae30c4ff049dfb616beca97752fa5b76927a883ab6fcfb0469dcf21: Status 404 returned error can't find the container with id 793122426ae30c4ff049dfb616beca97752fa5b76927a883ab6fcfb0469dcf21 Mar 13 14:17:04 crc kubenswrapper[4898]: W0313 14:17:04.068486 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c955ebc_98fd_4921_9923_6151a50e8eec.slice/crio-c7cc7671bd8023d7055244f8253f5d7396c3c1891269cc38baa7d6692d48b5a3 WatchSource:0}: Error finding container c7cc7671bd8023d7055244f8253f5d7396c3c1891269cc38baa7d6692d48b5a3: Status 404 returned error can't find the container with id c7cc7671bd8023d7055244f8253f5d7396c3c1891269cc38baa7d6692d48b5a3 Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.077715 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhfbv\" (UniqueName: \"kubernetes.io/projected/7b9c0413-5558-43c4-805b-7f035fded9b4-kube-api-access-zhfbv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-82gtc\" (UID: \"7b9c0413-5558-43c4-805b-7f035fded9b4\") " 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-82gtc" Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.077992 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdbxx\" (UniqueName: \"kubernetes.io/projected/3a26728d-85c2-465c-bce4-c74045ea9e0d-kube-api-access-jdbxx\") pod \"openstack-operator-controller-manager-5f7dc44db6-9nsrh\" (UID: \"3a26728d-85c2-465c-bce4-c74045ea9e0d\") " pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.078089 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-metrics-certs\") pod \"openstack-operator-controller-manager-5f7dc44db6-9nsrh\" (UID: \"3a26728d-85c2-465c-bce4-c74045ea9e0d\") " pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.078298 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-webhook-certs\") pod \"openstack-operator-controller-manager-5f7dc44db6-9nsrh\" (UID: \"3a26728d-85c2-465c-bce4-c74045ea9e0d\") " pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:04 crc kubenswrapper[4898]: E0313 14:17:04.078562 4898 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 14:17:04 crc kubenswrapper[4898]: E0313 14:17:04.078668 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-webhook-certs podName:3a26728d-85c2-465c-bce4-c74045ea9e0d nodeName:}" failed. No retries permitted until 2026-03-13 14:17:04.578653863 +0000 UTC m=+1259.580242102 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-webhook-certs") pod "openstack-operator-controller-manager-5f7dc44db6-9nsrh" (UID: "3a26728d-85c2-465c-bce4-c74045ea9e0d") : secret "webhook-server-cert" not found Mar 13 14:17:04 crc kubenswrapper[4898]: E0313 14:17:04.078832 4898 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 14:17:04 crc kubenswrapper[4898]: E0313 14:17:04.078883 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-metrics-certs podName:3a26728d-85c2-465c-bce4-c74045ea9e0d nodeName:}" failed. No retries permitted until 2026-03-13 14:17:04.578867628 +0000 UTC m=+1259.580455867 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-metrics-certs") pod "openstack-operator-controller-manager-5f7dc44db6-9nsrh" (UID: "3a26728d-85c2-465c-bce4-c74045ea9e0d") : secret "metrics-server-cert" not found Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.087884 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-jwrd2" Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.111078 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdbxx\" (UniqueName: \"kubernetes.io/projected/3a26728d-85c2-465c-bce4-c74045ea9e0d-kube-api-access-jdbxx\") pod \"openstack-operator-controller-manager-5f7dc44db6-9nsrh\" (UID: \"3a26728d-85c2-465c-bce4-c74045ea9e0d\") " pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.180466 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhfbv\" (UniqueName: \"kubernetes.io/projected/7b9c0413-5558-43c4-805b-7f035fded9b4-kube-api-access-zhfbv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-82gtc\" (UID: \"7b9c0413-5558-43c4-805b-7f035fded9b4\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-82gtc" Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.180643 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ab852e1-fd26-4f76-b758-77896f8e236b-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv\" (UID: \"0ab852e1-fd26-4f76-b758-77896f8e236b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" Mar 13 14:17:04 crc kubenswrapper[4898]: E0313 14:17:04.180926 4898 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 14:17:04 crc kubenswrapper[4898]: E0313 14:17:04.181022 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ab852e1-fd26-4f76-b758-77896f8e236b-cert podName:0ab852e1-fd26-4f76-b758-77896f8e236b nodeName:}" failed. 
No retries permitted until 2026-03-13 14:17:05.181001893 +0000 UTC m=+1260.182590132 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0ab852e1-fd26-4f76-b758-77896f8e236b-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" (UID: "0ab852e1-fd26-4f76-b758-77896f8e236b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.208284 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhfbv\" (UniqueName: \"kubernetes.io/projected/7b9c0413-5558-43c4-805b-7f035fded9b4-kube-api-access-zhfbv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-82gtc\" (UID: \"7b9c0413-5558-43c4-805b-7f035fded9b4\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-82gtc" Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.268164 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-82gtc" Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.587813 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-webhook-certs\") pod \"openstack-operator-controller-manager-5f7dc44db6-9nsrh\" (UID: \"3a26728d-85c2-465c-bce4-c74045ea9e0d\") " pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:04 crc kubenswrapper[4898]: E0313 14:17:04.588212 4898 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.588259 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-metrics-certs\") pod 
\"openstack-operator-controller-manager-5f7dc44db6-9nsrh\" (UID: \"3a26728d-85c2-465c-bce4-c74045ea9e0d\") " pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:04 crc kubenswrapper[4898]: E0313 14:17:04.588332 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-webhook-certs podName:3a26728d-85c2-465c-bce4-c74045ea9e0d nodeName:}" failed. No retries permitted until 2026-03-13 14:17:05.588304317 +0000 UTC m=+1260.589892556 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-webhook-certs") pod "openstack-operator-controller-manager-5f7dc44db6-9nsrh" (UID: "3a26728d-85c2-465c-bce4-c74045ea9e0d") : secret "webhook-server-cert" not found Mar 13 14:17:04 crc kubenswrapper[4898]: E0313 14:17:04.588391 4898 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 14:17:04 crc kubenswrapper[4898]: E0313 14:17:04.588462 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-metrics-certs podName:3a26728d-85c2-465c-bce4-c74045ea9e0d nodeName:}" failed. No retries permitted until 2026-03-13 14:17:05.58844598 +0000 UTC m=+1260.590034319 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-metrics-certs") pod "openstack-operator-controller-manager-5f7dc44db6-9nsrh" (UID: "3a26728d-85c2-465c-bce4-c74045ea9e0d") : secret "metrics-server-cert" not found Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.690823 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c35de09d-7f21-47d3-aac5-a26b15b0a496-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-8kcsw\" (UID: \"c35de09d-7f21-47d3-aac5-a26b15b0a496\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw" Mar 13 14:17:04 crc kubenswrapper[4898]: E0313 14:17:04.691015 4898 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 14:17:04 crc kubenswrapper[4898]: E0313 14:17:04.691099 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c35de09d-7f21-47d3-aac5-a26b15b0a496-cert podName:c35de09d-7f21-47d3-aac5-a26b15b0a496 nodeName:}" failed. No retries permitted until 2026-03-13 14:17:06.691078577 +0000 UTC m=+1261.692666816 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c35de09d-7f21-47d3-aac5-a26b15b0a496-cert") pod "infra-operator-controller-manager-54dc5b8f8d-8kcsw" (UID: "c35de09d-7f21-47d3-aac5-a26b15b0a496") : secret "infra-operator-webhook-server-cert" not found Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.692472 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-mf8h6"] Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.732734 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-tqp4b"] Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.745029 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl"] Mar 13 14:17:04 crc kubenswrapper[4898]: W0313 14:17:04.750237 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45efd8ce_26db_4511_bd88_2e7467d02bbb.slice/crio-728f77daeb1e6f5737552fe481b2f6f36d274a3ad518796f0d7bf774913f038d WatchSource:0}: Error finding container 728f77daeb1e6f5737552fe481b2f6f36d274a3ad518796f0d7bf774913f038d: Status 404 returned error can't find the container with id 728f77daeb1e6f5737552fe481b2f6f36d274a3ad518796f0d7bf774913f038d Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.753830 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5bc894d9b-v99bm"] Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.760059 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-d47688694-gtlps"] Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.821516 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mf8h6" event={"ID":"fb7b2f97-fca8-41d2-9be7-d40fac94c171","Type":"ContainerStarted","Data":"14b49b36578e5baad4dcc89f37c936b6990ccc3cee155f4075ee27ce85de2cef"} Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.822594 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-4n5rx" event={"ID":"3c955ebc-98fd-4921-9923-6151a50e8eec","Type":"ContainerStarted","Data":"c7cc7671bd8023d7055244f8253f5d7396c3c1891269cc38baa7d6692d48b5a3"} Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.823424 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-tqp4b" event={"ID":"ea0ad033-9a48-4e42-a237-f27cacf03adc","Type":"ContainerStarted","Data":"9d98915ef6bf8115270995879993f6b49b8ed969160205e354e2840e366f0fa7"} Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.829741 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-v99bm" event={"ID":"32b5ebfd-38d9-456e-bb21-7332323239d1","Type":"ContainerStarted","Data":"b89241a610af52b17b7373b99bdbd766d6b800dcf44a04d41a8ec184f323a9e8"} Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.830856 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-p9d5v" event={"ID":"0d88a5d2-a852-409e-b4bd-939d1c2b9090","Type":"ContainerStarted","Data":"793122426ae30c4ff049dfb616beca97752fa5b76927a883ab6fcfb0469dcf21"} Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.833095 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl" event={"ID":"a80d01d5-0201-4b2e-974c-ac5b42ac8df4","Type":"ContainerStarted","Data":"0f4f488da805040bcc6927802f79c7733f3f68ce78f0f8dfc9c29899f39de99c"} Mar 13 14:17:04 crc 
kubenswrapper[4898]: I0313 14:17:04.837167 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-d47688694-gtlps" event={"ID":"45efd8ce-26db-4511-bd88-2e7467d02bbb","Type":"ContainerStarted","Data":"728f77daeb1e6f5737552fe481b2f6f36d274a3ad518796f0d7bf774913f038d"} Mar 13 14:17:04 crc kubenswrapper[4898]: I0313 14:17:04.905842 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc"] Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.056342 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-ntlw6"] Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.064671 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-s5zh6"] Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.204436 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ab852e1-fd26-4f76-b758-77896f8e236b-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv\" (UID: \"0ab852e1-fd26-4f76-b758-77896f8e236b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" Mar 13 14:17:05 crc kubenswrapper[4898]: E0313 14:17:05.204702 4898 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 14:17:05 crc kubenswrapper[4898]: E0313 14:17:05.204873 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ab852e1-fd26-4f76-b758-77896f8e236b-cert podName:0ab852e1-fd26-4f76-b758-77896f8e236b nodeName:}" failed. No retries permitted until 2026-03-13 14:17:07.204857859 +0000 UTC m=+1262.206446098 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0ab852e1-fd26-4f76-b758-77896f8e236b-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" (UID: "0ab852e1-fd26-4f76-b758-77896f8e236b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.583124 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-57b484b4df-z2gd2"] Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.599211 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-s2k96"] Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.613566 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-metrics-certs\") pod \"openstack-operator-controller-manager-5f7dc44db6-9nsrh\" (UID: \"3a26728d-85c2-465c-bce4-c74045ea9e0d\") " pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.613656 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-webhook-certs\") pod \"openstack-operator-controller-manager-5f7dc44db6-9nsrh\" (UID: \"3a26728d-85c2-465c-bce4-c74045ea9e0d\") " pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:05 crc kubenswrapper[4898]: E0313 14:17:05.613881 4898 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 14:17:05 crc kubenswrapper[4898]: E0313 14:17:05.613948 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-webhook-certs 
podName:3a26728d-85c2-465c-bce4-c74045ea9e0d nodeName:}" failed. No retries permitted until 2026-03-13 14:17:07.613932979 +0000 UTC m=+1262.615521208 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-webhook-certs") pod "openstack-operator-controller-manager-5f7dc44db6-9nsrh" (UID: "3a26728d-85c2-465c-bce4-c74045ea9e0d") : secret "webhook-server-cert" not found Mar 13 14:17:05 crc kubenswrapper[4898]: E0313 14:17:05.614244 4898 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 14:17:05 crc kubenswrapper[4898]: E0313 14:17:05.614273 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-metrics-certs podName:3a26728d-85c2-465c-bce4-c74045ea9e0d nodeName:}" failed. No retries permitted until 2026-03-13 14:17:07.614266298 +0000 UTC m=+1262.615854537 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-metrics-certs") pod "openstack-operator-controller-manager-5f7dc44db6-9nsrh" (UID: "3a26728d-85c2-465c-bce4-c74045ea9e0d") : secret "metrics-server-cert" not found Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.614846 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7f84474648-mr4wv"] Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.625477 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-njsvh"] Mar 13 14:17:05 crc kubenswrapper[4898]: W0313 14:17:05.638591 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda3795a7_363f_4637_afe2_77cb77248f9a.slice/crio-ca4ecd792b4282a6e252f8ba184b93785f2c8932c7847b41ed1d68d038d89480 WatchSource:0}: Error finding container ca4ecd792b4282a6e252f8ba184b93785f2c8932c7847b41ed1d68d038d89480: Status 404 returned error can't find the container with id ca4ecd792b4282a6e252f8ba184b93785f2c8932c7847b41ed1d68d038d89480 Mar 13 14:17:05 crc kubenswrapper[4898]: W0313 14:17:05.642810 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d7c657b_a701_41fe_9b23_d5bba3302c4f.slice/crio-437e96d1fa812859f5d447beb3509139da83bfb5b6ddd54dc43c69a3b2c8f4e6 WatchSource:0}: Error finding container 437e96d1fa812859f5d447beb3509139da83bfb5b6ddd54dc43c69a3b2c8f4e6: Status 404 returned error can't find the container with id 437e96d1fa812859f5d447beb3509139da83bfb5b6ddd54dc43c69a3b2c8f4e6 Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.650638 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wdmrh"] Mar 13 14:17:05 crc 
kubenswrapper[4898]: E0313 14:17:05.665237 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xvjvw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-jwrd2_openstack-operators(919747b8-a031-4654-999f-3c3928f981b4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 14:17:05 crc kubenswrapper[4898]: E0313 14:17:05.666854 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-jwrd2" podUID="919747b8-a031-4654-999f-3c3928f981b4" Mar 13 14:17:05 crc kubenswrapper[4898]: E0313 14:17:05.668871 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-52gjb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5f4f55cb5c-s2rdh_openstack-operators(d29ce3ee-3d5a-4801-abf9-dfef5b641a74): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 14:17:05 crc kubenswrapper[4898]: E0313 14:17:05.670362 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" podUID="d29ce3ee-3d5a-4801-abf9-dfef5b641a74" Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.671698 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt"] Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.705111 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-7f9cc5dd44-f2t6t"] Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.726294 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh"] Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.738909 4898 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-jwrd2"] Mar 13 14:17:05 crc kubenswrapper[4898]: E0313 14:17:05.756173 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zhfbv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-82gtc_openstack-operators(7b9c0413-5558-43c4-805b-7f035fded9b4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 14:17:05 crc kubenswrapper[4898]: E0313 14:17:05.757391 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-82gtc" podUID="7b9c0413-5558-43c4-805b-7f035fded9b4" Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.771945 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-82gtc"] Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.864693 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-s5zh6" event={"ID":"d24bb749-0b71-456b-80e4-fdf6dd23ba30","Type":"ContainerStarted","Data":"52f3b126ec515a3b99aa74016189fe85b1a3830ec642b33f3af093d4ce2d1dd0"} Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 
14:17:05.867903 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-njsvh" event={"ID":"0d7c657b-a701-41fe-9b23-d5bba3302c4f","Type":"ContainerStarted","Data":"437e96d1fa812859f5d447beb3509139da83bfb5b6ddd54dc43c69a3b2c8f4e6"} Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.874960 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt" event={"ID":"19a0f4de-5258-4f2b-9587-71293459378e","Type":"ContainerStarted","Data":"2485923817c4aa5b2796763c3dead67f36e78a34bcf659b5e705d6ec6d42c8a9"} Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.876246 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-ntlw6" event={"ID":"d71982c0-a3d0-4da8-84cd-7494301f589f","Type":"ContainerStarted","Data":"798dad54b98acf66ff986a32d06d487085a6511f7ca8d4ffa80ba98da7f2b774"} Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.878472 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-jwrd2" event={"ID":"919747b8-a031-4654-999f-3c3928f981b4","Type":"ContainerStarted","Data":"4a2fc19a56feb3383ce71da15b91814fc4fb03e0ff94318935a8623ac8f716c3"} Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.879298 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7f84474648-mr4wv" event={"ID":"52959483-daae-423a-a3bf-8e3fa7810074","Type":"ContainerStarted","Data":"0501e5bb2341712296e0e6d55c1011023e3dae7005e2b71f056e627bf94f85d8"} Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.880236 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-s2k96" 
event={"ID":"9ff6f89a-7110-42fb-96b9-8611f280bebe","Type":"ContainerStarted","Data":"dcf23a3a877f3bc4fa451bfcbbcab4b79c44e506a6525ba6b9de798d32828221"} Mar 13 14:17:05 crc kubenswrapper[4898]: E0313 14:17:05.880512 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-jwrd2" podUID="919747b8-a031-4654-999f-3c3928f981b4" Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.882461 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wdmrh" event={"ID":"da3795a7-363f-4637-afe2-77cb77248f9a","Type":"ContainerStarted","Data":"ca4ecd792b4282a6e252f8ba184b93785f2c8932c7847b41ed1d68d038d89480"} Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.892971 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" event={"ID":"d29ce3ee-3d5a-4801-abf9-dfef5b641a74","Type":"ContainerStarted","Data":"61ee13192b8406ac5d2138d04ae263f68dc8e87d024f1c72264b4c698a929098"} Mar 13 14:17:05 crc kubenswrapper[4898]: E0313 14:17:05.899161 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" podUID="d29ce3ee-3d5a-4801-abf9-dfef5b641a74" Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.907310 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc" 
event={"ID":"ba56f415-73d5-4301-a25d-0e5d1ba4e3b1","Type":"ContainerStarted","Data":"1eb6a5434eeb176850a728d7ccb5a75898f2d1470a7a23104537baefca873452"} Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.914679 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-z2gd2" event={"ID":"1df4a7d6-b0c2-4b00-b591-1a612bd319b6","Type":"ContainerStarted","Data":"68617652a567261ca08445a27c0e0b16c6c3ae01dec4be2a59e6f76dd4ba045b"} Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.933383 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-f2t6t" event={"ID":"66a86c31-9ff3-439a-a0f8-96c981014b6f","Type":"ContainerStarted","Data":"b80f998b1054ca04bf2753585897e957674d0da0803a2bdb9e0363c50019795b"} Mar 13 14:17:05 crc kubenswrapper[4898]: I0313 14:17:05.945446 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-82gtc" event={"ID":"7b9c0413-5558-43c4-805b-7f035fded9b4","Type":"ContainerStarted","Data":"31208be09479c4b377365ac6f5a4e48f629b1c363f99d79e2adea0deb4658260"} Mar 13 14:17:05 crc kubenswrapper[4898]: E0313 14:17:05.953974 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-82gtc" podUID="7b9c0413-5558-43c4-805b-7f035fded9b4" Mar 13 14:17:06 crc kubenswrapper[4898]: I0313 14:17:06.750864 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c35de09d-7f21-47d3-aac5-a26b15b0a496-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-8kcsw\" (UID: 
\"c35de09d-7f21-47d3-aac5-a26b15b0a496\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw" Mar 13 14:17:06 crc kubenswrapper[4898]: E0313 14:17:06.751039 4898 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 14:17:06 crc kubenswrapper[4898]: E0313 14:17:06.751113 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c35de09d-7f21-47d3-aac5-a26b15b0a496-cert podName:c35de09d-7f21-47d3-aac5-a26b15b0a496 nodeName:}" failed. No retries permitted until 2026-03-13 14:17:10.75109632 +0000 UTC m=+1265.752684559 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c35de09d-7f21-47d3-aac5-a26b15b0a496-cert") pod "infra-operator-controller-manager-54dc5b8f8d-8kcsw" (UID: "c35de09d-7f21-47d3-aac5-a26b15b0a496") : secret "infra-operator-webhook-server-cert" not found Mar 13 14:17:06 crc kubenswrapper[4898]: E0313 14:17:06.961458 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-82gtc" podUID="7b9c0413-5558-43c4-805b-7f035fded9b4" Mar 13 14:17:06 crc kubenswrapper[4898]: E0313 14:17:06.961470 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" podUID="d29ce3ee-3d5a-4801-abf9-dfef5b641a74" Mar 13 14:17:06 crc kubenswrapper[4898]: E0313 
14:17:06.961548 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-jwrd2" podUID="919747b8-a031-4654-999f-3c3928f981b4" Mar 13 14:17:07 crc kubenswrapper[4898]: I0313 14:17:07.260272 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ab852e1-fd26-4f76-b758-77896f8e236b-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv\" (UID: \"0ab852e1-fd26-4f76-b758-77896f8e236b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" Mar 13 14:17:07 crc kubenswrapper[4898]: E0313 14:17:07.260648 4898 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 14:17:07 crc kubenswrapper[4898]: E0313 14:17:07.260774 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ab852e1-fd26-4f76-b758-77896f8e236b-cert podName:0ab852e1-fd26-4f76-b758-77896f8e236b nodeName:}" failed. No retries permitted until 2026-03-13 14:17:11.260749014 +0000 UTC m=+1266.262337273 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0ab852e1-fd26-4f76-b758-77896f8e236b-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" (UID: "0ab852e1-fd26-4f76-b758-77896f8e236b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 14:17:07 crc kubenswrapper[4898]: I0313 14:17:07.668570 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-metrics-certs\") pod \"openstack-operator-controller-manager-5f7dc44db6-9nsrh\" (UID: \"3a26728d-85c2-465c-bce4-c74045ea9e0d\") " pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:07 crc kubenswrapper[4898]: I0313 14:17:07.668652 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-webhook-certs\") pod \"openstack-operator-controller-manager-5f7dc44db6-9nsrh\" (UID: \"3a26728d-85c2-465c-bce4-c74045ea9e0d\") " pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:07 crc kubenswrapper[4898]: E0313 14:17:07.668738 4898 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 14:17:07 crc kubenswrapper[4898]: E0313 14:17:07.668828 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-metrics-certs podName:3a26728d-85c2-465c-bce4-c74045ea9e0d nodeName:}" failed. No retries permitted until 2026-03-13 14:17:11.668810288 +0000 UTC m=+1266.670398527 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-metrics-certs") pod "openstack-operator-controller-manager-5f7dc44db6-9nsrh" (UID: "3a26728d-85c2-465c-bce4-c74045ea9e0d") : secret "metrics-server-cert" not found Mar 13 14:17:07 crc kubenswrapper[4898]: E0313 14:17:07.668982 4898 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 14:17:07 crc kubenswrapper[4898]: E0313 14:17:07.669063 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-webhook-certs podName:3a26728d-85c2-465c-bce4-c74045ea9e0d nodeName:}" failed. No retries permitted until 2026-03-13 14:17:11.669044914 +0000 UTC m=+1266.670633153 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-webhook-certs") pod "openstack-operator-controller-manager-5f7dc44db6-9nsrh" (UID: "3a26728d-85c2-465c-bce4-c74045ea9e0d") : secret "webhook-server-cert" not found Mar 13 14:17:10 crc kubenswrapper[4898]: I0313 14:17:10.825593 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c35de09d-7f21-47d3-aac5-a26b15b0a496-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-8kcsw\" (UID: \"c35de09d-7f21-47d3-aac5-a26b15b0a496\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw" Mar 13 14:17:10 crc kubenswrapper[4898]: E0313 14:17:10.825772 4898 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 14:17:10 crc kubenswrapper[4898]: E0313 14:17:10.826040 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c35de09d-7f21-47d3-aac5-a26b15b0a496-cert 
podName:c35de09d-7f21-47d3-aac5-a26b15b0a496 nodeName:}" failed. No retries permitted until 2026-03-13 14:17:18.82601645 +0000 UTC m=+1273.827604699 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c35de09d-7f21-47d3-aac5-a26b15b0a496-cert") pod "infra-operator-controller-manager-54dc5b8f8d-8kcsw" (UID: "c35de09d-7f21-47d3-aac5-a26b15b0a496") : secret "infra-operator-webhook-server-cert" not found Mar 13 14:17:11 crc kubenswrapper[4898]: I0313 14:17:11.347936 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ab852e1-fd26-4f76-b758-77896f8e236b-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv\" (UID: \"0ab852e1-fd26-4f76-b758-77896f8e236b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" Mar 13 14:17:11 crc kubenswrapper[4898]: E0313 14:17:11.348107 4898 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 14:17:11 crc kubenswrapper[4898]: E0313 14:17:11.348179 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ab852e1-fd26-4f76-b758-77896f8e236b-cert podName:0ab852e1-fd26-4f76-b758-77896f8e236b nodeName:}" failed. No retries permitted until 2026-03-13 14:17:19.34816192 +0000 UTC m=+1274.349750149 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0ab852e1-fd26-4f76-b758-77896f8e236b-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" (UID: "0ab852e1-fd26-4f76-b758-77896f8e236b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 14:17:11 crc kubenswrapper[4898]: I0313 14:17:11.755970 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-metrics-certs\") pod \"openstack-operator-controller-manager-5f7dc44db6-9nsrh\" (UID: \"3a26728d-85c2-465c-bce4-c74045ea9e0d\") " pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:11 crc kubenswrapper[4898]: I0313 14:17:11.756308 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-webhook-certs\") pod \"openstack-operator-controller-manager-5f7dc44db6-9nsrh\" (UID: \"3a26728d-85c2-465c-bce4-c74045ea9e0d\") " pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:11 crc kubenswrapper[4898]: E0313 14:17:11.756120 4898 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 14:17:11 crc kubenswrapper[4898]: E0313 14:17:11.756626 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-metrics-certs podName:3a26728d-85c2-465c-bce4-c74045ea9e0d nodeName:}" failed. No retries permitted until 2026-03-13 14:17:19.756600314 +0000 UTC m=+1274.758188593 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-metrics-certs") pod "openstack-operator-controller-manager-5f7dc44db6-9nsrh" (UID: "3a26728d-85c2-465c-bce4-c74045ea9e0d") : secret "metrics-server-cert" not found Mar 13 14:17:11 crc kubenswrapper[4898]: E0313 14:17:11.756429 4898 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 14:17:11 crc kubenswrapper[4898]: E0313 14:17:11.757042 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-webhook-certs podName:3a26728d-85c2-465c-bce4-c74045ea9e0d nodeName:}" failed. No retries permitted until 2026-03-13 14:17:19.757025285 +0000 UTC m=+1274.758613544 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-webhook-certs") pod "openstack-operator-controller-manager-5f7dc44db6-9nsrh" (UID: "3a26728d-85c2-465c-bce4-c74045ea9e0d") : secret "webhook-server-cert" not found Mar 13 14:17:18 crc kubenswrapper[4898]: I0313 14:17:18.902441 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c35de09d-7f21-47d3-aac5-a26b15b0a496-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-8kcsw\" (UID: \"c35de09d-7f21-47d3-aac5-a26b15b0a496\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw" Mar 13 14:17:18 crc kubenswrapper[4898]: E0313 14:17:18.903079 4898 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 14:17:18 crc kubenswrapper[4898]: E0313 14:17:18.904151 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c35de09d-7f21-47d3-aac5-a26b15b0a496-cert 
podName:c35de09d-7f21-47d3-aac5-a26b15b0a496 nodeName:}" failed. No retries permitted until 2026-03-13 14:17:34.904128249 +0000 UTC m=+1289.905716488 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c35de09d-7f21-47d3-aac5-a26b15b0a496-cert") pod "infra-operator-controller-manager-54dc5b8f8d-8kcsw" (UID: "c35de09d-7f21-47d3-aac5-a26b15b0a496") : secret "infra-operator-webhook-server-cert" not found Mar 13 14:17:19 crc kubenswrapper[4898]: I0313 14:17:19.413512 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ab852e1-fd26-4f76-b758-77896f8e236b-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv\" (UID: \"0ab852e1-fd26-4f76-b758-77896f8e236b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" Mar 13 14:17:19 crc kubenswrapper[4898]: I0313 14:17:19.418858 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ab852e1-fd26-4f76-b758-77896f8e236b-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv\" (UID: \"0ab852e1-fd26-4f76-b758-77896f8e236b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" Mar 13 14:17:19 crc kubenswrapper[4898]: I0313 14:17:19.519649 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-x5ln5" Mar 13 14:17:19 crc kubenswrapper[4898]: I0313 14:17:19.527684 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" Mar 13 14:17:19 crc kubenswrapper[4898]: E0313 14:17:19.715017 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:d9bffb59bb7f9f0a6cb103c3986fd2c1bdb13ce6349c39427a690858cbd754d6" Mar 13 14:17:19 crc kubenswrapper[4898]: E0313 14:17:19.715341 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:d9bffb59bb7f9f0a6cb103c3986fd2c1bdb13ce6349c39427a690858cbd754d6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jbcwc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-6d9d6b584d-jngrl_openstack-operators(a80d01d5-0201-4b2e-974c-ac5b42ac8df4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:17:19 crc kubenswrapper[4898]: E0313 14:17:19.716493 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl" podUID="a80d01d5-0201-4b2e-974c-ac5b42ac8df4" Mar 13 14:17:19 crc kubenswrapper[4898]: I0313 14:17:19.821222 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-metrics-certs\") pod 
\"openstack-operator-controller-manager-5f7dc44db6-9nsrh\" (UID: \"3a26728d-85c2-465c-bce4-c74045ea9e0d\") " pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:19 crc kubenswrapper[4898]: I0313 14:17:19.821475 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-webhook-certs\") pod \"openstack-operator-controller-manager-5f7dc44db6-9nsrh\" (UID: \"3a26728d-85c2-465c-bce4-c74045ea9e0d\") " pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:19 crc kubenswrapper[4898]: E0313 14:17:19.821749 4898 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 14:17:19 crc kubenswrapper[4898]: E0313 14:17:19.821881 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-webhook-certs podName:3a26728d-85c2-465c-bce4-c74045ea9e0d nodeName:}" failed. No retries permitted until 2026-03-13 14:17:35.821863977 +0000 UTC m=+1290.823452226 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-webhook-certs") pod "openstack-operator-controller-manager-5f7dc44db6-9nsrh" (UID: "3a26728d-85c2-465c-bce4-c74045ea9e0d") : secret "webhook-server-cert" not found Mar 13 14:17:19 crc kubenswrapper[4898]: I0313 14:17:19.829821 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-metrics-certs\") pod \"openstack-operator-controller-manager-5f7dc44db6-9nsrh\" (UID: \"3a26728d-85c2-465c-bce4-c74045ea9e0d\") " pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:20 crc kubenswrapper[4898]: E0313 14:17:20.109527 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:d9bffb59bb7f9f0a6cb103c3986fd2c1bdb13ce6349c39427a690858cbd754d6\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl" podUID="a80d01d5-0201-4b2e-974c-ac5b42ac8df4" Mar 13 14:17:20 crc kubenswrapper[4898]: E0313 14:17:20.652862 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42" Mar 13 14:17:20 crc kubenswrapper[4898]: E0313 14:17:20.653349 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rz8wp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-smdkt_openstack-operators(19a0f4de-5258-4f2b-9587-71293459378e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:17:20 crc kubenswrapper[4898]: E0313 14:17:20.654518 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt" podUID="19a0f4de-5258-4f2b-9587-71293459378e" Mar 13 14:17:21 crc kubenswrapper[4898]: E0313 14:17:21.120785 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt" podUID="19a0f4de-5258-4f2b-9587-71293459378e" Mar 13 14:17:23 crc kubenswrapper[4898]: E0313 14:17:23.404271 4898 log.go:32] "PullImage from image service failed" err="rpc 
error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:7c0da25380c91ffd1940d75eaa71b6842a6a4cf4056e62d6b0d237897b74e4d9" Mar 13 14:17:23 crc kubenswrapper[4898]: E0313 14:17:23.404834 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:7c0da25380c91ffd1940d75eaa71b6842a6a4cf4056e62d6b0d237897b74e4d9,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4l2jg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-984cd4dcf-p9d5v_openstack-operators(0d88a5d2-a852-409e-b4bd-939d1c2b9090): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:17:23 crc kubenswrapper[4898]: E0313 14:17:23.406135 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-p9d5v" podUID="0d88a5d2-a852-409e-b4bd-939d1c2b9090" Mar 13 14:17:24 crc kubenswrapper[4898]: E0313 14:17:24.148757 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:7c0da25380c91ffd1940d75eaa71b6842a6a4cf4056e62d6b0d237897b74e4d9\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-p9d5v" podUID="0d88a5d2-a852-409e-b4bd-939d1c2b9090" Mar 13 14:17:24 crc kubenswrapper[4898]: E0313 14:17:24.246581 4898 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:a26d062af19b3bc6dc6633171f1eff8eec33e8e925465d4968a0b9a36012a7e7" Mar 13 14:17:24 crc kubenswrapper[4898]: E0313 14:17:24.246800 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:a26d062af19b3bc6dc6633171f1eff8eec33e8e925465d4968a0b9a36012a7e7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q57rr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc_openstack-operators(ba56f415-73d5-4301-a25d-0e5d1ba4e3b1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:17:24 crc kubenswrapper[4898]: E0313 14:17:24.248038 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc" podUID="ba56f415-73d5-4301-a25d-0e5d1ba4e3b1" Mar 13 14:17:24 crc kubenswrapper[4898]: E0313 14:17:24.815076 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f" Mar 13 14:17:24 crc kubenswrapper[4898]: E0313 14:17:24.817160 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qb2ck,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-bbc5b68f9-wdmrh_openstack-operators(da3795a7-363f-4637-afe2-77cb77248f9a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:17:24 crc kubenswrapper[4898]: E0313 14:17:24.818412 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wdmrh" podUID="da3795a7-363f-4637-afe2-77cb77248f9a" Mar 13 14:17:25 crc kubenswrapper[4898]: E0313 14:17:25.157810 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wdmrh" podUID="da3795a7-363f-4637-afe2-77cb77248f9a" Mar 13 14:17:25 crc kubenswrapper[4898]: E0313 14:17:25.157881 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:a26d062af19b3bc6dc6633171f1eff8eec33e8e925465d4968a0b9a36012a7e7\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc" podUID="ba56f415-73d5-4301-a25d-0e5d1ba4e3b1" Mar 13 14:17:29 crc kubenswrapper[4898]: E0313 14:17:29.657395 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:6c9aef12f50be0b974f5e35b0d69303e7f7b95e6db5d41bcdb2d9d1100e921a6" Mar 13 14:17:29 crc kubenswrapper[4898]: E0313 14:17:29.659042 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:6c9aef12f50be0b974f5e35b0d69303e7f7b95e6db5d41bcdb2d9d1100e921a6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8xlhk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-77b6666d85-tqp4b_openstack-operators(ea0ad033-9a48-4e42-a237-f27cacf03adc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:17:29 crc kubenswrapper[4898]: E0313 14:17:29.660367 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-tqp4b" podUID="ea0ad033-9a48-4e42-a237-f27cacf03adc" Mar 13 14:17:30 crc kubenswrapper[4898]: E0313 14:17:30.175003 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="38.102.83.36:5001/openstack-k8s-operators/telemetry-operator:8ccfcdb23140e93b912bb23903a7d6fafb754e30" Mar 13 14:17:30 crc kubenswrapper[4898]: E0313 14:17:30.175369 4898 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.36:5001/openstack-k8s-operators/telemetry-operator:8ccfcdb23140e93b912bb23903a7d6fafb754e30" Mar 13 14:17:30 crc kubenswrapper[4898]: E0313 14:17:30.175528 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.36:5001/openstack-k8s-operators/telemetry-operator:8ccfcdb23140e93b912bb23903a7d6fafb754e30,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ggn24,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-5b9fbd87f-s2k96_openstack-operators(9ff6f89a-7110-42fb-96b9-8611f280bebe): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:17:30 crc kubenswrapper[4898]: E0313 14:17:30.176801 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-s2k96" podUID="9ff6f89a-7110-42fb-96b9-8611f280bebe" Mar 13 14:17:30 crc kubenswrapper[4898]: E0313 14:17:30.203892 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:6c9aef12f50be0b974f5e35b0d69303e7f7b95e6db5d41bcdb2d9d1100e921a6\\\"\"" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-tqp4b" podUID="ea0ad033-9a48-4e42-a237-f27cacf03adc" Mar 13 14:17:30 crc kubenswrapper[4898]: E0313 14:17:30.204343 4898 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.36:5001/openstack-k8s-operators/telemetry-operator:8ccfcdb23140e93b912bb23903a7d6fafb754e30\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-s2k96" podUID="9ff6f89a-7110-42fb-96b9-8611f280bebe" Mar 13 14:17:31 crc kubenswrapper[4898]: E0313 14:17:31.834811 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:47dae162826e2e457bdc34f6dfebcf8f7d56e189fdbeba2e0118991a420a4165" Mar 13 14:17:31 crc kubenswrapper[4898]: E0313 14:17:31.837193 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:47dae162826e2e457bdc34f6dfebcf8f7d56e189fdbeba2e0118991a420a4165,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xm7pb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-d47688694-gtlps_openstack-operators(45efd8ce-26db-4511-bd88-2e7467d02bbb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:17:31 crc kubenswrapper[4898]: E0313 14:17:31.839742 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-d47688694-gtlps" podUID="45efd8ce-26db-4511-bd88-2e7467d02bbb" Mar 13 14:17:32 crc kubenswrapper[4898]: E0313 14:17:32.219976 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:47dae162826e2e457bdc34f6dfebcf8f7d56e189fdbeba2e0118991a420a4165\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-d47688694-gtlps" podUID="45efd8ce-26db-4511-bd88-2e7467d02bbb" Mar 13 14:17:32 crc kubenswrapper[4898]: E0313 14:17:32.338386 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:dd62e104225ea255af5a32828af4c21e1dfb50fbdf35cd41d07d1326f9017a40" Mar 13 14:17:32 crc kubenswrapper[4898]: E0313 14:17:32.338596 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:dd62e104225ea255af5a32828af4c21e1dfb50fbdf35cd41d07d1326f9017a40,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wnvxx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-57b484b4df-z2gd2_openstack-operators(1df4a7d6-b0c2-4b00-b591-1a612bd319b6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:17:32 crc kubenswrapper[4898]: E0313 14:17:32.339846 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-z2gd2" podUID="1df4a7d6-b0c2-4b00-b591-1a612bd319b6" Mar 13 14:17:33 crc kubenswrapper[4898]: E0313 14:17:33.233729 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/manila-operator@sha256:dd62e104225ea255af5a32828af4c21e1dfb50fbdf35cd41d07d1326f9017a40\\\"\"" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-z2gd2" podUID="1df4a7d6-b0c2-4b00-b591-1a612bd319b6" Mar 13 14:17:34 crc kubenswrapper[4898]: E0313 14:17:34.258286 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:72db77c98e7bca64d469b4dc316e9c8d329681f825d19ef8f333437fb1c6d3f5" Mar 13 14:17:34 crc kubenswrapper[4898]: E0313 14:17:34.258865 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:72db77c98e7bca64d469b4dc316e9c8d329681f825d19ef8f333437fb1c6d3f5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kbspp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-7f9cc5dd44-f2t6t_openstack-operators(66a86c31-9ff3-439a-a0f8-96c981014b6f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:17:34 crc kubenswrapper[4898]: E0313 14:17:34.260093 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-f2t6t" podUID="66a86c31-9ff3-439a-a0f8-96c981014b6f" Mar 13 14:17:34 crc kubenswrapper[4898]: E0313 14:17:34.763859 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978" Mar 13 14:17:34 crc kubenswrapper[4898]: E0313 14:17:34.764081 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dkdwz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-574d45c66c-njsvh_openstack-operators(0d7c657b-a701-41fe-9b23-d5bba3302c4f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:17:34 crc kubenswrapper[4898]: E0313 14:17:34.766083 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-njsvh" podUID="0d7c657b-a701-41fe-9b23-d5bba3302c4f" Mar 13 14:17:34 crc kubenswrapper[4898]: I0313 14:17:34.922287 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c35de09d-7f21-47d3-aac5-a26b15b0a496-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-8kcsw\" (UID: \"c35de09d-7f21-47d3-aac5-a26b15b0a496\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw" Mar 13 14:17:34 crc kubenswrapper[4898]: I0313 14:17:34.931911 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/c35de09d-7f21-47d3-aac5-a26b15b0a496-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-8kcsw\" (UID: \"c35de09d-7f21-47d3-aac5-a26b15b0a496\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw" Mar 13 14:17:35 crc kubenswrapper[4898]: I0313 14:17:35.086136 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-j4b7h" Mar 13 14:17:35 crc kubenswrapper[4898]: I0313 14:17:35.095391 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw" Mar 13 14:17:35 crc kubenswrapper[4898]: E0313 14:17:35.251433 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978\\\"\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-njsvh" podUID="0d7c657b-a701-41fe-9b23-d5bba3302c4f" Mar 13 14:17:35 crc kubenswrapper[4898]: E0313 14:17:35.252713 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:72db77c98e7bca64d469b4dc316e9c8d329681f825d19ef8f333437fb1c6d3f5\\\"\"" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-f2t6t" podUID="66a86c31-9ff3-439a-a0f8-96c981014b6f" Mar 13 14:17:35 crc kubenswrapper[4898]: E0313 14:17:35.350813 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:5fe5351a3de5e1267112d52cd81477a01d47f90be713cc5439c76543a4c33721" Mar 13 14:17:35 crc kubenswrapper[4898]: E0313 
14:17:35.351041 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:5fe5351a3de5e1267112d52cd81477a01d47f90be713cc5439c76543a4c33721,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-btrh6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-776c5696bf-ntlw6_openstack-operators(d71982c0-a3d0-4da8-84cd-7494301f589f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:17:35 crc kubenswrapper[4898]: E0313 14:17:35.352470 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-ntlw6" podUID="d71982c0-a3d0-4da8-84cd-7494301f589f" Mar 13 14:17:35 crc kubenswrapper[4898]: I0313 14:17:35.835344 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-webhook-certs\") pod \"openstack-operator-controller-manager-5f7dc44db6-9nsrh\" (UID: \"3a26728d-85c2-465c-bce4-c74045ea9e0d\") " pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:35 crc kubenswrapper[4898]: I0313 14:17:35.841553 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3a26728d-85c2-465c-bce4-c74045ea9e0d-webhook-certs\") pod \"openstack-operator-controller-manager-5f7dc44db6-9nsrh\" (UID: \"3a26728d-85c2-465c-bce4-c74045ea9e0d\") " pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:35 crc kubenswrapper[4898]: I0313 14:17:35.979460 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-nrmkf" Mar 13 14:17:35 crc kubenswrapper[4898]: I0313 14:17:35.987874 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:36 crc kubenswrapper[4898]: E0313 14:17:36.257848 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:5fe5351a3de5e1267112d52cd81477a01d47f90be713cc5439c76543a4c33721\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-ntlw6" podUID="d71982c0-a3d0-4da8-84cd-7494301f589f" Mar 13 14:17:38 crc kubenswrapper[4898]: E0313 14:17:38.281317 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca" Mar 13 14:17:38 crc kubenswrapper[4898]: E0313 14:17:38.281505 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-c59lv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-684f77d66d-s5zh6_openstack-operators(d24bb749-0b71-456b-80e4-fdf6dd23ba30): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:17:38 crc kubenswrapper[4898]: E0313 14:17:38.282834 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-s5zh6" podUID="d24bb749-0b71-456b-80e4-fdf6dd23ba30" Mar 13 14:17:39 crc kubenswrapper[4898]: E0313 14:17:39.285480 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-s5zh6" podUID="d24bb749-0b71-456b-80e4-fdf6dd23ba30" Mar 13 14:17:39 crc kubenswrapper[4898]: E0313 14:17:39.765760 4898 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807" Mar 13 14:17:39 crc kubenswrapper[4898]: E0313 14:17:39.765957 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xvjvw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-jwrd2_openstack-operators(919747b8-a031-4654-999f-3c3928f981b4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:17:39 crc kubenswrapper[4898]: E0313 14:17:39.767159 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-jwrd2" podUID="919747b8-a031-4654-999f-3c3928f981b4" Mar 13 14:17:41 crc kubenswrapper[4898]: E0313 14:17:41.851343 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571" Mar 13 14:17:41 crc kubenswrapper[4898]: E0313 14:17:41.852016 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-52gjb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5f4f55cb5c-s2rdh_openstack-operators(d29ce3ee-3d5a-4801-abf9-dfef5b641a74): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:17:41 crc kubenswrapper[4898]: E0313 14:17:41.853868 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" podUID="d29ce3ee-3d5a-4801-abf9-dfef5b641a74" Mar 13 14:17:44 crc kubenswrapper[4898]: E0313 14:17:44.014659 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:bbe772fa051f782c9dcc3c34ce43495e1116aa9089a760c10068790baa9b25ff" Mar 13 14:17:44 crc kubenswrapper[4898]: E0313 14:17:44.015132 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:bbe772fa051f782c9dcc3c34ce43495e1116aa9089a760c10068790baa9b25ff,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wbj2m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-7f84474648-mr4wv_openstack-operators(52959483-daae-423a-a3bf-8e3fa7810074): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:17:44 crc kubenswrapper[4898]: E0313 14:17:44.016479 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-7f84474648-mr4wv" podUID="52959483-daae-423a-a3bf-8e3fa7810074" Mar 13 14:17:44 crc kubenswrapper[4898]: E0313 14:17:44.335832 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:bbe772fa051f782c9dcc3c34ce43495e1116aa9089a760c10068790baa9b25ff\\\"\"" pod="openstack-operators/nova-operator-controller-manager-7f84474648-mr4wv" podUID="52959483-daae-423a-a3bf-8e3fa7810074" Mar 13 14:17:44 crc kubenswrapper[4898]: E0313 14:17:44.412330 4898 log.go:32] "PullImage from image service failed" err="rpc 
error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Mar 13 14:17:44 crc kubenswrapper[4898]: E0313 14:17:44.412512 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zhfbv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-82gtc_openstack-operators(7b9c0413-5558-43c4-805b-7f035fded9b4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:17:44 crc kubenswrapper[4898]: E0313 14:17:44.414120 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-82gtc" podUID="7b9c0413-5558-43c4-805b-7f035fded9b4" Mar 13 14:17:44 crc kubenswrapper[4898]: I0313 14:17:44.906659 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv"] Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.000570 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw"] Mar 13 14:17:45 crc kubenswrapper[4898]: W0313 14:17:45.011046 4898 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc35de09d_7f21_47d3_aac5_a26b15b0a496.slice/crio-f59d5a970d2c0db00459d702f37b694630a1f815ef58dbe10d228ce86e6ff471 WatchSource:0}: Error finding container f59d5a970d2c0db00459d702f37b694630a1f815ef58dbe10d228ce86e6ff471: Status 404 returned error can't find the container with id f59d5a970d2c0db00459d702f37b694630a1f815ef58dbe10d228ce86e6ff471 Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.043318 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh"] Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.329667 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-p9d5v" event={"ID":"0d88a5d2-a852-409e-b4bd-939d1c2b9090","Type":"ContainerStarted","Data":"9476f49672453c043034bf2b4fc2b059e64c5120f767076fe3286fc76332b438"} Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.330150 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-p9d5v" Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.332888 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw" event={"ID":"c35de09d-7f21-47d3-aac5-a26b15b0a496","Type":"ContainerStarted","Data":"f59d5a970d2c0db00459d702f37b694630a1f815ef58dbe10d228ce86e6ff471"} Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.334593 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-tqp4b" event={"ID":"ea0ad033-9a48-4e42-a237-f27cacf03adc","Type":"ContainerStarted","Data":"4ad4efd883403cf99f4ae05d9e5069d476891625501a47b507ecb25aaa59279a"} Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.335344 4898 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-tqp4b" Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.336557 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc" event={"ID":"ba56f415-73d5-4301-a25d-0e5d1ba4e3b1","Type":"ContainerStarted","Data":"7779ff9639d753fc0b8d37fac6f107546206f5932652761cf443e5906d099ade"} Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.336933 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc" Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.338063 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-v99bm" event={"ID":"32b5ebfd-38d9-456e-bb21-7332323239d1","Type":"ContainerStarted","Data":"cf80e2daa36139cd410b53ece19d163d97bf6b041aba2184079936d40fe56543"} Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.338445 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-v99bm" Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.339588 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-z2gd2" event={"ID":"1df4a7d6-b0c2-4b00-b591-1a612bd319b6","Type":"ContainerStarted","Data":"90c2cd03b5da2efe2b2140a0201f7ff9d1487c261ab13f06a8199c5a9e03c947"} Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.339974 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-z2gd2" Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.340867 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" 
event={"ID":"3a26728d-85c2-465c-bce4-c74045ea9e0d","Type":"ContainerStarted","Data":"3cc5304750c080a6debdf409cea68b446d3885f1b82989fc52ec8c5c354595f6"} Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.342031 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt" event={"ID":"19a0f4de-5258-4f2b-9587-71293459378e","Type":"ContainerStarted","Data":"490b131fcdcfc3c4a0f9eaefa6f19529f359253225ede8f7a9b1add8a23964b5"} Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.342419 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt" Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.344307 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl" event={"ID":"a80d01d5-0201-4b2e-974c-ac5b42ac8df4","Type":"ContainerStarted","Data":"d2bed079a6a5a2e515d93d89a9a982d254f51045181d7bcb6277083a331b61e4"} Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.344462 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl" Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.353958 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" event={"ID":"0ab852e1-fd26-4f76-b758-77896f8e236b","Type":"ContainerStarted","Data":"d5d0c797c988d2fbc23ad907fce7b101168cdaa8057bb8670fd809bd5d1557bd"} Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.357506 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mf8h6" event={"ID":"fb7b2f97-fca8-41d2-9be7-d40fac94c171","Type":"ContainerStarted","Data":"9cf8061003f7f4ee16b4b5dd5a9b44b3bd2079d20345c208aa37b8286b144c96"} Mar 13 14:17:45 crc 
kubenswrapper[4898]: I0313 14:17:45.357617 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mf8h6" Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.359029 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-4n5rx" event={"ID":"3c955ebc-98fd-4921-9923-6151a50e8eec","Type":"ContainerStarted","Data":"ac8819fe2b0b8e3b70c3aac9e58b33895d2a1b842f4a33d3cf71e845cfa1e0d7"} Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.359658 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-4n5rx" Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.365367 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wdmrh" event={"ID":"da3795a7-363f-4637-afe2-77cb77248f9a","Type":"ContainerStarted","Data":"39244065c035a66d2feea4e8ced4cdbb99d19f12dda86ff67f0baef554cd9c9b"} Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.366688 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wdmrh" Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.405947 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-p9d5v" podStartSLOduration=3.092980187 podStartE2EDuration="43.405929101s" podCreationTimestamp="2026-03-13 14:17:02 +0000 UTC" firstStartedPulling="2026-03-13 14:17:04.108291376 +0000 UTC m=+1259.109879615" lastFinishedPulling="2026-03-13 14:17:44.42124029 +0000 UTC m=+1299.422828529" observedRunningTime="2026-03-13 14:17:45.398556379 +0000 UTC m=+1300.400144638" watchObservedRunningTime="2026-03-13 14:17:45.405929101 +0000 UTC m=+1300.407517340" Mar 13 14:17:45 crc 
kubenswrapper[4898]: I0313 14:17:45.465900 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-z2gd2" podStartSLOduration=4.51132066 podStartE2EDuration="43.46587655s" podCreationTimestamp="2026-03-13 14:17:02 +0000 UTC" firstStartedPulling="2026-03-13 14:17:05.630521002 +0000 UTC m=+1260.632109241" lastFinishedPulling="2026-03-13 14:17:44.585076892 +0000 UTC m=+1299.586665131" observedRunningTime="2026-03-13 14:17:45.459655079 +0000 UTC m=+1300.461243338" watchObservedRunningTime="2026-03-13 14:17:45.46587655 +0000 UTC m=+1300.467464789" Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.515881 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mf8h6" podStartSLOduration=8.447947367 podStartE2EDuration="43.515862801s" podCreationTimestamp="2026-03-13 14:17:02 +0000 UTC" firstStartedPulling="2026-03-13 14:17:04.700442582 +0000 UTC m=+1259.702030821" lastFinishedPulling="2026-03-13 14:17:39.768358016 +0000 UTC m=+1294.769946255" observedRunningTime="2026-03-13 14:17:45.512552984 +0000 UTC m=+1300.514141243" watchObservedRunningTime="2026-03-13 14:17:45.515862801 +0000 UTC m=+1300.517451040" Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.584553 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc" podStartSLOduration=4.077195436 podStartE2EDuration="43.584532577s" podCreationTimestamp="2026-03-13 14:17:02 +0000 UTC" firstStartedPulling="2026-03-13 14:17:04.913816377 +0000 UTC m=+1259.915404616" lastFinishedPulling="2026-03-13 14:17:44.421153518 +0000 UTC m=+1299.422741757" observedRunningTime="2026-03-13 14:17:45.568747136 +0000 UTC m=+1300.570335395" watchObservedRunningTime="2026-03-13 14:17:45.584532577 +0000 UTC m=+1300.586120816" Mar 13 14:17:45 crc kubenswrapper[4898]: 
I0313 14:17:45.612358 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wdmrh" podStartSLOduration=4.84424075 podStartE2EDuration="43.61234252s" podCreationTimestamp="2026-03-13 14:17:02 +0000 UTC" firstStartedPulling="2026-03-13 14:17:05.640733718 +0000 UTC m=+1260.642321957" lastFinishedPulling="2026-03-13 14:17:44.408835488 +0000 UTC m=+1299.410423727" observedRunningTime="2026-03-13 14:17:45.608196422 +0000 UTC m=+1300.609784661" watchObservedRunningTime="2026-03-13 14:17:45.61234252 +0000 UTC m=+1300.613930749" Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.739623 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-4n5rx" podStartSLOduration=8.071543881 podStartE2EDuration="43.73960595s" podCreationTimestamp="2026-03-13 14:17:02 +0000 UTC" firstStartedPulling="2026-03-13 14:17:04.091304433 +0000 UTC m=+1259.092892672" lastFinishedPulling="2026-03-13 14:17:39.759366502 +0000 UTC m=+1294.760954741" observedRunningTime="2026-03-13 14:17:45.719362304 +0000 UTC m=+1300.720950553" watchObservedRunningTime="2026-03-13 14:17:45.73960595 +0000 UTC m=+1300.741194189" Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.742526 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl" podStartSLOduration=4.481504389 podStartE2EDuration="43.742515266s" podCreationTimestamp="2026-03-13 14:17:02 +0000 UTC" firstStartedPulling="2026-03-13 14:17:04.733650518 +0000 UTC m=+1259.735238757" lastFinishedPulling="2026-03-13 14:17:43.994661395 +0000 UTC m=+1298.996249634" observedRunningTime="2026-03-13 14:17:45.679503867 +0000 UTC m=+1300.681092116" watchObservedRunningTime="2026-03-13 14:17:45.742515266 +0000 UTC m=+1300.744103505" Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.770118 
4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-tqp4b" podStartSLOduration=3.856631913 podStartE2EDuration="43.770099693s" podCreationTimestamp="2026-03-13 14:17:02 +0000 UTC" firstStartedPulling="2026-03-13 14:17:04.704510828 +0000 UTC m=+1259.706099067" lastFinishedPulling="2026-03-13 14:17:44.617978608 +0000 UTC m=+1299.619566847" observedRunningTime="2026-03-13 14:17:45.768844101 +0000 UTC m=+1300.770432340" watchObservedRunningTime="2026-03-13 14:17:45.770099693 +0000 UTC m=+1300.771687932" Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.801973 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt" podStartSLOduration=4.068857375 podStartE2EDuration="42.801950002s" podCreationTimestamp="2026-03-13 14:17:03 +0000 UTC" firstStartedPulling="2026-03-13 14:17:05.664807466 +0000 UTC m=+1260.666395695" lastFinishedPulling="2026-03-13 14:17:44.397900083 +0000 UTC m=+1299.399488322" observedRunningTime="2026-03-13 14:17:45.800605677 +0000 UTC m=+1300.802193926" watchObservedRunningTime="2026-03-13 14:17:45.801950002 +0000 UTC m=+1300.803538261" Mar 13 14:17:45 crc kubenswrapper[4898]: I0313 14:17:45.827647 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-v99bm" podStartSLOduration=4.592013144 podStartE2EDuration="43.827624479s" podCreationTimestamp="2026-03-13 14:17:02 +0000 UTC" firstStartedPulling="2026-03-13 14:17:04.744103191 +0000 UTC m=+1259.745691440" lastFinishedPulling="2026-03-13 14:17:43.979714526 +0000 UTC m=+1298.981302775" observedRunningTime="2026-03-13 14:17:45.822844105 +0000 UTC m=+1300.824432354" watchObservedRunningTime="2026-03-13 14:17:45.827624479 +0000 UTC m=+1300.829212718" Mar 13 14:17:46 crc kubenswrapper[4898]: I0313 14:17:46.393688 4898 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-s2k96" event={"ID":"9ff6f89a-7110-42fb-96b9-8611f280bebe","Type":"ContainerStarted","Data":"0633feb076d7a92b8ba593135493cb20da0d09f9e8caebe97d98583e73b5df27"} Mar 13 14:17:46 crc kubenswrapper[4898]: I0313 14:17:46.396000 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-s2k96" Mar 13 14:17:46 crc kubenswrapper[4898]: I0313 14:17:46.401806 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-d47688694-gtlps" event={"ID":"45efd8ce-26db-4511-bd88-2e7467d02bbb","Type":"ContainerStarted","Data":"668a9795062b8fa71ac40f7ae8d277ef40b26eb48113263deda3a652735e7086"} Mar 13 14:17:46 crc kubenswrapper[4898]: I0313 14:17:46.402861 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-d47688694-gtlps" Mar 13 14:17:46 crc kubenswrapper[4898]: I0313 14:17:46.405662 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" event={"ID":"3a26728d-85c2-465c-bce4-c74045ea9e0d","Type":"ContainerStarted","Data":"d4370cd3a1919d89fe936f3d3e40cf45a946359d1caa65196258cd9057775ac2"} Mar 13 14:17:46 crc kubenswrapper[4898]: I0313 14:17:46.419963 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-s2k96" podStartSLOduration=3.181471312 podStartE2EDuration="43.419947686s" podCreationTimestamp="2026-03-13 14:17:03 +0000 UTC" firstStartedPulling="2026-03-13 14:17:05.633324925 +0000 UTC m=+1260.634913174" lastFinishedPulling="2026-03-13 14:17:45.871801309 +0000 UTC m=+1300.873389548" observedRunningTime="2026-03-13 14:17:46.419411332 +0000 UTC m=+1301.420999581" 
watchObservedRunningTime="2026-03-13 14:17:46.419947686 +0000 UTC m=+1301.421535925" Mar 13 14:17:46 crc kubenswrapper[4898]: I0313 14:17:46.454748 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" podStartSLOduration=43.454732521 podStartE2EDuration="43.454732521s" podCreationTimestamp="2026-03-13 14:17:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:17:46.452440451 +0000 UTC m=+1301.454028690" watchObservedRunningTime="2026-03-13 14:17:46.454732521 +0000 UTC m=+1301.456320760" Mar 13 14:17:46 crc kubenswrapper[4898]: I0313 14:17:46.482816 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-d47688694-gtlps" podStartSLOduration=3.5852143 podStartE2EDuration="44.482796111s" podCreationTimestamp="2026-03-13 14:17:02 +0000 UTC" firstStartedPulling="2026-03-13 14:17:04.754298367 +0000 UTC m=+1259.755886596" lastFinishedPulling="2026-03-13 14:17:45.651880178 +0000 UTC m=+1300.653468407" observedRunningTime="2026-03-13 14:17:46.479775872 +0000 UTC m=+1301.481364121" watchObservedRunningTime="2026-03-13 14:17:46.482796111 +0000 UTC m=+1301.484384360" Mar 13 14:17:47 crc kubenswrapper[4898]: I0313 14:17:47.412708 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:49 crc kubenswrapper[4898]: I0313 14:17:49.427797 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-njsvh" event={"ID":"0d7c657b-a701-41fe-9b23-d5bba3302c4f","Type":"ContainerStarted","Data":"e9ef8c7a7570b9f18555eb7a145b577cbda97d59d879abac51522a7ef7563575"} Mar 13 14:17:49 crc kubenswrapper[4898]: I0313 14:17:49.428419 4898 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-njsvh" Mar 13 14:17:49 crc kubenswrapper[4898]: I0313 14:17:49.429827 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" event={"ID":"0ab852e1-fd26-4f76-b758-77896f8e236b","Type":"ContainerStarted","Data":"5517ef51ccfa8b2748099ff5aaad46312299246fde947335af38f7f6b1623c69"} Mar 13 14:17:49 crc kubenswrapper[4898]: I0313 14:17:49.429933 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" Mar 13 14:17:49 crc kubenswrapper[4898]: I0313 14:17:49.431189 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw" event={"ID":"c35de09d-7f21-47d3-aac5-a26b15b0a496","Type":"ContainerStarted","Data":"439b97cf887edefb45f2da268f25ca7e59b0c9383815a8f46efa2d00e888c19e"} Mar 13 14:17:49 crc kubenswrapper[4898]: I0313 14:17:49.431356 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw" Mar 13 14:17:49 crc kubenswrapper[4898]: I0313 14:17:49.450618 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-njsvh" podStartSLOduration=4.109701948 podStartE2EDuration="47.450599532s" podCreationTimestamp="2026-03-13 14:17:02 +0000 UTC" firstStartedPulling="2026-03-13 14:17:05.663147733 +0000 UTC m=+1260.664735972" lastFinishedPulling="2026-03-13 14:17:49.004045317 +0000 UTC m=+1304.005633556" observedRunningTime="2026-03-13 14:17:49.446765762 +0000 UTC m=+1304.448354011" watchObservedRunningTime="2026-03-13 14:17:49.450599532 +0000 UTC m=+1304.452187771" Mar 13 14:17:49 crc kubenswrapper[4898]: I0313 
14:17:49.473047 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw" podStartSLOduration=43.479212738 podStartE2EDuration="47.473021555s" podCreationTimestamp="2026-03-13 14:17:02 +0000 UTC" firstStartedPulling="2026-03-13 14:17:45.012846388 +0000 UTC m=+1300.014434627" lastFinishedPulling="2026-03-13 14:17:49.006655195 +0000 UTC m=+1304.008243444" observedRunningTime="2026-03-13 14:17:49.466629799 +0000 UTC m=+1304.468218058" watchObservedRunningTime="2026-03-13 14:17:49.473021555 +0000 UTC m=+1304.474609794" Mar 13 14:17:49 crc kubenswrapper[4898]: I0313 14:17:49.506002 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" podStartSLOduration=43.425376398 podStartE2EDuration="47.505971752s" podCreationTimestamp="2026-03-13 14:17:02 +0000 UTC" firstStartedPulling="2026-03-13 14:17:44.92179555 +0000 UTC m=+1299.923383789" lastFinishedPulling="2026-03-13 14:17:49.002390894 +0000 UTC m=+1304.003979143" observedRunningTime="2026-03-13 14:17:49.495821278 +0000 UTC m=+1304.497409527" watchObservedRunningTime="2026-03-13 14:17:49.505971752 +0000 UTC m=+1304.507559991" Mar 13 14:17:51 crc kubenswrapper[4898]: I0313 14:17:51.450276 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-s5zh6" event={"ID":"d24bb749-0b71-456b-80e4-fdf6dd23ba30","Type":"ContainerStarted","Data":"0b4e7587bc426e2c72edd1a3cd1c6a6f006317555be494e20de07b4d792e8cb0"} Mar 13 14:17:51 crc kubenswrapper[4898]: I0313 14:17:51.451106 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-s5zh6" Mar 13 14:17:51 crc kubenswrapper[4898]: I0313 14:17:51.453506 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-f2t6t" event={"ID":"66a86c31-9ff3-439a-a0f8-96c981014b6f","Type":"ContainerStarted","Data":"88fb60edbde24ac17c310036220e101cce0c1eccdce27094156356847f85f3fc"} Mar 13 14:17:51 crc kubenswrapper[4898]: I0313 14:17:51.453723 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-f2t6t" Mar 13 14:17:51 crc kubenswrapper[4898]: I0313 14:17:51.455208 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-ntlw6" event={"ID":"d71982c0-a3d0-4da8-84cd-7494301f589f","Type":"ContainerStarted","Data":"0e4e20e0e7044647f870045b4f25b0cd67f173d8704e1fb7a86a40357a37793b"} Mar 13 14:17:51 crc kubenswrapper[4898]: I0313 14:17:51.455415 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-ntlw6" Mar 13 14:17:51 crc kubenswrapper[4898]: I0313 14:17:51.472894 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-s5zh6" podStartSLOduration=3.374802329 podStartE2EDuration="49.472875352s" podCreationTimestamp="2026-03-13 14:17:02 +0000 UTC" firstStartedPulling="2026-03-13 14:17:05.066081199 +0000 UTC m=+1260.067669438" lastFinishedPulling="2026-03-13 14:17:51.164154222 +0000 UTC m=+1306.165742461" observedRunningTime="2026-03-13 14:17:51.471713551 +0000 UTC m=+1306.473301800" watchObservedRunningTime="2026-03-13 14:17:51.472875352 +0000 UTC m=+1306.474463591" Mar 13 14:17:51 crc kubenswrapper[4898]: I0313 14:17:51.492316 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-ntlw6" podStartSLOduration=3.393455715 podStartE2EDuration="49.492297007s" podCreationTimestamp="2026-03-13 14:17:02 +0000 UTC" 
firstStartedPulling="2026-03-13 14:17:05.066407338 +0000 UTC m=+1260.067995577" lastFinishedPulling="2026-03-13 14:17:51.16524862 +0000 UTC m=+1306.166836869" observedRunningTime="2026-03-13 14:17:51.488370835 +0000 UTC m=+1306.489959094" watchObservedRunningTime="2026-03-13 14:17:51.492297007 +0000 UTC m=+1306.493885246" Mar 13 14:17:51 crc kubenswrapper[4898]: I0313 14:17:51.502131 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-f2t6t" podStartSLOduration=3.970020268 podStartE2EDuration="49.502109012s" podCreationTimestamp="2026-03-13 14:17:02 +0000 UTC" firstStartedPulling="2026-03-13 14:17:05.704705837 +0000 UTC m=+1260.706294076" lastFinishedPulling="2026-03-13 14:17:51.236794581 +0000 UTC m=+1306.238382820" observedRunningTime="2026-03-13 14:17:51.500704575 +0000 UTC m=+1306.502292824" watchObservedRunningTime="2026-03-13 14:17:51.502109012 +0000 UTC m=+1306.503697251" Mar 13 14:17:53 crc kubenswrapper[4898]: I0313 14:17:53.037309 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-d47688694-gtlps" Mar 13 14:17:53 crc kubenswrapper[4898]: I0313 14:17:53.038041 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-p9d5v" Mar 13 14:17:53 crc kubenswrapper[4898]: I0313 14:17:53.042850 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-4n5rx" Mar 13 14:17:53 crc kubenswrapper[4898]: I0313 14:17:53.100264 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mf8h6" Mar 13 14:17:53 crc kubenswrapper[4898]: I0313 14:17:53.169442 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/heat-operator-controller-manager-77b6666d85-tqp4b" Mar 13 14:17:53 crc kubenswrapper[4898]: I0313 14:17:53.214748 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl" Mar 13 14:17:53 crc kubenswrapper[4898]: I0313 14:17:53.461670 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc" Mar 13 14:17:53 crc kubenswrapper[4898]: I0313 14:17:53.497169 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-v99bm" Mar 13 14:17:53 crc kubenswrapper[4898]: I0313 14:17:53.539568 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-z2gd2" Mar 13 14:17:53 crc kubenswrapper[4898]: E0313 14:17:53.741676 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-jwrd2" podUID="919747b8-a031-4654-999f-3c3928f981b4" Mar 13 14:17:53 crc kubenswrapper[4898]: I0313 14:17:53.932464 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wdmrh" Mar 13 14:17:54 crc kubenswrapper[4898]: I0313 14:17:54.005853 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-s2k96" Mar 13 14:17:54 crc kubenswrapper[4898]: I0313 14:17:54.027668 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt" Mar 13 14:17:54 crc kubenswrapper[4898]: E0313 14:17:54.743146 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:18fe6f2f0be7e736db86ff2d600af12a753e14b0a03232ce4f03629a89905571\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" podUID="d29ce3ee-3d5a-4801-abf9-dfef5b641a74" Mar 13 14:17:55 crc kubenswrapper[4898]: I0313 14:17:55.102232 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw" Mar 13 14:17:55 crc kubenswrapper[4898]: I0313 14:17:55.995117 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" Mar 13 14:17:57 crc kubenswrapper[4898]: I0313 14:17:57.504119 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7f84474648-mr4wv" event={"ID":"52959483-daae-423a-a3bf-8e3fa7810074","Type":"ContainerStarted","Data":"ce8ee2b018c365558145424246baad3faea16352a1e7f13624ea91bdda3e4f6f"} Mar 13 14:17:57 crc kubenswrapper[4898]: I0313 14:17:57.504598 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-7f84474648-mr4wv" Mar 13 14:17:57 crc kubenswrapper[4898]: I0313 14:17:57.524381 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-7f84474648-mr4wv" podStartSLOduration=3.974593632 podStartE2EDuration="55.524361969s" podCreationTimestamp="2026-03-13 14:17:02 +0000 UTC" firstStartedPulling="2026-03-13 14:17:05.638536501 +0000 UTC m=+1260.640124740" lastFinishedPulling="2026-03-13 14:17:57.188304838 
+0000 UTC m=+1312.189893077" observedRunningTime="2026-03-13 14:17:57.518837685 +0000 UTC m=+1312.520425964" watchObservedRunningTime="2026-03-13 14:17:57.524361969 +0000 UTC m=+1312.525950208" Mar 13 14:17:57 crc kubenswrapper[4898]: E0313 14:17:57.742077 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-82gtc" podUID="7b9c0413-5558-43c4-805b-7f035fded9b4" Mar 13 14:17:59 crc kubenswrapper[4898]: I0313 14:17:59.537267 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" Mar 13 14:18:00 crc kubenswrapper[4898]: I0313 14:18:00.140838 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556858-vdbkv"] Mar 13 14:18:00 crc kubenswrapper[4898]: I0313 14:18:00.142214 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556858-vdbkv" Mar 13 14:18:00 crc kubenswrapper[4898]: I0313 14:18:00.145575 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:18:00 crc kubenswrapper[4898]: I0313 14:18:00.146015 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:18:00 crc kubenswrapper[4898]: I0313 14:18:00.146200 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:18:00 crc kubenswrapper[4898]: I0313 14:18:00.151099 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556858-vdbkv"] Mar 13 14:18:00 crc kubenswrapper[4898]: I0313 14:18:00.207119 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgkbn\" (UniqueName: \"kubernetes.io/projected/1b0610af-1f13-4f43-9249-8d50a0dcbc14-kube-api-access-qgkbn\") pod \"auto-csr-approver-29556858-vdbkv\" (UID: \"1b0610af-1f13-4f43-9249-8d50a0dcbc14\") " pod="openshift-infra/auto-csr-approver-29556858-vdbkv" Mar 13 14:18:00 crc kubenswrapper[4898]: I0313 14:18:00.308163 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgkbn\" (UniqueName: \"kubernetes.io/projected/1b0610af-1f13-4f43-9249-8d50a0dcbc14-kube-api-access-qgkbn\") pod \"auto-csr-approver-29556858-vdbkv\" (UID: \"1b0610af-1f13-4f43-9249-8d50a0dcbc14\") " pod="openshift-infra/auto-csr-approver-29556858-vdbkv" Mar 13 14:18:00 crc kubenswrapper[4898]: I0313 14:18:00.336182 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgkbn\" (UniqueName: \"kubernetes.io/projected/1b0610af-1f13-4f43-9249-8d50a0dcbc14-kube-api-access-qgkbn\") pod \"auto-csr-approver-29556858-vdbkv\" (UID: \"1b0610af-1f13-4f43-9249-8d50a0dcbc14\") " 
pod="openshift-infra/auto-csr-approver-29556858-vdbkv" Mar 13 14:18:00 crc kubenswrapper[4898]: I0313 14:18:00.461882 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556858-vdbkv" Mar 13 14:18:00 crc kubenswrapper[4898]: I0313 14:18:00.920672 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556858-vdbkv"] Mar 13 14:18:01 crc kubenswrapper[4898]: I0313 14:18:01.546845 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556858-vdbkv" event={"ID":"1b0610af-1f13-4f43-9249-8d50a0dcbc14","Type":"ContainerStarted","Data":"265ce71be2ddaa7e76a9da2460c089ad22f9323bae84430d6127920e01c0a1f4"} Mar 13 14:18:02 crc kubenswrapper[4898]: I0313 14:18:02.555183 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556858-vdbkv" event={"ID":"1b0610af-1f13-4f43-9249-8d50a0dcbc14","Type":"ContainerStarted","Data":"c8c6599b57d68b7830c9784f9ac2322559fa7b500359ca53fa26e39b23292ec4"} Mar 13 14:18:02 crc kubenswrapper[4898]: I0313 14:18:02.567550 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556858-vdbkv" podStartSLOduration=1.333563935 podStartE2EDuration="2.567529161s" podCreationTimestamp="2026-03-13 14:18:00 +0000 UTC" firstStartedPulling="2026-03-13 14:18:00.92582374 +0000 UTC m=+1315.927411979" lastFinishedPulling="2026-03-13 14:18:02.159788956 +0000 UTC m=+1317.161377205" observedRunningTime="2026-03-13 14:18:02.565790246 +0000 UTC m=+1317.567378485" watchObservedRunningTime="2026-03-13 14:18:02.567529161 +0000 UTC m=+1317.569117400" Mar 13 14:18:03 crc kubenswrapper[4898]: I0313 14:18:03.507829 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-ntlw6" Mar 13 14:18:03 crc kubenswrapper[4898]: I0313 14:18:03.546061 4898 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-s5zh6" Mar 13 14:18:03 crc kubenswrapper[4898]: I0313 14:18:03.553727 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-7f84474648-mr4wv" Mar 13 14:18:03 crc kubenswrapper[4898]: I0313 14:18:03.566999 4898 generic.go:334] "Generic (PLEG): container finished" podID="1b0610af-1f13-4f43-9249-8d50a0dcbc14" containerID="c8c6599b57d68b7830c9784f9ac2322559fa7b500359ca53fa26e39b23292ec4" exitCode=0 Mar 13 14:18:03 crc kubenswrapper[4898]: I0313 14:18:03.567043 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556858-vdbkv" event={"ID":"1b0610af-1f13-4f43-9249-8d50a0dcbc14","Type":"ContainerDied","Data":"c8c6599b57d68b7830c9784f9ac2322559fa7b500359ca53fa26e39b23292ec4"} Mar 13 14:18:03 crc kubenswrapper[4898]: I0313 14:18:03.956694 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-njsvh" Mar 13 14:18:03 crc kubenswrapper[4898]: I0313 14:18:03.982360 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-f2t6t" Mar 13 14:18:04 crc kubenswrapper[4898]: I0313 14:18:04.911158 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556858-vdbkv" Mar 13 14:18:05 crc kubenswrapper[4898]: I0313 14:18:05.094128 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgkbn\" (UniqueName: \"kubernetes.io/projected/1b0610af-1f13-4f43-9249-8d50a0dcbc14-kube-api-access-qgkbn\") pod \"1b0610af-1f13-4f43-9249-8d50a0dcbc14\" (UID: \"1b0610af-1f13-4f43-9249-8d50a0dcbc14\") " Mar 13 14:18:05 crc kubenswrapper[4898]: I0313 14:18:05.100309 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b0610af-1f13-4f43-9249-8d50a0dcbc14-kube-api-access-qgkbn" (OuterVolumeSpecName: "kube-api-access-qgkbn") pod "1b0610af-1f13-4f43-9249-8d50a0dcbc14" (UID: "1b0610af-1f13-4f43-9249-8d50a0dcbc14"). InnerVolumeSpecName "kube-api-access-qgkbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:18:05 crc kubenswrapper[4898]: I0313 14:18:05.202075 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgkbn\" (UniqueName: \"kubernetes.io/projected/1b0610af-1f13-4f43-9249-8d50a0dcbc14-kube-api-access-qgkbn\") on node \"crc\" DevicePath \"\"" Mar 13 14:18:05 crc kubenswrapper[4898]: I0313 14:18:05.584705 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556858-vdbkv" event={"ID":"1b0610af-1f13-4f43-9249-8d50a0dcbc14","Type":"ContainerDied","Data":"265ce71be2ddaa7e76a9da2460c089ad22f9323bae84430d6127920e01c0a1f4"} Mar 13 14:18:05 crc kubenswrapper[4898]: I0313 14:18:05.584757 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="265ce71be2ddaa7e76a9da2460c089ad22f9323bae84430d6127920e01c0a1f4" Mar 13 14:18:05 crc kubenswrapper[4898]: I0313 14:18:05.584786 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556858-vdbkv" Mar 13 14:18:05 crc kubenswrapper[4898]: I0313 14:18:05.636961 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556852-z8l5q"] Mar 13 14:18:05 crc kubenswrapper[4898]: I0313 14:18:05.644371 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556852-z8l5q"] Mar 13 14:18:05 crc kubenswrapper[4898]: I0313 14:18:05.751470 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da054881-deef-4491-9685-5f35ee9fc45f" path="/var/lib/kubelet/pods/da054881-deef-4491-9685-5f35ee9fc45f/volumes" Mar 13 14:18:07 crc kubenswrapper[4898]: I0313 14:18:07.602417 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-jwrd2" event={"ID":"919747b8-a031-4654-999f-3c3928f981b4","Type":"ContainerStarted","Data":"2a63b3120bf0b6fde648902ea96aff3bdb84041187c2d2f323a87c8815000e30"} Mar 13 14:18:07 crc kubenswrapper[4898]: I0313 14:18:07.603069 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-jwrd2" Mar 13 14:18:07 crc kubenswrapper[4898]: I0313 14:18:07.619728 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-jwrd2" podStartSLOduration=3.099097083 podStartE2EDuration="1m4.619706527s" podCreationTimestamp="2026-03-13 14:17:03 +0000 UTC" firstStartedPulling="2026-03-13 14:17:05.665048103 +0000 UTC m=+1260.666636332" lastFinishedPulling="2026-03-13 14:18:07.185657537 +0000 UTC m=+1322.187245776" observedRunningTime="2026-03-13 14:18:07.615508968 +0000 UTC m=+1322.617097247" watchObservedRunningTime="2026-03-13 14:18:07.619706527 +0000 UTC m=+1322.621294766" Mar 13 14:18:11 crc kubenswrapper[4898]: I0313 14:18:11.650395 4898 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" event={"ID":"d29ce3ee-3d5a-4801-abf9-dfef5b641a74","Type":"ContainerStarted","Data":"f59d39668fd5f01a9f140fcfa1fd92ce8c2ea3f200013f1b1e08b5b0e679790d"} Mar 13 14:18:11 crc kubenswrapper[4898]: I0313 14:18:11.652025 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" Mar 13 14:18:11 crc kubenswrapper[4898]: I0313 14:18:11.675223 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" podStartSLOduration=4.816303157 podStartE2EDuration="1m9.67520369s" podCreationTimestamp="2026-03-13 14:17:02 +0000 UTC" firstStartedPulling="2026-03-13 14:17:05.66877209 +0000 UTC m=+1260.670360329" lastFinishedPulling="2026-03-13 14:18:10.527672623 +0000 UTC m=+1325.529260862" observedRunningTime="2026-03-13 14:18:11.668086105 +0000 UTC m=+1326.669674354" watchObservedRunningTime="2026-03-13 14:18:11.67520369 +0000 UTC m=+1326.676791929" Mar 13 14:18:12 crc kubenswrapper[4898]: I0313 14:18:12.669101 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-82gtc" event={"ID":"7b9c0413-5558-43c4-805b-7f035fded9b4","Type":"ContainerStarted","Data":"e5c2d747b8dc9a1a0b04a2711ae93e1c3479dbb6eb71a1261f593abef3d31048"} Mar 13 14:18:12 crc kubenswrapper[4898]: I0313 14:18:12.687191 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-82gtc" podStartSLOduration=3.701077665 podStartE2EDuration="1m9.687169771s" podCreationTimestamp="2026-03-13 14:17:03 +0000 UTC" firstStartedPulling="2026-03-13 14:17:05.756015854 +0000 UTC m=+1260.757604093" lastFinishedPulling="2026-03-13 14:18:11.74210796 +0000 UTC m=+1326.743696199" observedRunningTime="2026-03-13 
14:18:12.681804622 +0000 UTC m=+1327.683392871" watchObservedRunningTime="2026-03-13 14:18:12.687169771 +0000 UTC m=+1327.688758020" Mar 13 14:18:14 crc kubenswrapper[4898]: I0313 14:18:14.091380 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-jwrd2" Mar 13 14:18:19 crc kubenswrapper[4898]: I0313 14:18:19.134138 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:18:19 crc kubenswrapper[4898]: I0313 14:18:19.134736 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:18:23 crc kubenswrapper[4898]: I0313 14:18:23.884201 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" Mar 13 14:18:32 crc kubenswrapper[4898]: I0313 14:18:32.915589 4898 scope.go:117] "RemoveContainer" containerID="5010d4732869bc8d7e0532b7c193085ea8336efd3a5d0f8cd686f3caf758e4d9" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.399523 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-npnz4"] Mar 13 14:18:42 crc kubenswrapper[4898]: E0313 14:18:42.400535 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b0610af-1f13-4f43-9249-8d50a0dcbc14" containerName="oc" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.400549 4898 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1b0610af-1f13-4f43-9249-8d50a0dcbc14" containerName="oc" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.400782 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b0610af-1f13-4f43-9249-8d50a0dcbc14" containerName="oc" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.401856 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-npnz4" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.404261 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-wwqwh" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.410295 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.410401 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.410641 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.411890 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5005de8-b440-45e8-a1a7-7943f68bff2f-config\") pod \"dnsmasq-dns-675f4bcbfc-npnz4\" (UID: \"b5005de8-b440-45e8-a1a7-7943f68bff2f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-npnz4" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.412012 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glpmk\" (UniqueName: \"kubernetes.io/projected/b5005de8-b440-45e8-a1a7-7943f68bff2f-kube-api-access-glpmk\") pod \"dnsmasq-dns-675f4bcbfc-npnz4\" (UID: \"b5005de8-b440-45e8-a1a7-7943f68bff2f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-npnz4" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.424759 4898 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-npnz4"] Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.465149 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-5jhct"] Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.466921 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-5jhct" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.469413 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.479968 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-5jhct"] Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.513619 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glpmk\" (UniqueName: \"kubernetes.io/projected/b5005de8-b440-45e8-a1a7-7943f68bff2f-kube-api-access-glpmk\") pod \"dnsmasq-dns-675f4bcbfc-npnz4\" (UID: \"b5005de8-b440-45e8-a1a7-7943f68bff2f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-npnz4" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.513790 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5005de8-b440-45e8-a1a7-7943f68bff2f-config\") pod \"dnsmasq-dns-675f4bcbfc-npnz4\" (UID: \"b5005de8-b440-45e8-a1a7-7943f68bff2f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-npnz4" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.514836 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5005de8-b440-45e8-a1a7-7943f68bff2f-config\") pod \"dnsmasq-dns-675f4bcbfc-npnz4\" (UID: \"b5005de8-b440-45e8-a1a7-7943f68bff2f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-npnz4" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.550792 4898 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-glpmk\" (UniqueName: \"kubernetes.io/projected/b5005de8-b440-45e8-a1a7-7943f68bff2f-kube-api-access-glpmk\") pod \"dnsmasq-dns-675f4bcbfc-npnz4\" (UID: \"b5005de8-b440-45e8-a1a7-7943f68bff2f\") " pod="openstack/dnsmasq-dns-675f4bcbfc-npnz4" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.616814 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70dc5baf-6ae1-41b4-9454-8ff891570f8b-config\") pod \"dnsmasq-dns-78dd6ddcc-5jhct\" (UID: \"70dc5baf-6ae1-41b4-9454-8ff891570f8b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5jhct" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.616881 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6xth\" (UniqueName: \"kubernetes.io/projected/70dc5baf-6ae1-41b4-9454-8ff891570f8b-kube-api-access-h6xth\") pod \"dnsmasq-dns-78dd6ddcc-5jhct\" (UID: \"70dc5baf-6ae1-41b4-9454-8ff891570f8b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5jhct" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.616954 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70dc5baf-6ae1-41b4-9454-8ff891570f8b-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-5jhct\" (UID: \"70dc5baf-6ae1-41b4-9454-8ff891570f8b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5jhct" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.718609 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70dc5baf-6ae1-41b4-9454-8ff891570f8b-config\") pod \"dnsmasq-dns-78dd6ddcc-5jhct\" (UID: \"70dc5baf-6ae1-41b4-9454-8ff891570f8b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5jhct" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.718917 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-h6xth\" (UniqueName: \"kubernetes.io/projected/70dc5baf-6ae1-41b4-9454-8ff891570f8b-kube-api-access-h6xth\") pod \"dnsmasq-dns-78dd6ddcc-5jhct\" (UID: \"70dc5baf-6ae1-41b4-9454-8ff891570f8b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5jhct" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.719048 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70dc5baf-6ae1-41b4-9454-8ff891570f8b-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-5jhct\" (UID: \"70dc5baf-6ae1-41b4-9454-8ff891570f8b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5jhct" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.719527 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70dc5baf-6ae1-41b4-9454-8ff891570f8b-config\") pod \"dnsmasq-dns-78dd6ddcc-5jhct\" (UID: \"70dc5baf-6ae1-41b4-9454-8ff891570f8b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5jhct" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.720443 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70dc5baf-6ae1-41b4-9454-8ff891570f8b-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-5jhct\" (UID: \"70dc5baf-6ae1-41b4-9454-8ff891570f8b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5jhct" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.725755 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-npnz4" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.737566 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6xth\" (UniqueName: \"kubernetes.io/projected/70dc5baf-6ae1-41b4-9454-8ff891570f8b-kube-api-access-h6xth\") pod \"dnsmasq-dns-78dd6ddcc-5jhct\" (UID: \"70dc5baf-6ae1-41b4-9454-8ff891570f8b\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5jhct" Mar 13 14:18:42 crc kubenswrapper[4898]: I0313 14:18:42.797770 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-5jhct" Mar 13 14:18:43 crc kubenswrapper[4898]: I0313 14:18:43.212548 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-npnz4"] Mar 13 14:18:43 crc kubenswrapper[4898]: W0313 14:18:43.214543 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5005de8_b440_45e8_a1a7_7943f68bff2f.slice/crio-64fa491215cb47b1ae2d24aaa439a281949404c6099493e0af3976068bb840f1 WatchSource:0}: Error finding container 64fa491215cb47b1ae2d24aaa439a281949404c6099493e0af3976068bb840f1: Status 404 returned error can't find the container with id 64fa491215cb47b1ae2d24aaa439a281949404c6099493e0af3976068bb840f1 Mar 13 14:18:43 crc kubenswrapper[4898]: I0313 14:18:43.311037 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-5jhct"] Mar 13 14:18:43 crc kubenswrapper[4898]: W0313 14:18:43.311613 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70dc5baf_6ae1_41b4_9454_8ff891570f8b.slice/crio-41cc016dc80d4b785612e7bb953345c49daec5b78f73d536749225e85b77217d WatchSource:0}: Error finding container 41cc016dc80d4b785612e7bb953345c49daec5b78f73d536749225e85b77217d: Status 404 returned error can't find the container with id 
41cc016dc80d4b785612e7bb953345c49daec5b78f73d536749225e85b77217d Mar 13 14:18:43 crc kubenswrapper[4898]: I0313 14:18:43.987287 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-npnz4" event={"ID":"b5005de8-b440-45e8-a1a7-7943f68bff2f","Type":"ContainerStarted","Data":"64fa491215cb47b1ae2d24aaa439a281949404c6099493e0af3976068bb840f1"} Mar 13 14:18:43 crc kubenswrapper[4898]: I0313 14:18:43.989015 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-5jhct" event={"ID":"70dc5baf-6ae1-41b4-9454-8ff891570f8b","Type":"ContainerStarted","Data":"41cc016dc80d4b785612e7bb953345c49daec5b78f73d536749225e85b77217d"} Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.306275 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-npnz4"] Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.355025 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-nmlp6"] Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.356597 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.375480 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-nmlp6"] Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.385104 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5ll2\" (UniqueName: \"kubernetes.io/projected/c17db307-7a8a-4585-9696-a9ef96b6ba0b-kube-api-access-v5ll2\") pod \"dnsmasq-dns-5ccc8479f9-nmlp6\" (UID: \"c17db307-7a8a-4585-9696-a9ef96b6ba0b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.385184 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c17db307-7a8a-4585-9696-a9ef96b6ba0b-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-nmlp6\" (UID: \"c17db307-7a8a-4585-9696-a9ef96b6ba0b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.385336 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c17db307-7a8a-4585-9696-a9ef96b6ba0b-config\") pod \"dnsmasq-dns-5ccc8479f9-nmlp6\" (UID: \"c17db307-7a8a-4585-9696-a9ef96b6ba0b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.487706 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5ll2\" (UniqueName: \"kubernetes.io/projected/c17db307-7a8a-4585-9696-a9ef96b6ba0b-kube-api-access-v5ll2\") pod \"dnsmasq-dns-5ccc8479f9-nmlp6\" (UID: \"c17db307-7a8a-4585-9696-a9ef96b6ba0b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.487775 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c17db307-7a8a-4585-9696-a9ef96b6ba0b-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-nmlp6\" (UID: \"c17db307-7a8a-4585-9696-a9ef96b6ba0b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.487808 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c17db307-7a8a-4585-9696-a9ef96b6ba0b-config\") pod \"dnsmasq-dns-5ccc8479f9-nmlp6\" (UID: \"c17db307-7a8a-4585-9696-a9ef96b6ba0b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.488889 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c17db307-7a8a-4585-9696-a9ef96b6ba0b-config\") pod \"dnsmasq-dns-5ccc8479f9-nmlp6\" (UID: \"c17db307-7a8a-4585-9696-a9ef96b6ba0b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.488929 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c17db307-7a8a-4585-9696-a9ef96b6ba0b-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-nmlp6\" (UID: \"c17db307-7a8a-4585-9696-a9ef96b6ba0b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.512623 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5ll2\" (UniqueName: \"kubernetes.io/projected/c17db307-7a8a-4585-9696-a9ef96b6ba0b-kube-api-access-v5ll2\") pod \"dnsmasq-dns-5ccc8479f9-nmlp6\" (UID: \"c17db307-7a8a-4585-9696-a9ef96b6ba0b\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.664301 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-5jhct"] Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.683911 4898 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.702007 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hk9w4"] Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.703667 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.708839 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hk9w4"] Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.793702 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e544d1f-357e-4751-88bb-5108430b52cb-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-hk9w4\" (UID: \"9e544d1f-357e-4751-88bb-5108430b52cb\") " pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.793753 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e544d1f-357e-4751-88bb-5108430b52cb-config\") pod \"dnsmasq-dns-57d769cc4f-hk9w4\" (UID: \"9e544d1f-357e-4751-88bb-5108430b52cb\") " pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.793919 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn79l\" (UniqueName: \"kubernetes.io/projected/9e544d1f-357e-4751-88bb-5108430b52cb-kube-api-access-sn79l\") pod \"dnsmasq-dns-57d769cc4f-hk9w4\" (UID: \"9e544d1f-357e-4751-88bb-5108430b52cb\") " pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.895525 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sn79l\" (UniqueName: 
\"kubernetes.io/projected/9e544d1f-357e-4751-88bb-5108430b52cb-kube-api-access-sn79l\") pod \"dnsmasq-dns-57d769cc4f-hk9w4\" (UID: \"9e544d1f-357e-4751-88bb-5108430b52cb\") " pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.895590 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e544d1f-357e-4751-88bb-5108430b52cb-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-hk9w4\" (UID: \"9e544d1f-357e-4751-88bb-5108430b52cb\") " pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.895623 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e544d1f-357e-4751-88bb-5108430b52cb-config\") pod \"dnsmasq-dns-57d769cc4f-hk9w4\" (UID: \"9e544d1f-357e-4751-88bb-5108430b52cb\") " pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.897181 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e544d1f-357e-4751-88bb-5108430b52cb-config\") pod \"dnsmasq-dns-57d769cc4f-hk9w4\" (UID: \"9e544d1f-357e-4751-88bb-5108430b52cb\") " pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.897268 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e544d1f-357e-4751-88bb-5108430b52cb-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-hk9w4\" (UID: \"9e544d1f-357e-4751-88bb-5108430b52cb\") " pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" Mar 13 14:18:45 crc kubenswrapper[4898]: I0313 14:18:45.915476 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn79l\" (UniqueName: \"kubernetes.io/projected/9e544d1f-357e-4751-88bb-5108430b52cb-kube-api-access-sn79l\") pod \"dnsmasq-dns-57d769cc4f-hk9w4\" 
(UID: \"9e544d1f-357e-4751-88bb-5108430b52cb\") " pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.167478 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.347991 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-nmlp6"] Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.527746 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.529384 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.532105 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.532357 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-4m6nk" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.532497 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.533003 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.534125 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.534588 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.534619 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 
13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.539500 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.608757 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.608806 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.608825 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.608872 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d56bd826-4f42-409d-ae41-9bfc70d1e038-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.608890 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/d56bd826-4f42-409d-ae41-9bfc70d1e038-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.608930 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.608959 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d56bd826-4f42-409d-ae41-9bfc70d1e038-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.608986 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.609022 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d56bd826-4f42-409d-ae41-9bfc70d1e038-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.609039 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d56bd826-4f42-409d-ae41-9bfc70d1e038-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.609071 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk5mq\" (UniqueName: \"kubernetes.io/projected/d56bd826-4f42-409d-ae41-9bfc70d1e038-kube-api-access-xk5mq\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.681799 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hk9w4"] Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.710599 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d56bd826-4f42-409d-ae41-9bfc70d1e038-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.710658 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d56bd826-4f42-409d-ae41-9bfc70d1e038-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.710734 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk5mq\" (UniqueName: \"kubernetes.io/projected/d56bd826-4f42-409d-ae41-9bfc70d1e038-kube-api-access-xk5mq\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc 
kubenswrapper[4898]: I0313 14:18:46.710766 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.710805 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.710831 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.710911 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d56bd826-4f42-409d-ae41-9bfc70d1e038-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.710941 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d56bd826-4f42-409d-ae41-9bfc70d1e038-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.710976 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.711009 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d56bd826-4f42-409d-ae41-9bfc70d1e038-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.711491 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.711650 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.711721 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.712271 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" 
(UniqueName: \"kubernetes.io/configmap/d56bd826-4f42-409d-ae41-9bfc70d1e038-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.713006 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d56bd826-4f42-409d-ae41-9bfc70d1e038-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.713022 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d56bd826-4f42-409d-ae41-9bfc70d1e038-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.715556 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.715580 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3111f327615e010747f22a13f9378eff3b7d96c403da97ea4361402b1c85d196/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.716040 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d56bd826-4f42-409d-ae41-9bfc70d1e038-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.716064 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.717459 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.719583 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d56bd826-4f42-409d-ae41-9bfc70d1e038-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.732984 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk5mq\" (UniqueName: \"kubernetes.io/projected/d56bd826-4f42-409d-ae41-9bfc70d1e038-kube-api-access-xk5mq\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.765717 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d\") pod \"rabbitmq-cell1-server-0\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.790673 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.794726 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.797948 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-tpvjl" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.799859 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.800002 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.800106 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.800213 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.800411 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.800540 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.833009 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.842463 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-2"] Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.844086 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.857742 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-1"] Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.859539 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.882438 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.890460 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.893989 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.914298 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.914352 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.914398 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8xbw\" (UniqueName: \"kubernetes.io/projected/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-kube-api-access-q8xbw\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.914436 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-plugins-conf\") pod 
\"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.914485 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-config-data\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.914515 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.914549 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-96b86561-77fa-478a-bf61-f7beca9d80fe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-96b86561-77fa-478a-bf61-f7beca9d80fe\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.914566 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.914582 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-erlang-cookie-secret\") pod 
\"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.914597 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:46 crc kubenswrapper[4898]: I0313 14:18:46.914631 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016197 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016452 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016478 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 
14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016503 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016521 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/818e3f41-30c4-4a49-b490-0d868fc2b2b8-server-conf\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016540 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8xbw\" (UniqueName: \"kubernetes.io/projected/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-kube-api-access-q8xbw\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016564 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016594 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/818e3f41-30c4-4a49-b490-0d868fc2b2b8-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016626 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-config-data\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016648 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/818e3f41-30c4-4a49-b490-0d868fc2b2b8-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016665 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016682 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016701 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/818e3f41-30c4-4a49-b490-0d868fc2b2b8-pod-info\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016719 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-10c025d7-e381-4716-bf38-98f5cf86aede\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10c025d7-e381-4716-bf38-98f5cf86aede\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016738 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016761 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016784 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016803 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016821 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-96b86561-77fa-478a-bf61-f7beca9d80fe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-96b86561-77fa-478a-bf61-f7beca9d80fe\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016840 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/818e3f41-30c4-4a49-b490-0d868fc2b2b8-config-data\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016857 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016874 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016892 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.016925 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47mbl\" (UniqueName: 
\"kubernetes.io/projected/818e3f41-30c4-4a49-b490-0d868fc2b2b8-kube-api-access-47mbl\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.017493 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.017944 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.018135 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-config-data\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.018264 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-server-conf\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.018610 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-pod-info\") pod \"rabbitmq-server-2\" (UID: 
\"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.018694 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-config-data\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.018811 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.019181 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.019305 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.019343 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgsn2\" (UniqueName: \"kubernetes.io/projected/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-kube-api-access-kgsn2\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc 
kubenswrapper[4898]: I0313 14:18:47.019387 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.019435 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.019443 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.019500 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.022072 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.022462 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.024178 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.024198 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-96b86561-77fa-478a-bf61-f7beca9d80fe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-96b86561-77fa-478a-bf61-f7beca9d80fe\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/235b7df56c251cb078c850d3b743a7085fdda6b090aa4cee8a1308b947278440/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.024644 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.026092 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.034921 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8xbw\" (UniqueName: \"kubernetes.io/projected/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-kube-api-access-q8xbw\") pod 
\"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.066768 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-96b86561-77fa-478a-bf61-f7beca9d80fe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-96b86561-77fa-478a-bf61-f7beca9d80fe\") pod \"rabbitmq-server-0\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.083140 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" event={"ID":"9e544d1f-357e-4751-88bb-5108430b52cb","Type":"ContainerStarted","Data":"f4e0bf3960a9198c7dbd49808ca83e6770f6cbaf7ea545029dc2173d4eb03419"} Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.090232 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" event={"ID":"c17db307-7a8a-4585-9696-a9ef96b6ba0b","Type":"ContainerStarted","Data":"98527ac55b34245c21c2fc19c06bf03fb117bb3cf7f7b539444d59d7d2dad50b"} Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.120838 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/818e3f41-30c4-4a49-b490-0d868fc2b2b8-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.120930 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/818e3f41-30c4-4a49-b490-0d868fc2b2b8-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.120956 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.121298 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.121325 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/818e3f41-30c4-4a49-b490-0d868fc2b2b8-pod-info\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.121347 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-10c025d7-e381-4716-bf38-98f5cf86aede\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10c025d7-e381-4716-bf38-98f5cf86aede\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.121385 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.121417 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.121446 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.121472 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/818e3f41-30c4-4a49-b490-0d868fc2b2b8-config-data\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.121497 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47mbl\" (UniqueName: \"kubernetes.io/projected/818e3f41-30c4-4a49-b490-0d868fc2b2b8-kube-api-access-47mbl\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.121514 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-server-conf\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.121541 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-pod-info\") pod \"rabbitmq-server-2\" (UID: 
\"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.121560 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-config-data\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.121588 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.121604 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgsn2\" (UniqueName: \"kubernetes.io/projected/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-kube-api-access-kgsn2\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.122237 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/818e3f41-30c4-4a49-b490-0d868fc2b2b8-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.122587 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.122615 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.122636 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.122649 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/818e3f41-30c4-4a49-b490-0d868fc2b2b8-config-data\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.122683 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.122795 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.122814 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/818e3f41-30c4-4a49-b490-0d868fc2b2b8-server-conf\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.123147 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-config-data\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.123505 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.123645 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-server-conf\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.124418 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.124542 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/818e3f41-30c4-4a49-b490-0d868fc2b2b8-server-conf\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc 
kubenswrapper[4898]: I0313 14:18:47.124781 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.125150 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.125172 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/33865dbdc5fe61694c30892e6300309b59f04bdd0b35aa3fd0f17da3ba922194/globalmount\"" pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.126286 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.127074 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/818e3f41-30c4-4a49-b490-0d868fc2b2b8-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.128317 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.128607 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.128631 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-10c025d7-e381-4716-bf38-98f5cf86aede\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10c025d7-e381-4716-bf38-98f5cf86aede\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b5b057b78b5a76d291625b9af6af2e0e662115b1b100b445e2e40d0ac02a65c7/globalmount\"" pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.128910 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.132714 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.132758 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-pod-info\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.132868 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.133427 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/818e3f41-30c4-4a49-b490-0d868fc2b2b8-pod-info\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.136116 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.137004 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" 
(UniqueName: \"kubernetes.io/secret/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.142789 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.148632 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47mbl\" (UniqueName: \"kubernetes.io/projected/818e3f41-30c4-4a49-b490-0d868fc2b2b8-kube-api-access-47mbl\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.159802 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgsn2\" (UniqueName: \"kubernetes.io/projected/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-kube-api-access-kgsn2\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.181662 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-10c025d7-e381-4716-bf38-98f5cf86aede\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10c025d7-e381-4716-bf38-98f5cf86aede\") pod \"rabbitmq-server-2\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.210878 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.223968 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd\") pod \"rabbitmq-server-1\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.271499 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.504428 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.729973 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.886525 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.989344 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.991046 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.993951 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.994407 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-jw2rr" Mar 13 14:18:47 crc kubenswrapper[4898]: I0313 14:18:47.998261 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.000748 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.001163 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.003751 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.008329 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.155862 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk6v5\" (UniqueName: \"kubernetes.io/projected/e5d53cf3-113e-4391-b3a9-4e1f81836e26-kube-api-access-gk6v5\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.155928 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5d53cf3-113e-4391-b3a9-4e1f81836e26-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " 
pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.155975 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5d53cf3-113e-4391-b3a9-4e1f81836e26-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.156020 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e5d53cf3-113e-4391-b3a9-4e1f81836e26-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.156049 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e5d53cf3-113e-4391-b3a9-4e1f81836e26-kolla-config\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.156075 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e5d53cf3-113e-4391-b3a9-4e1f81836e26-config-data-default\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.156089 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5d53cf3-113e-4391-b3a9-4e1f81836e26-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" 
Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.156118 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-528ccd43-eea9-4215-9d1e-4da3f303db3c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-528ccd43-eea9-4215-9d1e-4da3f303db3c\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.258357 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk6v5\" (UniqueName: \"kubernetes.io/projected/e5d53cf3-113e-4391-b3a9-4e1f81836e26-kube-api-access-gk6v5\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.258413 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5d53cf3-113e-4391-b3a9-4e1f81836e26-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.258457 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5d53cf3-113e-4391-b3a9-4e1f81836e26-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.258501 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e5d53cf3-113e-4391-b3a9-4e1f81836e26-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 
14:18:48.258544 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e5d53cf3-113e-4391-b3a9-4e1f81836e26-kolla-config\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.258576 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e5d53cf3-113e-4391-b3a9-4e1f81836e26-config-data-default\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.258598 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5d53cf3-113e-4391-b3a9-4e1f81836e26-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.258625 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-528ccd43-eea9-4215-9d1e-4da3f303db3c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-528ccd43-eea9-4215-9d1e-4da3f303db3c\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.260984 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e5d53cf3-113e-4391-b3a9-4e1f81836e26-config-data-default\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.261019 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e5d53cf3-113e-4391-b3a9-4e1f81836e26-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.263685 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5d53cf3-113e-4391-b3a9-4e1f81836e26-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.264553 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e5d53cf3-113e-4391-b3a9-4e1f81836e26-kolla-config\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.272383 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5d53cf3-113e-4391-b3a9-4e1f81836e26-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.276516 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.276558 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-528ccd43-eea9-4215-9d1e-4da3f303db3c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-528ccd43-eea9-4215-9d1e-4da3f303db3c\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/89901677853a048b74c0117bd23de07724e79b5f56f64f8503c2bc2692c943e7/globalmount\"" pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.289198 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5d53cf3-113e-4391-b3a9-4e1f81836e26-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.294246 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk6v5\" (UniqueName: \"kubernetes.io/projected/e5d53cf3-113e-4391-b3a9-4e1f81836e26-kube-api-access-gk6v5\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.325298 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-528ccd43-eea9-4215-9d1e-4da3f303db3c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-528ccd43-eea9-4215-9d1e-4da3f303db3c\") pod \"openstack-galera-0\" (UID: \"e5d53cf3-113e-4391-b3a9-4e1f81836e26\") " pod="openstack/openstack-galera-0" Mar 13 14:18:48 crc kubenswrapper[4898]: I0313 14:18:48.617430 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.134180 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.134264 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.507032 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.511926 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.515464 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.516159 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.516459 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-lmh2w" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.516774 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.518615 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.596109 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.596194 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.596222 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.596260 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfbqp\" (UniqueName: \"kubernetes.io/projected/6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f-kube-api-access-dfbqp\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.596300 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.596384 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-225af9fc-6e90-4484-ba03-b804922e0132\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-225af9fc-6e90-4484-ba03-b804922e0132\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.596413 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.596436 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.698285 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-225af9fc-6e90-4484-ba03-b804922e0132\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-225af9fc-6e90-4484-ba03-b804922e0132\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.698348 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.698375 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.698476 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.698530 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.698554 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.698609 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfbqp\" (UniqueName: \"kubernetes.io/projected/6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f-kube-api-access-dfbqp\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.698656 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.700290 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.701613 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.706624 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.707829 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.709101 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.709642 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.709690 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-225af9fc-6e90-4484-ba03-b804922e0132\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-225af9fc-6e90-4484-ba03-b804922e0132\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/94535bbcb82a4e5c35e582e7068f21e1d0f0f4fb265f745dad2e0267fea0e923/globalmount\"" pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.711913 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.734844 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfbqp\" (UniqueName: \"kubernetes.io/projected/6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f-kube-api-access-dfbqp\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.798093 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-225af9fc-6e90-4484-ba03-b804922e0132\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-225af9fc-6e90-4484-ba03-b804922e0132\") pod \"openstack-cell1-galera-0\" (UID: \"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f\") " pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.824915 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.827892 4898 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.833825 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.834206 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.834944 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-6q4ph" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.836673 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.845557 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.909998 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/67ef28b0-acc3-400e-8296-a541fc3b89f0-memcached-tls-certs\") pod \"memcached-0\" (UID: \"67ef28b0-acc3-400e-8296-a541fc3b89f0\") " pod="openstack/memcached-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.910073 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/67ef28b0-acc3-400e-8296-a541fc3b89f0-kolla-config\") pod \"memcached-0\" (UID: \"67ef28b0-acc3-400e-8296-a541fc3b89f0\") " pod="openstack/memcached-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.910090 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/67ef28b0-acc3-400e-8296-a541fc3b89f0-config-data\") pod \"memcached-0\" (UID: 
\"67ef28b0-acc3-400e-8296-a541fc3b89f0\") " pod="openstack/memcached-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.910108 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdjbw\" (UniqueName: \"kubernetes.io/projected/67ef28b0-acc3-400e-8296-a541fc3b89f0-kube-api-access-pdjbw\") pod \"memcached-0\" (UID: \"67ef28b0-acc3-400e-8296-a541fc3b89f0\") " pod="openstack/memcached-0" Mar 13 14:18:49 crc kubenswrapper[4898]: I0313 14:18:49.910128 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67ef28b0-acc3-400e-8296-a541fc3b89f0-combined-ca-bundle\") pod \"memcached-0\" (UID: \"67ef28b0-acc3-400e-8296-a541fc3b89f0\") " pod="openstack/memcached-0" Mar 13 14:18:50 crc kubenswrapper[4898]: I0313 14:18:50.012814 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/67ef28b0-acc3-400e-8296-a541fc3b89f0-memcached-tls-certs\") pod \"memcached-0\" (UID: \"67ef28b0-acc3-400e-8296-a541fc3b89f0\") " pod="openstack/memcached-0" Mar 13 14:18:50 crc kubenswrapper[4898]: I0313 14:18:50.012995 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/67ef28b0-acc3-400e-8296-a541fc3b89f0-kolla-config\") pod \"memcached-0\" (UID: \"67ef28b0-acc3-400e-8296-a541fc3b89f0\") " pod="openstack/memcached-0" Mar 13 14:18:50 crc kubenswrapper[4898]: I0313 14:18:50.013023 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/67ef28b0-acc3-400e-8296-a541fc3b89f0-config-data\") pod \"memcached-0\" (UID: \"67ef28b0-acc3-400e-8296-a541fc3b89f0\") " pod="openstack/memcached-0" Mar 13 14:18:50 crc kubenswrapper[4898]: I0313 14:18:50.013051 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pdjbw\" (UniqueName: \"kubernetes.io/projected/67ef28b0-acc3-400e-8296-a541fc3b89f0-kube-api-access-pdjbw\") pod \"memcached-0\" (UID: \"67ef28b0-acc3-400e-8296-a541fc3b89f0\") " pod="openstack/memcached-0" Mar 13 14:18:50 crc kubenswrapper[4898]: I0313 14:18:50.014098 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/67ef28b0-acc3-400e-8296-a541fc3b89f0-kolla-config\") pod \"memcached-0\" (UID: \"67ef28b0-acc3-400e-8296-a541fc3b89f0\") " pod="openstack/memcached-0" Mar 13 14:18:50 crc kubenswrapper[4898]: I0313 14:18:50.014197 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/67ef28b0-acc3-400e-8296-a541fc3b89f0-config-data\") pod \"memcached-0\" (UID: \"67ef28b0-acc3-400e-8296-a541fc3b89f0\") " pod="openstack/memcached-0" Mar 13 14:18:50 crc kubenswrapper[4898]: I0313 14:18:50.014240 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67ef28b0-acc3-400e-8296-a541fc3b89f0-combined-ca-bundle\") pod \"memcached-0\" (UID: \"67ef28b0-acc3-400e-8296-a541fc3b89f0\") " pod="openstack/memcached-0" Mar 13 14:18:50 crc kubenswrapper[4898]: I0313 14:18:50.037871 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/67ef28b0-acc3-400e-8296-a541fc3b89f0-memcached-tls-certs\") pod \"memcached-0\" (UID: \"67ef28b0-acc3-400e-8296-a541fc3b89f0\") " pod="openstack/memcached-0" Mar 13 14:18:50 crc kubenswrapper[4898]: I0313 14:18:50.038044 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67ef28b0-acc3-400e-8296-a541fc3b89f0-combined-ca-bundle\") pod \"memcached-0\" (UID: \"67ef28b0-acc3-400e-8296-a541fc3b89f0\") " 
pod="openstack/memcached-0" Mar 13 14:18:50 crc kubenswrapper[4898]: I0313 14:18:50.047259 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdjbw\" (UniqueName: \"kubernetes.io/projected/67ef28b0-acc3-400e-8296-a541fc3b89f0-kube-api-access-pdjbw\") pod \"memcached-0\" (UID: \"67ef28b0-acc3-400e-8296-a541fc3b89f0\") " pod="openstack/memcached-0" Mar 13 14:18:50 crc kubenswrapper[4898]: I0313 14:18:50.165339 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 13 14:18:52 crc kubenswrapper[4898]: I0313 14:18:52.058306 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 14:18:52 crc kubenswrapper[4898]: I0313 14:18:52.066367 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 13 14:18:52 crc kubenswrapper[4898]: I0313 14:18:52.068405 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-k9zs2" Mar 13 14:18:52 crc kubenswrapper[4898]: I0313 14:18:52.073687 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 14:18:52 crc kubenswrapper[4898]: I0313 14:18:52.182920 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p58m\" (UniqueName: \"kubernetes.io/projected/4e010381-921d-4328-9027-ddb9a54a08bd-kube-api-access-5p58m\") pod \"kube-state-metrics-0\" (UID: \"4e010381-921d-4328-9027-ddb9a54a08bd\") " pod="openstack/kube-state-metrics-0" Mar 13 14:18:52 crc kubenswrapper[4898]: I0313 14:18:52.289985 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p58m\" (UniqueName: \"kubernetes.io/projected/4e010381-921d-4328-9027-ddb9a54a08bd-kube-api-access-5p58m\") pod \"kube-state-metrics-0\" (UID: \"4e010381-921d-4328-9027-ddb9a54a08bd\") " 
pod="openstack/kube-state-metrics-0" Mar 13 14:18:52 crc kubenswrapper[4898]: I0313 14:18:52.319705 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p58m\" (UniqueName: \"kubernetes.io/projected/4e010381-921d-4328-9027-ddb9a54a08bd-kube-api-access-5p58m\") pod \"kube-state-metrics-0\" (UID: \"4e010381-921d-4328-9027-ddb9a54a08bd\") " pod="openstack/kube-state-metrics-0" Mar 13 14:18:52 crc kubenswrapper[4898]: I0313 14:18:52.394115 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 13 14:18:52 crc kubenswrapper[4898]: W0313 14:18:52.662204 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fa7fb2f_de19_48b1_8226_d7f85a5f8f2b.slice/crio-6d987febc3dce3076fcb022da351b81dbf39766acfe2182896067f106a438fc2 WatchSource:0}: Error finding container 6d987febc3dce3076fcb022da351b81dbf39766acfe2182896067f106a438fc2: Status 404 returned error can't find the container with id 6d987febc3dce3076fcb022da351b81dbf39766acfe2182896067f106a438fc2 Mar 13 14:18:52 crc kubenswrapper[4898]: W0313 14:18:52.675918 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee084354_4d32_4d3c_96a4_1e4e7eef5d85.slice/crio-f525c8a2018341f2e27494e3687eb7a4563181188f5faaa51225478656a928a1 WatchSource:0}: Error finding container f525c8a2018341f2e27494e3687eb7a4563181188f5faaa51225478656a928a1: Status 404 returned error can't find the container with id f525c8a2018341f2e27494e3687eb7a4563181188f5faaa51225478656a928a1 Mar 13 14:18:52 crc kubenswrapper[4898]: I0313 14:18:52.965891 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-ui-dashboards-66cbf594b5-gpj8b"] Mar 13 14:18:52 crc kubenswrapper[4898]: I0313 14:18:52.967337 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-gpj8b" Mar 13 14:18:52 crc kubenswrapper[4898]: I0313 14:18:52.973650 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards" Mar 13 14:18:52 crc kubenswrapper[4898]: I0313 14:18:52.974350 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards-sa-dockercfg-782s7" Mar 13 14:18:52 crc kubenswrapper[4898]: I0313 14:18:52.979127 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-66cbf594b5-gpj8b"] Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.106723 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad052248-8fcd-4ef6-9969-5023b87bbbf9-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-gpj8b\" (UID: \"ad052248-8fcd-4ef6-9969-5023b87bbbf9\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-gpj8b" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.107054 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6qck\" (UniqueName: \"kubernetes.io/projected/ad052248-8fcd-4ef6-9969-5023b87bbbf9-kube-api-access-h6qck\") pod \"observability-ui-dashboards-66cbf594b5-gpj8b\" (UID: \"ad052248-8fcd-4ef6-9969-5023b87bbbf9\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-gpj8b" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.208346 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad052248-8fcd-4ef6-9969-5023b87bbbf9-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-gpj8b\" (UID: \"ad052248-8fcd-4ef6-9969-5023b87bbbf9\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-gpj8b" 
Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.208410 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6qck\" (UniqueName: \"kubernetes.io/projected/ad052248-8fcd-4ef6-9969-5023b87bbbf9-kube-api-access-h6qck\") pod \"observability-ui-dashboards-66cbf594b5-gpj8b\" (UID: \"ad052248-8fcd-4ef6-9969-5023b87bbbf9\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-gpj8b" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.212227 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"ee084354-4d32-4d3c-96a4-1e4e7eef5d85","Type":"ContainerStarted","Data":"f525c8a2018341f2e27494e3687eb7a4563181188f5faaa51225478656a928a1"} Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.220241 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad052248-8fcd-4ef6-9969-5023b87bbbf9-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-gpj8b\" (UID: \"ad052248-8fcd-4ef6-9969-5023b87bbbf9\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-gpj8b" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.221139 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b","Type":"ContainerStarted","Data":"6d987febc3dce3076fcb022da351b81dbf39766acfe2182896067f106a438fc2"} Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.224444 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"818e3f41-30c4-4a49-b490-0d868fc2b2b8","Type":"ContainerStarted","Data":"6fe4bdbf2db945955ec1dd2e86e519172f05f6c43d7d6ac216668fd59e9bda42"} Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.228199 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"d56bd826-4f42-409d-ae41-9bfc70d1e038","Type":"ContainerStarted","Data":"8cd6b4a73f7f67c36783e2cd3de871dd93389c4f889e74a44de4a7253a7e9a9c"} Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.240447 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6qck\" (UniqueName: \"kubernetes.io/projected/ad052248-8fcd-4ef6-9969-5023b87bbbf9-kube-api-access-h6qck\") pod \"observability-ui-dashboards-66cbf594b5-gpj8b\" (UID: \"ad052248-8fcd-4ef6-9969-5023b87bbbf9\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-gpj8b" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.285245 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-gpj8b" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.302565 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-699d95d586-ds75f"] Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.303855 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.310606 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-699d95d586-ds75f"] Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.353865 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.367762 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.373146 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.373450 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-g7dw2" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.373555 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.373590 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.373290 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.380984 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.389250 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.405221 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.414161 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f526abbc-e646-48b4-afa8-7f95f4a607a0-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 
14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.414221 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f526abbc-e646-48b4-afa8-7f95f4a607a0-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.414245 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f526abbc-e646-48b4-afa8-7f95f4a607a0-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.414269 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ab8664f8-1960-4442-9fdd-9711ec963e1f-console-serving-cert\") pod \"console-699d95d586-ds75f\" (UID: \"ab8664f8-1960-4442-9fdd-9711ec963e1f\") " pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.414320 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/f526abbc-e646-48b4-afa8-7f95f4a607a0-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.414354 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k845s\" (UniqueName: \"kubernetes.io/projected/ab8664f8-1960-4442-9fdd-9711ec963e1f-kube-api-access-k845s\") pod \"console-699d95d586-ds75f\" (UID: 
\"ab8664f8-1960-4442-9fdd-9711ec963e1f\") " pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.414409 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab8664f8-1960-4442-9fdd-9711ec963e1f-trusted-ca-bundle\") pod \"console-699d95d586-ds75f\" (UID: \"ab8664f8-1960-4442-9fdd-9711ec963e1f\") " pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.414458 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68mzv\" (UniqueName: \"kubernetes.io/projected/f526abbc-e646-48b4-afa8-7f95f4a607a0-kube-api-access-68mzv\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.414496 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/f526abbc-e646-48b4-afa8-7f95f4a607a0-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.414518 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f526abbc-e646-48b4-afa8-7f95f4a607a0-config\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.414551 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/ab8664f8-1960-4442-9fdd-9711ec963e1f-console-config\") pod \"console-699d95d586-ds75f\" (UID: \"ab8664f8-1960-4442-9fdd-9711ec963e1f\") " pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.414574 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f526abbc-e646-48b4-afa8-7f95f4a607a0-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.414611 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ab8664f8-1960-4442-9fdd-9711ec963e1f-service-ca\") pod \"console-699d95d586-ds75f\" (UID: \"ab8664f8-1960-4442-9fdd-9711ec963e1f\") " pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.414631 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f526abbc-e646-48b4-afa8-7f95f4a607a0-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.414656 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ab8664f8-1960-4442-9fdd-9711ec963e1f-console-oauth-config\") pod \"console-699d95d586-ds75f\" (UID: \"ab8664f8-1960-4442-9fdd-9711ec963e1f\") " pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.414683 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ab8664f8-1960-4442-9fdd-9711ec963e1f-oauth-serving-cert\") pod \"console-699d95d586-ds75f\" (UID: \"ab8664f8-1960-4442-9fdd-9711ec963e1f\") " pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.414717 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-537e992e-0c7e-4e28-8105-b535a72a793c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.415113 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.515955 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab8664f8-1960-4442-9fdd-9711ec963e1f-trusted-ca-bundle\") pod \"console-699d95d586-ds75f\" (UID: \"ab8664f8-1960-4442-9fdd-9711ec963e1f\") " pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.516009 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68mzv\" (UniqueName: \"kubernetes.io/projected/f526abbc-e646-48b4-afa8-7f95f4a607a0-kube-api-access-68mzv\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.516046 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: 
\"kubernetes.io/configmap/f526abbc-e646-48b4-afa8-7f95f4a607a0-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.516069 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f526abbc-e646-48b4-afa8-7f95f4a607a0-config\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.516099 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ab8664f8-1960-4442-9fdd-9711ec963e1f-console-config\") pod \"console-699d95d586-ds75f\" (UID: \"ab8664f8-1960-4442-9fdd-9711ec963e1f\") " pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.516117 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f526abbc-e646-48b4-afa8-7f95f4a607a0-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.516148 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ab8664f8-1960-4442-9fdd-9711ec963e1f-service-ca\") pod \"console-699d95d586-ds75f\" (UID: \"ab8664f8-1960-4442-9fdd-9711ec963e1f\") " pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.516168 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/f526abbc-e646-48b4-afa8-7f95f4a607a0-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.516185 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ab8664f8-1960-4442-9fdd-9711ec963e1f-console-oauth-config\") pod \"console-699d95d586-ds75f\" (UID: \"ab8664f8-1960-4442-9fdd-9711ec963e1f\") " pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.516207 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ab8664f8-1960-4442-9fdd-9711ec963e1f-oauth-serving-cert\") pod \"console-699d95d586-ds75f\" (UID: \"ab8664f8-1960-4442-9fdd-9711ec963e1f\") " pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.516235 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-537e992e-0c7e-4e28-8105-b535a72a793c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.516259 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f526abbc-e646-48b4-afa8-7f95f4a607a0-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.516280 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"web-config\" (UniqueName: \"kubernetes.io/secret/f526abbc-e646-48b4-afa8-7f95f4a607a0-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.516295 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f526abbc-e646-48b4-afa8-7f95f4a607a0-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.516312 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ab8664f8-1960-4442-9fdd-9711ec963e1f-console-serving-cert\") pod \"console-699d95d586-ds75f\" (UID: \"ab8664f8-1960-4442-9fdd-9711ec963e1f\") " pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.516346 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/f526abbc-e646-48b4-afa8-7f95f4a607a0-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.516375 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k845s\" (UniqueName: \"kubernetes.io/projected/ab8664f8-1960-4442-9fdd-9711ec963e1f-kube-api-access-k845s\") pod \"console-699d95d586-ds75f\" (UID: \"ab8664f8-1960-4442-9fdd-9711ec963e1f\") " pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.517570 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ab8664f8-1960-4442-9fdd-9711ec963e1f-trusted-ca-bundle\") pod \"console-699d95d586-ds75f\" (UID: \"ab8664f8-1960-4442-9fdd-9711ec963e1f\") " pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.518175 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/f526abbc-e646-48b4-afa8-7f95f4a607a0-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.519062 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ab8664f8-1960-4442-9fdd-9711ec963e1f-service-ca\") pod \"console-699d95d586-ds75f\" (UID: \"ab8664f8-1960-4442-9fdd-9711ec963e1f\") " pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.519536 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ab8664f8-1960-4442-9fdd-9711ec963e1f-oauth-serving-cert\") pod \"console-699d95d586-ds75f\" (UID: \"ab8664f8-1960-4442-9fdd-9711ec963e1f\") " pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.519586 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ab8664f8-1960-4442-9fdd-9711ec963e1f-console-config\") pod \"console-699d95d586-ds75f\" (UID: \"ab8664f8-1960-4442-9fdd-9711ec963e1f\") " pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.520007 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" 
(UniqueName: \"kubernetes.io/configmap/f526abbc-e646-48b4-afa8-7f95f4a607a0-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.520397 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/f526abbc-e646-48b4-afa8-7f95f4a607a0-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.520827 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f526abbc-e646-48b4-afa8-7f95f4a607a0-config\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.521717 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f526abbc-e646-48b4-afa8-7f95f4a607a0-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.522577 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f526abbc-e646-48b4-afa8-7f95f4a607a0-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.522741 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.522765 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-537e992e-0c7e-4e28-8105-b535a72a793c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7123a2111c2c1fcd673a2fa4cbaef2c14fcdb159a9a269edbe99c5cdea18ee2d/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.523743 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f526abbc-e646-48b4-afa8-7f95f4a607a0-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.525443 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ab8664f8-1960-4442-9fdd-9711ec963e1f-console-oauth-config\") pod \"console-699d95d586-ds75f\" (UID: \"ab8664f8-1960-4442-9fdd-9711ec963e1f\") " pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.534481 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ab8664f8-1960-4442-9fdd-9711ec963e1f-console-serving-cert\") pod \"console-699d95d586-ds75f\" (UID: \"ab8664f8-1960-4442-9fdd-9711ec963e1f\") " pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.538289 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k845s\" (UniqueName: 
\"kubernetes.io/projected/ab8664f8-1960-4442-9fdd-9711ec963e1f-kube-api-access-k845s\") pod \"console-699d95d586-ds75f\" (UID: \"ab8664f8-1960-4442-9fdd-9711ec963e1f\") " pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.539946 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68mzv\" (UniqueName: \"kubernetes.io/projected/f526abbc-e646-48b4-afa8-7f95f4a607a0-kube-api-access-68mzv\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.545655 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f526abbc-e646-48b4-afa8-7f95f4a607a0-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.578624 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-537e992e-0c7e-4e28-8105-b535a72a793c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c\") pod \"prometheus-metric-storage-0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.624045 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:18:53 crc kubenswrapper[4898]: I0313 14:18:53.684808 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.676658 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.834635 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-j79bj"] Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.836239 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-j79bj" Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.838572 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.838594 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-74z8j" Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.838937 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.843037 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-j79bj"] Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.890317 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-r9tmf"] Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.894742 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.908845 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-r9tmf"] Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.960557 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a506ef1a-354a-49c8-b63d-4db4b9ecdcfe-scripts\") pod \"ovn-controller-j79bj\" (UID: \"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe\") " pod="openstack/ovn-controller-j79bj" Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.960618 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a506ef1a-354a-49c8-b63d-4db4b9ecdcfe-var-log-ovn\") pod \"ovn-controller-j79bj\" (UID: \"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe\") " pod="openstack/ovn-controller-j79bj" Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.960651 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5kxv\" (UniqueName: \"kubernetes.io/projected/a506ef1a-354a-49c8-b63d-4db4b9ecdcfe-kube-api-access-s5kxv\") pod \"ovn-controller-j79bj\" (UID: \"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe\") " pod="openstack/ovn-controller-j79bj" Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.960670 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f71b72a8-f179-454c-8d2e-4ac829842622-var-run\") pod \"ovn-controller-ovs-r9tmf\" (UID: \"f71b72a8-f179-454c-8d2e-4ac829842622\") " pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.960699 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: 
\"kubernetes.io/host-path/f71b72a8-f179-454c-8d2e-4ac829842622-etc-ovs\") pod \"ovn-controller-ovs-r9tmf\" (UID: \"f71b72a8-f179-454c-8d2e-4ac829842622\") " pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.960776 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a506ef1a-354a-49c8-b63d-4db4b9ecdcfe-var-run-ovn\") pod \"ovn-controller-j79bj\" (UID: \"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe\") " pod="openstack/ovn-controller-j79bj" Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.960796 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/a506ef1a-354a-49c8-b63d-4db4b9ecdcfe-ovn-controller-tls-certs\") pod \"ovn-controller-j79bj\" (UID: \"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe\") " pod="openstack/ovn-controller-j79bj" Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.960825 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f71b72a8-f179-454c-8d2e-4ac829842622-var-log\") pod \"ovn-controller-ovs-r9tmf\" (UID: \"f71b72a8-f179-454c-8d2e-4ac829842622\") " pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.960839 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f6dm\" (UniqueName: \"kubernetes.io/projected/f71b72a8-f179-454c-8d2e-4ac829842622-kube-api-access-4f6dm\") pod \"ovn-controller-ovs-r9tmf\" (UID: \"f71b72a8-f179-454c-8d2e-4ac829842622\") " pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.960874 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/f71b72a8-f179-454c-8d2e-4ac829842622-scripts\") pod \"ovn-controller-ovs-r9tmf\" (UID: \"f71b72a8-f179-454c-8d2e-4ac829842622\") " pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.960982 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a506ef1a-354a-49c8-b63d-4db4b9ecdcfe-combined-ca-bundle\") pod \"ovn-controller-j79bj\" (UID: \"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe\") " pod="openstack/ovn-controller-j79bj" Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.961495 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f71b72a8-f179-454c-8d2e-4ac829842622-var-lib\") pod \"ovn-controller-ovs-r9tmf\" (UID: \"f71b72a8-f179-454c-8d2e-4ac829842622\") " pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:18:54 crc kubenswrapper[4898]: I0313 14:18:54.961513 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a506ef1a-354a-49c8-b63d-4db4b9ecdcfe-var-run\") pod \"ovn-controller-j79bj\" (UID: \"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe\") " pod="openstack/ovn-controller-j79bj" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.063877 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f71b72a8-f179-454c-8d2e-4ac829842622-var-log\") pod \"ovn-controller-ovs-r9tmf\" (UID: \"f71b72a8-f179-454c-8d2e-4ac829842622\") " pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.063955 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f6dm\" (UniqueName: \"kubernetes.io/projected/f71b72a8-f179-454c-8d2e-4ac829842622-kube-api-access-4f6dm\") 
pod \"ovn-controller-ovs-r9tmf\" (UID: \"f71b72a8-f179-454c-8d2e-4ac829842622\") " pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.064024 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f71b72a8-f179-454c-8d2e-4ac829842622-scripts\") pod \"ovn-controller-ovs-r9tmf\" (UID: \"f71b72a8-f179-454c-8d2e-4ac829842622\") " pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.064069 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a506ef1a-354a-49c8-b63d-4db4b9ecdcfe-combined-ca-bundle\") pod \"ovn-controller-j79bj\" (UID: \"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe\") " pod="openstack/ovn-controller-j79bj" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.064170 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f71b72a8-f179-454c-8d2e-4ac829842622-var-lib\") pod \"ovn-controller-ovs-r9tmf\" (UID: \"f71b72a8-f179-454c-8d2e-4ac829842622\") " pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.064196 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a506ef1a-354a-49c8-b63d-4db4b9ecdcfe-var-run\") pod \"ovn-controller-j79bj\" (UID: \"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe\") " pod="openstack/ovn-controller-j79bj" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.064263 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a506ef1a-354a-49c8-b63d-4db4b9ecdcfe-scripts\") pod \"ovn-controller-j79bj\" (UID: \"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe\") " pod="openstack/ovn-controller-j79bj" Mar 13 14:18:55 crc kubenswrapper[4898]: 
I0313 14:18:55.064308 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a506ef1a-354a-49c8-b63d-4db4b9ecdcfe-var-log-ovn\") pod \"ovn-controller-j79bj\" (UID: \"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe\") " pod="openstack/ovn-controller-j79bj" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.064348 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5kxv\" (UniqueName: \"kubernetes.io/projected/a506ef1a-354a-49c8-b63d-4db4b9ecdcfe-kube-api-access-s5kxv\") pod \"ovn-controller-j79bj\" (UID: \"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe\") " pod="openstack/ovn-controller-j79bj" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.064378 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f71b72a8-f179-454c-8d2e-4ac829842622-var-run\") pod \"ovn-controller-ovs-r9tmf\" (UID: \"f71b72a8-f179-454c-8d2e-4ac829842622\") " pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.064418 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f71b72a8-f179-454c-8d2e-4ac829842622-etc-ovs\") pod \"ovn-controller-ovs-r9tmf\" (UID: \"f71b72a8-f179-454c-8d2e-4ac829842622\") " pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.064454 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a506ef1a-354a-49c8-b63d-4db4b9ecdcfe-var-run-ovn\") pod \"ovn-controller-j79bj\" (UID: \"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe\") " pod="openstack/ovn-controller-j79bj" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.064473 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a506ef1a-354a-49c8-b63d-4db4b9ecdcfe-ovn-controller-tls-certs\") pod \"ovn-controller-j79bj\" (UID: \"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe\") " pod="openstack/ovn-controller-j79bj" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.066675 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a506ef1a-354a-49c8-b63d-4db4b9ecdcfe-var-run\") pod \"ovn-controller-j79bj\" (UID: \"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe\") " pod="openstack/ovn-controller-j79bj" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.068591 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f71b72a8-f179-454c-8d2e-4ac829842622-scripts\") pod \"ovn-controller-ovs-r9tmf\" (UID: \"f71b72a8-f179-454c-8d2e-4ac829842622\") " pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.068758 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f71b72a8-f179-454c-8d2e-4ac829842622-var-log\") pod \"ovn-controller-ovs-r9tmf\" (UID: \"f71b72a8-f179-454c-8d2e-4ac829842622\") " pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.069195 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f71b72a8-f179-454c-8d2e-4ac829842622-var-lib\") pod \"ovn-controller-ovs-r9tmf\" (UID: \"f71b72a8-f179-454c-8d2e-4ac829842622\") " pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.071317 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a506ef1a-354a-49c8-b63d-4db4b9ecdcfe-combined-ca-bundle\") pod \"ovn-controller-j79bj\" (UID: \"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe\") " 
pod="openstack/ovn-controller-j79bj" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.071446 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a506ef1a-354a-49c8-b63d-4db4b9ecdcfe-scripts\") pod \"ovn-controller-j79bj\" (UID: \"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe\") " pod="openstack/ovn-controller-j79bj" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.071479 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f71b72a8-f179-454c-8d2e-4ac829842622-etc-ovs\") pod \"ovn-controller-ovs-r9tmf\" (UID: \"f71b72a8-f179-454c-8d2e-4ac829842622\") " pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.071525 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f71b72a8-f179-454c-8d2e-4ac829842622-var-run\") pod \"ovn-controller-ovs-r9tmf\" (UID: \"f71b72a8-f179-454c-8d2e-4ac829842622\") " pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.071549 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a506ef1a-354a-49c8-b63d-4db4b9ecdcfe-var-log-ovn\") pod \"ovn-controller-j79bj\" (UID: \"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe\") " pod="openstack/ovn-controller-j79bj" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.071597 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a506ef1a-354a-49c8-b63d-4db4b9ecdcfe-var-run-ovn\") pod \"ovn-controller-j79bj\" (UID: \"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe\") " pod="openstack/ovn-controller-j79bj" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.092977 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/a506ef1a-354a-49c8-b63d-4db4b9ecdcfe-ovn-controller-tls-certs\") pod \"ovn-controller-j79bj\" (UID: \"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe\") " pod="openstack/ovn-controller-j79bj" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.099328 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5kxv\" (UniqueName: \"kubernetes.io/projected/a506ef1a-354a-49c8-b63d-4db4b9ecdcfe-kube-api-access-s5kxv\") pod \"ovn-controller-j79bj\" (UID: \"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe\") " pod="openstack/ovn-controller-j79bj" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.100627 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f6dm\" (UniqueName: \"kubernetes.io/projected/f71b72a8-f179-454c-8d2e-4ac829842622-kube-api-access-4f6dm\") pod \"ovn-controller-ovs-r9tmf\" (UID: \"f71b72a8-f179-454c-8d2e-4ac829842622\") " pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.169053 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-j79bj" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.216971 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.440814 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.443049 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.445758 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-fkzxt" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.445840 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.445850 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.445770 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.446295 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.448976 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.575025 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-61cc1b1a-16a4-4a15-a961-1c115f8ab82c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-61cc1b1a-16a4-4a15-a961-1c115f8ab82c\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.575158 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbn2v\" (UniqueName: \"kubernetes.io/projected/280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10-kube-api-access-vbn2v\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.575182 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10-config\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.575212 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.575227 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.575259 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.575322 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.575353 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.676688 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.676783 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.676812 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.676871 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-61cc1b1a-16a4-4a15-a961-1c115f8ab82c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-61cc1b1a-16a4-4a15-a961-1c115f8ab82c\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.676992 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbn2v\" (UniqueName: 
\"kubernetes.io/projected/280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10-kube-api-access-vbn2v\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.677016 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10-config\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.677050 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.677077 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.678082 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.678463 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10-config\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 
crc kubenswrapper[4898]: I0313 14:18:55.678754 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.681143 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.681172 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-61cc1b1a-16a4-4a15-a961-1c115f8ab82c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-61cc1b1a-16a4-4a15-a961-1c115f8ab82c\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f945661a94c6edc4b169d60f552ff5af0e79f3b05828be23f5404777cfe64975/globalmount\"" pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.681654 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.683098 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.686476 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.699386 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbn2v\" (UniqueName: \"kubernetes.io/projected/280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10-kube-api-access-vbn2v\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.721769 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-61cc1b1a-16a4-4a15-a961-1c115f8ab82c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-61cc1b1a-16a4-4a15-a961-1c115f8ab82c\") pod \"ovsdbserver-nb-0\" (UID: \"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10\") " pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:55 crc kubenswrapper[4898]: I0313 14:18:55.773119 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.094883 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.098264 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.100561 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-7xr2x" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.103185 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.103483 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.103654 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.115467 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.152117 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/111bf23f-be00-46ab-97fe-a36465735164-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.152158 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/111bf23f-be00-46ab-97fe-a36465735164-config\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.152187 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b47f8b75-85d2-4ff3-8129-97e8e65a1c3b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b47f8b75-85d2-4ff3-8129-97e8e65a1c3b\") pod 
\"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.152211 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/111bf23f-be00-46ab-97fe-a36465735164-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.152558 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/111bf23f-be00-46ab-97fe-a36465735164-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.152894 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/111bf23f-be00-46ab-97fe-a36465735164-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.153203 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k66bs\" (UniqueName: \"kubernetes.io/projected/111bf23f-be00-46ab-97fe-a36465735164-kube-api-access-k66bs\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.153340 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/111bf23f-be00-46ab-97fe-a36465735164-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.254205 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/111bf23f-be00-46ab-97fe-a36465735164-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.254257 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/111bf23f-be00-46ab-97fe-a36465735164-config\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.254293 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b47f8b75-85d2-4ff3-8129-97e8e65a1c3b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b47f8b75-85d2-4ff3-8129-97e8e65a1c3b\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.254327 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/111bf23f-be00-46ab-97fe-a36465735164-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.254391 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/111bf23f-be00-46ab-97fe-a36465735164-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 
14:18:59.254454 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/111bf23f-be00-46ab-97fe-a36465735164-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.254491 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k66bs\" (UniqueName: \"kubernetes.io/projected/111bf23f-be00-46ab-97fe-a36465735164-kube-api-access-k66bs\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.254591 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/111bf23f-be00-46ab-97fe-a36465735164-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.255937 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/111bf23f-be00-46ab-97fe-a36465735164-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.256964 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/111bf23f-be00-46ab-97fe-a36465735164-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.258759 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.258792 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b47f8b75-85d2-4ff3-8129-97e8e65a1c3b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b47f8b75-85d2-4ff3-8129-97e8e65a1c3b\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5552e6bada886dd6f04199b2714e9f5be58976ce3ffc4ce0948de79ca5058217/globalmount\"" pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.262577 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/111bf23f-be00-46ab-97fe-a36465735164-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.264248 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/111bf23f-be00-46ab-97fe-a36465735164-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.264499 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/111bf23f-be00-46ab-97fe-a36465735164-config\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.270441 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/111bf23f-be00-46ab-97fe-a36465735164-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " 
pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.283740 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k66bs\" (UniqueName: \"kubernetes.io/projected/111bf23f-be00-46ab-97fe-a36465735164-kube-api-access-k66bs\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.305451 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b47f8b75-85d2-4ff3-8129-97e8e65a1c3b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b47f8b75-85d2-4ff3-8129-97e8e65a1c3b\") pod \"ovsdbserver-sb-0\" (UID: \"111bf23f-be00-46ab-97fe-a36465735164\") " pod="openstack/ovsdbserver-sb-0" Mar 13 14:18:59 crc kubenswrapper[4898]: I0313 14:18:59.415250 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 13 14:19:03 crc kubenswrapper[4898]: I0313 14:19:03.558864 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 13 14:19:03 crc kubenswrapper[4898]: I0313 14:19:03.677585 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 14:19:07 crc kubenswrapper[4898]: E0313 14:19:07.702250 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 13 14:19:07 crc kubenswrapper[4898]: E0313 14:19:07.702738 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 
--log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sn79l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-hk9w4_openstack(9e544d1f-357e-4751-88bb-5108430b52cb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:19:07 crc 
kubenswrapper[4898]: E0313 14:19:07.704406 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" podUID="9e544d1f-357e-4751-88bb-5108430b52cb" Mar 13 14:19:07 crc kubenswrapper[4898]: E0313 14:19:07.707409 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 13 14:19:07 crc kubenswrapper[4898]: E0313 14:19:07.707563 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v5ll2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-nmlp6_openstack(c17db307-7a8a-4585-9696-a9ef96b6ba0b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:19:07 crc kubenswrapper[4898]: E0313 14:19:07.708825 4898 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" podUID="c17db307-7a8a-4585-9696-a9ef96b6ba0b" Mar 13 14:19:07 crc kubenswrapper[4898]: E0313 14:19:07.709309 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 13 14:19:07 crc kubenswrapper[4898]: E0313 14:19:07.709420 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-glpmk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-npnz4_openstack(b5005de8-b440-45e8-a1a7-7943f68bff2f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:19:07 crc kubenswrapper[4898]: E0313 14:19:07.710674 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-npnz4" podUID="b5005de8-b440-45e8-a1a7-7943f68bff2f" Mar 13 14:19:08 crc kubenswrapper[4898]: E0313 14:19:08.377027 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" podUID="c17db307-7a8a-4585-9696-a9ef96b6ba0b" Mar 13 14:19:08 crc kubenswrapper[4898]: E0313 14:19:08.377033 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" podUID="9e544d1f-357e-4751-88bb-5108430b52cb" Mar 13 14:19:08 crc kubenswrapper[4898]: W0313 14:19:08.845656 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e010381_921d_4328_9027_ddb9a54a08bd.slice/crio-a514287f4abd02abdb35f5efc576784d286f154281545d6e3b18397fdacfa325 WatchSource:0}: Error finding container a514287f4abd02abdb35f5efc576784d286f154281545d6e3b18397fdacfa325: Status 404 returned error can't find the container with id a514287f4abd02abdb35f5efc576784d286f154281545d6e3b18397fdacfa325 Mar 13 14:19:08 crc kubenswrapper[4898]: E0313 14:19:08.877381 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 13 14:19:08 crc kubenswrapper[4898]: E0313 14:19:08.877865 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* 
--conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h6xth,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
dnsmasq-dns-78dd6ddcc-5jhct_openstack(70dc5baf-6ae1-41b4-9454-8ff891570f8b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:19:08 crc kubenswrapper[4898]: E0313 14:19:08.879869 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-5jhct" podUID="70dc5baf-6ae1-41b4-9454-8ff891570f8b" Mar 13 14:19:09 crc kubenswrapper[4898]: I0313 14:19:09.133506 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-npnz4" Mar 13 14:19:09 crc kubenswrapper[4898]: I0313 14:19:09.213367 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glpmk\" (UniqueName: \"kubernetes.io/projected/b5005de8-b440-45e8-a1a7-7943f68bff2f-kube-api-access-glpmk\") pod \"b5005de8-b440-45e8-a1a7-7943f68bff2f\" (UID: \"b5005de8-b440-45e8-a1a7-7943f68bff2f\") " Mar 13 14:19:09 crc kubenswrapper[4898]: I0313 14:19:09.213529 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5005de8-b440-45e8-a1a7-7943f68bff2f-config\") pod \"b5005de8-b440-45e8-a1a7-7943f68bff2f\" (UID: \"b5005de8-b440-45e8-a1a7-7943f68bff2f\") " Mar 13 14:19:09 crc kubenswrapper[4898]: I0313 14:19:09.215073 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5005de8-b440-45e8-a1a7-7943f68bff2f-config" (OuterVolumeSpecName: "config") pod "b5005de8-b440-45e8-a1a7-7943f68bff2f" (UID: "b5005de8-b440-45e8-a1a7-7943f68bff2f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:09 crc kubenswrapper[4898]: I0313 14:19:09.219596 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5005de8-b440-45e8-a1a7-7943f68bff2f-kube-api-access-glpmk" (OuterVolumeSpecName: "kube-api-access-glpmk") pod "b5005de8-b440-45e8-a1a7-7943f68bff2f" (UID: "b5005de8-b440-45e8-a1a7-7943f68bff2f"). InnerVolumeSpecName "kube-api-access-glpmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:19:09 crc kubenswrapper[4898]: I0313 14:19:09.316133 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glpmk\" (UniqueName: \"kubernetes.io/projected/b5005de8-b440-45e8-a1a7-7943f68bff2f-kube-api-access-glpmk\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:09 crc kubenswrapper[4898]: I0313 14:19:09.316436 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5005de8-b440-45e8-a1a7-7943f68bff2f-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:09 crc kubenswrapper[4898]: I0313 14:19:09.384652 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e5d53cf3-113e-4391-b3a9-4e1f81836e26","Type":"ContainerStarted","Data":"6ec3991f9b81a553bddc7e8dd5637b2e9d2c74118a175bf76255584779d5faf2"} Mar 13 14:19:09 crc kubenswrapper[4898]: I0313 14:19:09.385890 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-npnz4" event={"ID":"b5005de8-b440-45e8-a1a7-7943f68bff2f","Type":"ContainerDied","Data":"64fa491215cb47b1ae2d24aaa439a281949404c6099493e0af3976068bb840f1"} Mar 13 14:19:09 crc kubenswrapper[4898]: I0313 14:19:09.385954 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-npnz4" Mar 13 14:19:09 crc kubenswrapper[4898]: I0313 14:19:09.388887 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4e010381-921d-4328-9027-ddb9a54a08bd","Type":"ContainerStarted","Data":"a514287f4abd02abdb35f5efc576784d286f154281545d6e3b18397fdacfa325"} Mar 13 14:19:09 crc kubenswrapper[4898]: I0313 14:19:09.465602 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-npnz4"] Mar 13 14:19:09 crc kubenswrapper[4898]: I0313 14:19:09.473839 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-npnz4"] Mar 13 14:19:09 crc kubenswrapper[4898]: I0313 14:19:09.754935 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5005de8-b440-45e8-a1a7-7943f68bff2f" path="/var/lib/kubelet/pods/b5005de8-b440-45e8-a1a7-7943f68bff2f/volumes" Mar 13 14:19:10 crc kubenswrapper[4898]: I0313 14:19:10.222825 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-699d95d586-ds75f"] Mar 13 14:19:10 crc kubenswrapper[4898]: I0313 14:19:10.238539 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 14:19:10 crc kubenswrapper[4898]: I0313 14:19:10.462575 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 13 14:19:10 crc kubenswrapper[4898]: I0313 14:19:10.484940 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-66cbf594b5-gpj8b"] Mar 13 14:19:10 crc kubenswrapper[4898]: W0313 14:19:10.534309 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab8664f8_1960_4442_9fdd_9711ec963e1f.slice/crio-75730ddce867cc2a105b1b1dc663ae54eddf209d10b2609fd0662d3aa6f813d4 WatchSource:0}: Error finding container 
75730ddce867cc2a105b1b1dc663ae54eddf209d10b2609fd0662d3aa6f813d4: Status 404 returned error can't find the container with id 75730ddce867cc2a105b1b1dc663ae54eddf209d10b2609fd0662d3aa6f813d4 Mar 13 14:19:10 crc kubenswrapper[4898]: W0313 14:19:10.641804 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf526abbc_e646_48b4_afa8_7f95f4a607a0.slice/crio-e971de8bfe5dba96efe9215fb1b5480439450e667cc052958c51747ab17b2279 WatchSource:0}: Error finding container e971de8bfe5dba96efe9215fb1b5480439450e667cc052958c51747ab17b2279: Status 404 returned error can't find the container with id e971de8bfe5dba96efe9215fb1b5480439450e667cc052958c51747ab17b2279 Mar 13 14:19:10 crc kubenswrapper[4898]: I0313 14:19:10.838176 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 13 14:19:10 crc kubenswrapper[4898]: I0313 14:19:10.853760 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 13 14:19:10 crc kubenswrapper[4898]: I0313 14:19:10.861734 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-j79bj"] Mar 13 14:19:10 crc kubenswrapper[4898]: I0313 14:19:10.938394 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.232197 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-5jhct" Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.377465 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6xth\" (UniqueName: \"kubernetes.io/projected/70dc5baf-6ae1-41b4-9454-8ff891570f8b-kube-api-access-h6xth\") pod \"70dc5baf-6ae1-41b4-9454-8ff891570f8b\" (UID: \"70dc5baf-6ae1-41b4-9454-8ff891570f8b\") " Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.377557 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70dc5baf-6ae1-41b4-9454-8ff891570f8b-dns-svc\") pod \"70dc5baf-6ae1-41b4-9454-8ff891570f8b\" (UID: \"70dc5baf-6ae1-41b4-9454-8ff891570f8b\") " Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.377650 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70dc5baf-6ae1-41b4-9454-8ff891570f8b-config\") pod \"70dc5baf-6ae1-41b4-9454-8ff891570f8b\" (UID: \"70dc5baf-6ae1-41b4-9454-8ff891570f8b\") " Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.378733 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70dc5baf-6ae1-41b4-9454-8ff891570f8b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "70dc5baf-6ae1-41b4-9454-8ff891570f8b" (UID: "70dc5baf-6ae1-41b4-9454-8ff891570f8b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.379224 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70dc5baf-6ae1-41b4-9454-8ff891570f8b-config" (OuterVolumeSpecName: "config") pod "70dc5baf-6ae1-41b4-9454-8ff891570f8b" (UID: "70dc5baf-6ae1-41b4-9454-8ff891570f8b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.386177 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70dc5baf-6ae1-41b4-9454-8ff891570f8b-kube-api-access-h6xth" (OuterVolumeSpecName: "kube-api-access-h6xth") pod "70dc5baf-6ae1-41b4-9454-8ff891570f8b" (UID: "70dc5baf-6ae1-41b4-9454-8ff891570f8b"). InnerVolumeSpecName "kube-api-access-h6xth". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.410339 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"111bf23f-be00-46ab-97fe-a36465735164","Type":"ContainerStarted","Data":"c49368ae5e9c18d425c8b1f12cab5cd1fe934aeb23fb0840878f1b31b1ed9ad0"} Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.411731 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10","Type":"ContainerStarted","Data":"4267217032c3f6604cd9f5db4299e357d2c991800cdea8a6a36a9dcfc0d8c5b4"} Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.412762 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"67ef28b0-acc3-400e-8296-a541fc3b89f0","Type":"ContainerStarted","Data":"ae3b6555dcb0c381cf3215392eba070575db3d273cf0ed579abe9ea6ca84b2d1"} Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.413975 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"ee084354-4d32-4d3c-96a4-1e4e7eef5d85","Type":"ContainerStarted","Data":"319d11416db34d4c2bde21b35bf9b79fc6c55b22cfe14271a9be5dde11f3c078"} Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.416929 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b","Type":"ContainerStarted","Data":"8e18090ad1757c0b15ba6a519121358ec8fea5c9816c6426d3dd165832b431af"} Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.418098 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-699d95d586-ds75f" event={"ID":"ab8664f8-1960-4442-9fdd-9711ec963e1f","Type":"ContainerStarted","Data":"75730ddce867cc2a105b1b1dc663ae54eddf209d10b2609fd0662d3aa6f813d4"} Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.449597 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-j79bj" event={"ID":"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe","Type":"ContainerStarted","Data":"288e2eb84d62aa584721a32f93bb793b9f73c64641090a579d0f6582ae88a0dc"} Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.452133 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f","Type":"ContainerStarted","Data":"70416c4b1d2425f9af76478ec404b3bc01a480bde401890ed595660f8f4ec3f7"} Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.471498 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"818e3f41-30c4-4a49-b490-0d868fc2b2b8","Type":"ContainerStarted","Data":"6ac94c751f27a4d12d02923377c883f4669b7b2f835e8c6d8eb98e37f2b620ef"} Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.476671 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f526abbc-e646-48b4-afa8-7f95f4a607a0","Type":"ContainerStarted","Data":"e971de8bfe5dba96efe9215fb1b5480439450e667cc052958c51747ab17b2279"} Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.479650 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6xth\" (UniqueName: \"kubernetes.io/projected/70dc5baf-6ae1-41b4-9454-8ff891570f8b-kube-api-access-h6xth\") on node \"crc\" DevicePath \"\"" Mar 13 
14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.479675 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/70dc5baf-6ae1-41b4-9454-8ff891570f8b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.479683 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70dc5baf-6ae1-41b4-9454-8ff891570f8b-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.481603 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d56bd826-4f42-409d-ae41-9bfc70d1e038","Type":"ContainerStarted","Data":"cb002d235371a7e7beebe07dd448307d31c6dae66e8fbd1dd6c0c499e634cca9"} Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.484631 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-5jhct" event={"ID":"70dc5baf-6ae1-41b4-9454-8ff891570f8b","Type":"ContainerDied","Data":"41cc016dc80d4b785612e7bb953345c49daec5b78f73d536749225e85b77217d"} Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.484754 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-5jhct" Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.495448 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-gpj8b" event={"ID":"ad052248-8fcd-4ef6-9969-5023b87bbbf9","Type":"ContainerStarted","Data":"802713d4b4783f0e385efee7ea2662a5e8b02ad36e51a0e58b61695a8eb8808e"} Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.533101 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-r9tmf"] Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.589428 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-5jhct"] Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.596928 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-5jhct"] Mar 13 14:19:11 crc kubenswrapper[4898]: I0313 14:19:11.750190 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70dc5baf-6ae1-41b4-9454-8ff891570f8b" path="/var/lib/kubelet/pods/70dc5baf-6ae1-41b4-9454-8ff891570f8b/volumes" Mar 13 14:19:13 crc kubenswrapper[4898]: W0313 14:19:13.365175 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf71b72a8_f179_454c_8d2e_4ac829842622.slice/crio-8a049a1d82afb7e4921f4fb9b45ea37c110c2288d9057d078bdbde6ee61f93e1 WatchSource:0}: Error finding container 8a049a1d82afb7e4921f4fb9b45ea37c110c2288d9057d078bdbde6ee61f93e1: Status 404 returned error can't find the container with id 8a049a1d82afb7e4921f4fb9b45ea37c110c2288d9057d078bdbde6ee61f93e1 Mar 13 14:19:13 crc kubenswrapper[4898]: I0313 14:19:13.536874 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-r9tmf" event={"ID":"f71b72a8-f179-454c-8d2e-4ac829842622","Type":"ContainerStarted","Data":"8a049a1d82afb7e4921f4fb9b45ea37c110c2288d9057d078bdbde6ee61f93e1"} 
Mar 13 14:19:14 crc kubenswrapper[4898]: I0313 14:19:14.552031 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-699d95d586-ds75f" event={"ID":"ab8664f8-1960-4442-9fdd-9711ec963e1f","Type":"ContainerStarted","Data":"c65020321e952c46cfee20714212dd17d9bd0026593d0fedff27cbf44cc73c5e"} Mar 13 14:19:14 crc kubenswrapper[4898]: I0313 14:19:14.574411 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-699d95d586-ds75f" podStartSLOduration=21.574393774 podStartE2EDuration="21.574393774s" podCreationTimestamp="2026-03-13 14:18:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:19:14.56849509 +0000 UTC m=+1389.570083339" watchObservedRunningTime="2026-03-13 14:19:14.574393774 +0000 UTC m=+1389.575982013" Mar 13 14:19:18 crc kubenswrapper[4898]: I0313 14:19:18.587727 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e5d53cf3-113e-4391-b3a9-4e1f81836e26","Type":"ContainerStarted","Data":"6dc134313604b68c8ab10e2c392729441b9bc5e45ec849a53ceed430e93e429c"} Mar 13 14:19:18 crc kubenswrapper[4898]: I0313 14:19:18.590204 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"67ef28b0-acc3-400e-8296-a541fc3b89f0","Type":"ContainerStarted","Data":"622d6a713b41e4b6a008d0e3e85be32b17a6ccd982c87f9e945074ffe15804c7"} Mar 13 14:19:18 crc kubenswrapper[4898]: I0313 14:19:18.590311 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 13 14:19:18 crc kubenswrapper[4898]: I0313 14:19:18.592553 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f","Type":"ContainerStarted","Data":"54cdc8923dc9b1d51b7601a430097ec42b0a894296bfedbe8fc5fc74a3adf43a"} Mar 13 14:19:18 crc 
kubenswrapper[4898]: I0313 14:19:18.678096 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=23.33474511 podStartE2EDuration="29.678064569s" podCreationTimestamp="2026-03-13 14:18:49 +0000 UTC" firstStartedPulling="2026-03-13 14:19:11.131392552 +0000 UTC m=+1386.132980791" lastFinishedPulling="2026-03-13 14:19:17.474712001 +0000 UTC m=+1392.476300250" observedRunningTime="2026-03-13 14:19:18.671501619 +0000 UTC m=+1393.673089978" watchObservedRunningTime="2026-03-13 14:19:18.678064569 +0000 UTC m=+1393.679652818" Mar 13 14:19:19 crc kubenswrapper[4898]: I0313 14:19:19.134368 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:19:19 crc kubenswrapper[4898]: I0313 14:19:19.134633 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:19:19 crc kubenswrapper[4898]: I0313 14:19:19.134671 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 14:19:19 crc kubenswrapper[4898]: I0313 14:19:19.135208 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"37bdbe6f1a65f1530746827b4e6d1dd1ce95edb9a913051fc8fca9a782787e56"} pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 14:19:19 crc 
kubenswrapper[4898]: I0313 14:19:19.135258 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" containerID="cri-o://37bdbe6f1a65f1530746827b4e6d1dd1ce95edb9a913051fc8fca9a782787e56" gracePeriod=600 Mar 13 14:19:19 crc kubenswrapper[4898]: I0313 14:19:19.602959 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4e010381-921d-4328-9027-ddb9a54a08bd","Type":"ContainerStarted","Data":"71740b094889fce6ef9ef07ec41cbfdf46a3a1807d9a456b2458bac02fa10682"} Mar 13 14:19:19 crc kubenswrapper[4898]: I0313 14:19:19.603311 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 13 14:19:19 crc kubenswrapper[4898]: I0313 14:19:19.605335 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-j79bj" event={"ID":"a506ef1a-354a-49c8-b63d-4db4b9ecdcfe","Type":"ContainerStarted","Data":"b8b9af01ef6e79d983db7f1a267b8c0ddee0b64921a80f09baaea6eeb244cf39"} Mar 13 14:19:19 crc kubenswrapper[4898]: I0313 14:19:19.605477 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-j79bj" Mar 13 14:19:19 crc kubenswrapper[4898]: I0313 14:19:19.611522 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-gpj8b" event={"ID":"ad052248-8fcd-4ef6-9969-5023b87bbbf9","Type":"ContainerStarted","Data":"715044d0f82cba4ad9ee7589454dd020782348cdb3c4d3ef7b2ac338f04494fc"} Mar 13 14:19:19 crc kubenswrapper[4898]: I0313 14:19:19.624026 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"111bf23f-be00-46ab-97fe-a36465735164","Type":"ContainerStarted","Data":"1bdc27d4998e62d32f402f9a321a4b390dbc37da79acbfdca74ec1b98d4b82b8"} Mar 13 14:19:19 crc 
kubenswrapper[4898]: I0313 14:19:19.626796 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=19.021975356 podStartE2EDuration="27.626777686s" podCreationTimestamp="2026-03-13 14:18:52 +0000 UTC" firstStartedPulling="2026-03-13 14:19:08.870378584 +0000 UTC m=+1383.871966823" lastFinishedPulling="2026-03-13 14:19:17.475180914 +0000 UTC m=+1392.476769153" observedRunningTime="2026-03-13 14:19:19.616885308 +0000 UTC m=+1394.618473557" watchObservedRunningTime="2026-03-13 14:19:19.626777686 +0000 UTC m=+1394.628365925" Mar 13 14:19:19 crc kubenswrapper[4898]: I0313 14:19:19.629164 4898 generic.go:334] "Generic (PLEG): container finished" podID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerID="37bdbe6f1a65f1530746827b4e6d1dd1ce95edb9a913051fc8fca9a782787e56" exitCode=0 Mar 13 14:19:19 crc kubenswrapper[4898]: I0313 14:19:19.629220 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerDied","Data":"37bdbe6f1a65f1530746827b4e6d1dd1ce95edb9a913051fc8fca9a782787e56"} Mar 13 14:19:19 crc kubenswrapper[4898]: I0313 14:19:19.629244 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc"} Mar 13 14:19:19 crc kubenswrapper[4898]: I0313 14:19:19.629264 4898 scope.go:117] "RemoveContainer" containerID="7b5d3972dfd92a1b971338153ac5467cf67b2057ca35cfb382b56be42ddca2ed" Mar 13 14:19:19 crc kubenswrapper[4898]: I0313 14:19:19.632555 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10","Type":"ContainerStarted","Data":"67f45ce6b97f65b84a75fb6f6c2bea4e76e105439c28870beec523190f47a249"} Mar 13 14:19:19 crc kubenswrapper[4898]: I0313 14:19:19.641172 4898 generic.go:334] "Generic (PLEG): container finished" podID="f71b72a8-f179-454c-8d2e-4ac829842622" containerID="8a94a2d331eb5334e6859d4ae92c26ca4ea0f2be05f7a83373164a2e1e6a9044" exitCode=0 Mar 13 14:19:19 crc kubenswrapper[4898]: I0313 14:19:19.641304 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-r9tmf" event={"ID":"f71b72a8-f179-454c-8d2e-4ac829842622","Type":"ContainerDied","Data":"8a94a2d331eb5334e6859d4ae92c26ca4ea0f2be05f7a83373164a2e1e6a9044"} Mar 13 14:19:19 crc kubenswrapper[4898]: I0313 14:19:19.664348 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-j79bj" podStartSLOduration=19.28630987 podStartE2EDuration="25.664301821s" podCreationTimestamp="2026-03-13 14:18:54 +0000 UTC" firstStartedPulling="2026-03-13 14:19:11.141188777 +0000 UTC m=+1386.142777016" lastFinishedPulling="2026-03-13 14:19:17.519180728 +0000 UTC m=+1392.520768967" observedRunningTime="2026-03-13 14:19:19.632631558 +0000 UTC m=+1394.634219817" watchObservedRunningTime="2026-03-13 14:19:19.664301821 +0000 UTC m=+1394.665890060" Mar 13 14:19:19 crc kubenswrapper[4898]: I0313 14:19:19.671977 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-gpj8b" podStartSLOduration=21.31782317 podStartE2EDuration="27.671961901s" podCreationTimestamp="2026-03-13 14:18:52 +0000 UTC" firstStartedPulling="2026-03-13 14:19:11.131346921 +0000 UTC m=+1386.132935160" lastFinishedPulling="2026-03-13 14:19:17.485485652 +0000 UTC m=+1392.487073891" observedRunningTime="2026-03-13 14:19:19.645846061 +0000 UTC m=+1394.647434320" watchObservedRunningTime="2026-03-13 14:19:19.671961901 +0000 UTC m=+1394.673550140" Mar 13 
14:19:20 crc kubenswrapper[4898]: I0313 14:19:20.654490 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-r9tmf" event={"ID":"f71b72a8-f179-454c-8d2e-4ac829842622","Type":"ContainerStarted","Data":"30c5b975cfcebf4558f9c261801ff9f8e434560b7c84e19be8f58b704459d95f"} Mar 13 14:19:21 crc kubenswrapper[4898]: I0313 14:19:21.667981 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f526abbc-e646-48b4-afa8-7f95f4a607a0","Type":"ContainerStarted","Data":"17895f997a596c3c89bff64ca8cbbbc2b8be5cd3c6e6642232e8f78d56b48759"} Mar 13 14:19:21 crc kubenswrapper[4898]: I0313 14:19:21.668113 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="f526abbc-e646-48b4-afa8-7f95f4a607a0" containerName="init-config-reloader" containerID="cri-o://17895f997a596c3c89bff64ca8cbbbc2b8be5cd3c6e6642232e8f78d56b48759" gracePeriod=600 Mar 13 14:19:22 crc kubenswrapper[4898]: I0313 14:19:22.682945 4898 generic.go:334] "Generic (PLEG): container finished" podID="9e544d1f-357e-4751-88bb-5108430b52cb" containerID="721483b50e427e53b8e6dda1265ac58f303bb7f78b9d071fabeacfc95fd162d8" exitCode=0 Mar 13 14:19:22 crc kubenswrapper[4898]: I0313 14:19:22.683004 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" event={"ID":"9e544d1f-357e-4751-88bb-5108430b52cb","Type":"ContainerDied","Data":"721483b50e427e53b8e6dda1265ac58f303bb7f78b9d071fabeacfc95fd162d8"} Mar 13 14:19:22 crc kubenswrapper[4898]: I0313 14:19:22.686840 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-r9tmf" event={"ID":"f71b72a8-f179-454c-8d2e-4ac829842622","Type":"ContainerStarted","Data":"9b97742be64da9f855d3a6747c7c0629c74485bfa790a873f4b95940c778cd26"} Mar 13 14:19:22 crc kubenswrapper[4898]: I0313 14:19:22.687088 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:19:22 crc kubenswrapper[4898]: I0313 14:19:22.691164 4898 generic.go:334] "Generic (PLEG): container finished" podID="6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f" containerID="54cdc8923dc9b1d51b7601a430097ec42b0a894296bfedbe8fc5fc74a3adf43a" exitCode=0 Mar 13 14:19:22 crc kubenswrapper[4898]: I0313 14:19:22.691257 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f","Type":"ContainerDied","Data":"54cdc8923dc9b1d51b7601a430097ec42b0a894296bfedbe8fc5fc74a3adf43a"} Mar 13 14:19:22 crc kubenswrapper[4898]: I0313 14:19:22.694579 4898 generic.go:334] "Generic (PLEG): container finished" podID="e5d53cf3-113e-4391-b3a9-4e1f81836e26" containerID="6dc134313604b68c8ab10e2c392729441b9bc5e45ec849a53ceed430e93e429c" exitCode=0 Mar 13 14:19:22 crc kubenswrapper[4898]: I0313 14:19:22.694675 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e5d53cf3-113e-4391-b3a9-4e1f81836e26","Type":"ContainerDied","Data":"6dc134313604b68c8ab10e2c392729441b9bc5e45ec849a53ceed430e93e429c"} Mar 13 14:19:22 crc kubenswrapper[4898]: I0313 14:19:22.696700 4898 generic.go:334] "Generic (PLEG): container finished" podID="c17db307-7a8a-4585-9696-a9ef96b6ba0b" containerID="74dc3b0f08d38afbb847328da234ec143498d235c31e1d23da8c5f86f5ed4459" exitCode=0 Mar 13 14:19:22 crc kubenswrapper[4898]: I0313 14:19:22.696752 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" event={"ID":"c17db307-7a8a-4585-9696-a9ef96b6ba0b","Type":"ContainerDied","Data":"74dc3b0f08d38afbb847328da234ec143498d235c31e1d23da8c5f86f5ed4459"} Mar 13 14:19:22 crc kubenswrapper[4898]: I0313 14:19:22.702380 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"111bf23f-be00-46ab-97fe-a36465735164","Type":"ContainerStarted","Data":"533a46a37109aea2ff8c47c6024b08c433252c4df0234d2dfe06d45bb30ec92e"} Mar 13 14:19:22 crc kubenswrapper[4898]: I0313 14:19:22.707703 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10","Type":"ContainerStarted","Data":"39bfdffc1e0fb88186d147830302dd4ca12b23b3d4541787d52288ae77f2c1f5"} Mar 13 14:19:22 crc kubenswrapper[4898]: I0313 14:19:22.763853 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-r9tmf" podStartSLOduration=24.646436727 podStartE2EDuration="28.76383075s" podCreationTimestamp="2026-03-13 14:18:54 +0000 UTC" firstStartedPulling="2026-03-13 14:19:13.367597956 +0000 UTC m=+1388.369186185" lastFinishedPulling="2026-03-13 14:19:17.484991969 +0000 UTC m=+1392.486580208" observedRunningTime="2026-03-13 14:19:22.750944815 +0000 UTC m=+1397.752533084" watchObservedRunningTime="2026-03-13 14:19:22.76383075 +0000 UTC m=+1397.765418989" Mar 13 14:19:22 crc kubenswrapper[4898]: I0313 14:19:22.775508 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 13 14:19:22 crc kubenswrapper[4898]: I0313 14:19:22.789283 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=17.934631386 podStartE2EDuration="28.789262422s" podCreationTimestamp="2026-03-13 14:18:54 +0000 UTC" firstStartedPulling="2026-03-13 14:19:11.141478495 +0000 UTC m=+1386.143066734" lastFinishedPulling="2026-03-13 14:19:21.996109541 +0000 UTC m=+1396.997697770" observedRunningTime="2026-03-13 14:19:22.776563981 +0000 UTC m=+1397.778152240" watchObservedRunningTime="2026-03-13 14:19:22.789262422 +0000 UTC m=+1397.790850671" Mar 13 14:19:22 crc kubenswrapper[4898]: I0313 14:19:22.845812 4898 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=13.934942373 podStartE2EDuration="24.845794872s" podCreationTimestamp="2026-03-13 14:18:58 +0000 UTC" firstStartedPulling="2026-03-13 14:19:11.145293434 +0000 UTC m=+1386.146881673" lastFinishedPulling="2026-03-13 14:19:22.056145933 +0000 UTC m=+1397.057734172" observedRunningTime="2026-03-13 14:19:22.83610698 +0000 UTC m=+1397.837695239" watchObservedRunningTime="2026-03-13 14:19:22.845794872 +0000 UTC m=+1397.847383111" Mar 13 14:19:22 crc kubenswrapper[4898]: I0313 14:19:22.848858 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 13 14:19:23 crc kubenswrapper[4898]: I0313 14:19:23.416207 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 13 14:19:23 crc kubenswrapper[4898]: I0313 14:19:23.472464 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 13 14:19:23 crc kubenswrapper[4898]: I0313 14:19:23.625226 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:19:23 crc kubenswrapper[4898]: I0313 14:19:23.625285 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:19:23 crc kubenswrapper[4898]: I0313 14:19:23.629590 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:19:23 crc kubenswrapper[4898]: I0313 14:19:23.720493 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f","Type":"ContainerStarted","Data":"5603fe1e36b001c885a705c95d9e330d44a889a46589d0677416ea907bf21af1"} Mar 13 14:19:23 crc kubenswrapper[4898]: I0313 14:19:23.722379 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstack-galera-0" event={"ID":"e5d53cf3-113e-4391-b3a9-4e1f81836e26","Type":"ContainerStarted","Data":"754d039d0115d17f3272d5d60732b4f70ed1599a25b6aa63fdbeafe8a92c023c"} Mar 13 14:19:23 crc kubenswrapper[4898]: I0313 14:19:23.724550 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" event={"ID":"c17db307-7a8a-4585-9696-a9ef96b6ba0b","Type":"ContainerStarted","Data":"16a8ad8751e35be84c16e519cb59d98833bb09657571eebe84f43f80ba363a11"} Mar 13 14:19:23 crc kubenswrapper[4898]: I0313 14:19:23.724782 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" Mar 13 14:19:23 crc kubenswrapper[4898]: I0313 14:19:23.728059 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" event={"ID":"9e544d1f-357e-4751-88bb-5108430b52cb","Type":"ContainerStarted","Data":"11c41f241d98f2324ec56eb2b8e01ca41655cf9c167ff144591e45aba4793e63"} Mar 13 14:19:23 crc kubenswrapper[4898]: I0313 14:19:23.728684 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 13 14:19:23 crc kubenswrapper[4898]: I0313 14:19:23.728721 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:19:23 crc kubenswrapper[4898]: I0313 14:19:23.728744 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 13 14:19:23 crc kubenswrapper[4898]: I0313 14:19:23.734077 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-699d95d586-ds75f" Mar 13 14:19:23 crc kubenswrapper[4898]: I0313 14:19:23.753195 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=30.257275855 podStartE2EDuration="35.753165493s" podCreationTimestamp="2026-03-13 14:18:48 +0000 UTC" 
firstStartedPulling="2026-03-13 14:19:11.135811237 +0000 UTC m=+1386.137399476" lastFinishedPulling="2026-03-13 14:19:16.631700875 +0000 UTC m=+1391.633289114" observedRunningTime="2026-03-13 14:19:23.752962558 +0000 UTC m=+1398.754550837" watchObservedRunningTime="2026-03-13 14:19:23.753165493 +0000 UTC m=+1398.754753732" Mar 13 14:19:23 crc kubenswrapper[4898]: I0313 14:19:23.783421 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=32.029049091 podStartE2EDuration="37.783405s" podCreationTimestamp="2026-03-13 14:18:46 +0000 UTC" firstStartedPulling="2026-03-13 14:19:08.869949643 +0000 UTC m=+1383.871537882" lastFinishedPulling="2026-03-13 14:19:14.624305552 +0000 UTC m=+1389.625893791" observedRunningTime="2026-03-13 14:19:23.777330812 +0000 UTC m=+1398.778919081" watchObservedRunningTime="2026-03-13 14:19:23.783405 +0000 UTC m=+1398.784993229" Mar 13 14:19:23 crc kubenswrapper[4898]: I0313 14:19:23.787531 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 13 14:19:23 crc kubenswrapper[4898]: I0313 14:19:23.790449 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 13 14:19:23 crc kubenswrapper[4898]: I0313 14:19:23.842573 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" podStartSLOduration=3.833183793 podStartE2EDuration="38.842550078s" podCreationTimestamp="2026-03-13 14:18:45 +0000 UTC" firstStartedPulling="2026-03-13 14:18:46.691640601 +0000 UTC m=+1361.693228840" lastFinishedPulling="2026-03-13 14:19:21.701006886 +0000 UTC m=+1396.702595125" observedRunningTime="2026-03-13 14:19:23.814743565 +0000 UTC m=+1398.816331814" watchObservedRunningTime="2026-03-13 14:19:23.842550078 +0000 UTC m=+1398.844138307" Mar 13 14:19:23 crc kubenswrapper[4898]: I0313 14:19:23.861953 4898 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-console/console-6ddbb5776b-mx8sz"] Mar 13 14:19:23 crc kubenswrapper[4898]: I0313 14:19:23.863541 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" podStartSLOduration=-9223371997.991253 podStartE2EDuration="38.863523034s" podCreationTimestamp="2026-03-13 14:18:45 +0000 UTC" firstStartedPulling="2026-03-13 14:18:46.354460331 +0000 UTC m=+1361.356048570" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:19:23.845835753 +0000 UTC m=+1398.847424002" watchObservedRunningTime="2026-03-13 14:19:23.863523034 +0000 UTC m=+1398.865111273" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.136338 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hk9w4"] Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.191451 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-8mxxb"] Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.193047 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-8mxxb" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.195212 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.202441 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-gm2pz"] Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.204076 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.214007 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.214610 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-8mxxb"] Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.228717 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-gm2pz"] Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.330596 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-nmlp6"] Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.343989 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cddd3df9-e505-4f25-988d-8cba87eaefbe-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-gm2pz\" (UID: \"cddd3df9-e505-4f25-988d-8cba87eaefbe\") " pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.344082 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/515cda05-1d7b-4252-94fc-056b38ec502a-ovn-rundir\") pod \"ovn-controller-metrics-8mxxb\" (UID: \"515cda05-1d7b-4252-94fc-056b38ec502a\") " pod="openstack/ovn-controller-metrics-8mxxb" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.344111 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdcst\" (UniqueName: \"kubernetes.io/projected/cddd3df9-e505-4f25-988d-8cba87eaefbe-kube-api-access-cdcst\") pod \"dnsmasq-dns-7fd796d7df-gm2pz\" (UID: \"cddd3df9-e505-4f25-988d-8cba87eaefbe\") " pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz" Mar 13 14:19:24 crc kubenswrapper[4898]: 
I0313 14:19:24.344155 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cddd3df9-e505-4f25-988d-8cba87eaefbe-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-gm2pz\" (UID: \"cddd3df9-e505-4f25-988d-8cba87eaefbe\") " pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.344180 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cddd3df9-e505-4f25-988d-8cba87eaefbe-config\") pod \"dnsmasq-dns-7fd796d7df-gm2pz\" (UID: \"cddd3df9-e505-4f25-988d-8cba87eaefbe\") " pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.344224 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn2hr\" (UniqueName: \"kubernetes.io/projected/515cda05-1d7b-4252-94fc-056b38ec502a-kube-api-access-zn2hr\") pod \"ovn-controller-metrics-8mxxb\" (UID: \"515cda05-1d7b-4252-94fc-056b38ec502a\") " pod="openstack/ovn-controller-metrics-8mxxb" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.344262 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/515cda05-1d7b-4252-94fc-056b38ec502a-ovs-rundir\") pod \"ovn-controller-metrics-8mxxb\" (UID: \"515cda05-1d7b-4252-94fc-056b38ec502a\") " pod="openstack/ovn-controller-metrics-8mxxb" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.344280 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/515cda05-1d7b-4252-94fc-056b38ec502a-config\") pod \"ovn-controller-metrics-8mxxb\" (UID: \"515cda05-1d7b-4252-94fc-056b38ec502a\") " pod="openstack/ovn-controller-metrics-8mxxb" Mar 13 14:19:24 crc kubenswrapper[4898]: 
I0313 14:19:24.344298 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/515cda05-1d7b-4252-94fc-056b38ec502a-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8mxxb\" (UID: \"515cda05-1d7b-4252-94fc-056b38ec502a\") " pod="openstack/ovn-controller-metrics-8mxxb" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.344326 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/515cda05-1d7b-4252-94fc-056b38ec502a-combined-ca-bundle\") pod \"ovn-controller-metrics-8mxxb\" (UID: \"515cda05-1d7b-4252-94fc-056b38ec502a\") " pod="openstack/ovn-controller-metrics-8mxxb" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.381072 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-x9tcr"] Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.382949 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.385488 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.397013 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-x9tcr"] Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.424748 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.431868 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.436100 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-nwbfn" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.436262 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.436414 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.446242 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.447123 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/515cda05-1d7b-4252-94fc-056b38ec502a-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8mxxb\" (UID: \"515cda05-1d7b-4252-94fc-056b38ec502a\") " pod="openstack/ovn-controller-metrics-8mxxb" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.447194 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-x9tcr\" (UID: \"00d79476-a8c0-4bad-81ae-6b50afea8601\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.447226 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/515cda05-1d7b-4252-94fc-056b38ec502a-combined-ca-bundle\") pod \"ovn-controller-metrics-8mxxb\" (UID: \"515cda05-1d7b-4252-94fc-056b38ec502a\") " pod="openstack/ovn-controller-metrics-8mxxb" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.447283 
4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cddd3df9-e505-4f25-988d-8cba87eaefbe-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-gm2pz\" (UID: \"cddd3df9-e505-4f25-988d-8cba87eaefbe\") " pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.447310 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-x9tcr\" (UID: \"00d79476-a8c0-4bad-81ae-6b50afea8601\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.447344 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-config\") pod \"dnsmasq-dns-86db49b7ff-x9tcr\" (UID: \"00d79476-a8c0-4bad-81ae-6b50afea8601\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.447367 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/515cda05-1d7b-4252-94fc-056b38ec502a-ovn-rundir\") pod \"ovn-controller-metrics-8mxxb\" (UID: \"515cda05-1d7b-4252-94fc-056b38ec502a\") " pod="openstack/ovn-controller-metrics-8mxxb" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.447391 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-x9tcr\" (UID: \"00d79476-a8c0-4bad-81ae-6b50afea8601\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.447415 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cdcst\" (UniqueName: \"kubernetes.io/projected/cddd3df9-e505-4f25-988d-8cba87eaefbe-kube-api-access-cdcst\") pod \"dnsmasq-dns-7fd796d7df-gm2pz\" (UID: \"cddd3df9-e505-4f25-988d-8cba87eaefbe\") " pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.448068 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cddd3df9-e505-4f25-988d-8cba87eaefbe-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-gm2pz\" (UID: \"cddd3df9-e505-4f25-988d-8cba87eaefbe\") " pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.448132 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cddd3df9-e505-4f25-988d-8cba87eaefbe-config\") pod \"dnsmasq-dns-7fd796d7df-gm2pz\" (UID: \"cddd3df9-e505-4f25-988d-8cba87eaefbe\") " pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.448185 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zn2hr\" (UniqueName: \"kubernetes.io/projected/515cda05-1d7b-4252-94fc-056b38ec502a-kube-api-access-zn2hr\") pod \"ovn-controller-metrics-8mxxb\" (UID: \"515cda05-1d7b-4252-94fc-056b38ec502a\") " pod="openstack/ovn-controller-metrics-8mxxb" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.448228 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dz9f\" (UniqueName: \"kubernetes.io/projected/00d79476-a8c0-4bad-81ae-6b50afea8601-kube-api-access-9dz9f\") pod \"dnsmasq-dns-86db49b7ff-x9tcr\" (UID: \"00d79476-a8c0-4bad-81ae-6b50afea8601\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.448272 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/515cda05-1d7b-4252-94fc-056b38ec502a-ovs-rundir\") pod \"ovn-controller-metrics-8mxxb\" (UID: \"515cda05-1d7b-4252-94fc-056b38ec502a\") " pod="openstack/ovn-controller-metrics-8mxxb" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.448294 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/515cda05-1d7b-4252-94fc-056b38ec502a-config\") pod \"ovn-controller-metrics-8mxxb\" (UID: \"515cda05-1d7b-4252-94fc-056b38ec502a\") " pod="openstack/ovn-controller-metrics-8mxxb" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.449081 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/515cda05-1d7b-4252-94fc-056b38ec502a-config\") pod \"ovn-controller-metrics-8mxxb\" (UID: \"515cda05-1d7b-4252-94fc-056b38ec502a\") " pod="openstack/ovn-controller-metrics-8mxxb" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.449489 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/515cda05-1d7b-4252-94fc-056b38ec502a-ovs-rundir\") pod \"ovn-controller-metrics-8mxxb\" (UID: \"515cda05-1d7b-4252-94fc-056b38ec502a\") " pod="openstack/ovn-controller-metrics-8mxxb" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.449827 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/515cda05-1d7b-4252-94fc-056b38ec502a-ovn-rundir\") pod \"ovn-controller-metrics-8mxxb\" (UID: \"515cda05-1d7b-4252-94fc-056b38ec502a\") " pod="openstack/ovn-controller-metrics-8mxxb" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.450558 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/cddd3df9-e505-4f25-988d-8cba87eaefbe-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-gm2pz\" (UID: \"cddd3df9-e505-4f25-988d-8cba87eaefbe\") " pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.450962 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cddd3df9-e505-4f25-988d-8cba87eaefbe-config\") pod \"dnsmasq-dns-7fd796d7df-gm2pz\" (UID: \"cddd3df9-e505-4f25-988d-8cba87eaefbe\") " pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.452452 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cddd3df9-e505-4f25-988d-8cba87eaefbe-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-gm2pz\" (UID: \"cddd3df9-e505-4f25-988d-8cba87eaefbe\") " pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.454331 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/515cda05-1d7b-4252-94fc-056b38ec502a-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8mxxb\" (UID: \"515cda05-1d7b-4252-94fc-056b38ec502a\") " pod="openstack/ovn-controller-metrics-8mxxb" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.466024 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/515cda05-1d7b-4252-94fc-056b38ec502a-combined-ca-bundle\") pod \"ovn-controller-metrics-8mxxb\" (UID: \"515cda05-1d7b-4252-94fc-056b38ec502a\") " pod="openstack/ovn-controller-metrics-8mxxb" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.469692 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.470393 4898 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-cdcst\" (UniqueName: \"kubernetes.io/projected/cddd3df9-e505-4f25-988d-8cba87eaefbe-kube-api-access-cdcst\") pod \"dnsmasq-dns-7fd796d7df-gm2pz\" (UID: \"cddd3df9-e505-4f25-988d-8cba87eaefbe\") " pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.484568 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn2hr\" (UniqueName: \"kubernetes.io/projected/515cda05-1d7b-4252-94fc-056b38ec502a-kube-api-access-zn2hr\") pod \"ovn-controller-metrics-8mxxb\" (UID: \"515cda05-1d7b-4252-94fc-056b38ec502a\") " pod="openstack/ovn-controller-metrics-8mxxb" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.522122 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-8mxxb" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.533453 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.549609 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/902753c9-2101-4509-9283-55070ac3787e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"902753c9-2101-4509-9283-55070ac3787e\") " pod="openstack/ovn-northd-0" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.549655 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dz9f\" (UniqueName: \"kubernetes.io/projected/00d79476-a8c0-4bad-81ae-6b50afea8601-kube-api-access-9dz9f\") pod \"dnsmasq-dns-86db49b7ff-x9tcr\" (UID: \"00d79476-a8c0-4bad-81ae-6b50afea8601\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.549688 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/902753c9-2101-4509-9283-55070ac3787e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"902753c9-2101-4509-9283-55070ac3787e\") " pod="openstack/ovn-northd-0" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.549716 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/902753c9-2101-4509-9283-55070ac3787e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"902753c9-2101-4509-9283-55070ac3787e\") " pod="openstack/ovn-northd-0" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.549734 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-x9tcr\" (UID: \"00d79476-a8c0-4bad-81ae-6b50afea8601\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.549979 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-x9tcr\" (UID: \"00d79476-a8c0-4bad-81ae-6b50afea8601\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.550121 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-config\") pod \"dnsmasq-dns-86db49b7ff-x9tcr\" (UID: \"00d79476-a8c0-4bad-81ae-6b50afea8601\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.550175 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/902753c9-2101-4509-9283-55070ac3787e-scripts\") pod \"ovn-northd-0\" (UID: \"902753c9-2101-4509-9283-55070ac3787e\") " pod="openstack/ovn-northd-0" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.550245 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-x9tcr\" (UID: \"00d79476-a8c0-4bad-81ae-6b50afea8601\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.550299 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/902753c9-2101-4509-9283-55070ac3787e-config\") pod \"ovn-northd-0\" (UID: \"902753c9-2101-4509-9283-55070ac3787e\") " pod="openstack/ovn-northd-0" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.550394 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px22g\" (UniqueName: \"kubernetes.io/projected/902753c9-2101-4509-9283-55070ac3787e-kube-api-access-px22g\") pod \"ovn-northd-0\" (UID: \"902753c9-2101-4509-9283-55070ac3787e\") " pod="openstack/ovn-northd-0" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.550420 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-x9tcr\" (UID: \"00d79476-a8c0-4bad-81ae-6b50afea8601\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.550979 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-x9tcr\" (UID: 
\"00d79476-a8c0-4bad-81ae-6b50afea8601\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.551215 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-x9tcr\" (UID: \"00d79476-a8c0-4bad-81ae-6b50afea8601\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.551278 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/902753c9-2101-4509-9283-55070ac3787e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"902753c9-2101-4509-9283-55070ac3787e\") " pod="openstack/ovn-northd-0" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.551485 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-config\") pod \"dnsmasq-dns-86db49b7ff-x9tcr\" (UID: \"00d79476-a8c0-4bad-81ae-6b50afea8601\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.569103 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dz9f\" (UniqueName: \"kubernetes.io/projected/00d79476-a8c0-4bad-81ae-6b50afea8601-kube-api-access-9dz9f\") pod \"dnsmasq-dns-86db49b7ff-x9tcr\" (UID: \"00d79476-a8c0-4bad-81ae-6b50afea8601\") " pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.652650 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px22g\" (UniqueName: \"kubernetes.io/projected/902753c9-2101-4509-9283-55070ac3787e-kube-api-access-px22g\") pod \"ovn-northd-0\" (UID: \"902753c9-2101-4509-9283-55070ac3787e\") " pod="openstack/ovn-northd-0" Mar 13 14:19:24 crc 
kubenswrapper[4898]: I0313 14:19:24.652731 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/902753c9-2101-4509-9283-55070ac3787e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"902753c9-2101-4509-9283-55070ac3787e\") " pod="openstack/ovn-northd-0" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.652800 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/902753c9-2101-4509-9283-55070ac3787e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"902753c9-2101-4509-9283-55070ac3787e\") " pod="openstack/ovn-northd-0" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.652845 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/902753c9-2101-4509-9283-55070ac3787e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"902753c9-2101-4509-9283-55070ac3787e\") " pod="openstack/ovn-northd-0" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.652880 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/902753c9-2101-4509-9283-55070ac3787e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"902753c9-2101-4509-9283-55070ac3787e\") " pod="openstack/ovn-northd-0" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.653015 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/902753c9-2101-4509-9283-55070ac3787e-scripts\") pod \"ovn-northd-0\" (UID: \"902753c9-2101-4509-9283-55070ac3787e\") " pod="openstack/ovn-northd-0" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.653077 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/902753c9-2101-4509-9283-55070ac3787e-config\") pod \"ovn-northd-0\" (UID: \"902753c9-2101-4509-9283-55070ac3787e\") " pod="openstack/ovn-northd-0" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.654213 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/902753c9-2101-4509-9283-55070ac3787e-config\") pod \"ovn-northd-0\" (UID: \"902753c9-2101-4509-9283-55070ac3787e\") " pod="openstack/ovn-northd-0" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.654494 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/902753c9-2101-4509-9283-55070ac3787e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"902753c9-2101-4509-9283-55070ac3787e\") " pod="openstack/ovn-northd-0" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.657432 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/902753c9-2101-4509-9283-55070ac3787e-scripts\") pod \"ovn-northd-0\" (UID: \"902753c9-2101-4509-9283-55070ac3787e\") " pod="openstack/ovn-northd-0" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.658016 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/902753c9-2101-4509-9283-55070ac3787e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"902753c9-2101-4509-9283-55070ac3787e\") " pod="openstack/ovn-northd-0" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.658509 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/902753c9-2101-4509-9283-55070ac3787e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"902753c9-2101-4509-9283-55070ac3787e\") " pod="openstack/ovn-northd-0" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.662166 4898 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/902753c9-2101-4509-9283-55070ac3787e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"902753c9-2101-4509-9283-55070ac3787e\") " pod="openstack/ovn-northd-0" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.697615 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px22g\" (UniqueName: \"kubernetes.io/projected/902753c9-2101-4509-9283-55070ac3787e-kube-api-access-px22g\") pod \"ovn-northd-0\" (UID: \"902753c9-2101-4509-9283-55070ac3787e\") " pod="openstack/ovn-northd-0" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.716333 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.746684 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" podUID="9e544d1f-357e-4751-88bb-5108430b52cb" containerName="dnsmasq-dns" containerID="cri-o://11c41f241d98f2324ec56eb2b8e01ca41655cf9c167ff144591e45aba4793e63" gracePeriod=10 Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.748317 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" Mar 13 14:19:24 crc kubenswrapper[4898]: I0313 14:19:24.757474 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.094032 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-8mxxb"] Mar 13 14:19:25 crc kubenswrapper[4898]: W0313 14:19:25.101025 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod515cda05_1d7b_4252_94fc_056b38ec502a.slice/crio-9c740788f5ec1b4344a55a31830a61a598669277be8884a983f0c8bc3cb2564b WatchSource:0}: Error finding container 9c740788f5ec1b4344a55a31830a61a598669277be8884a983f0c8bc3cb2564b: Status 404 returned error can't find the container with id 9c740788f5ec1b4344a55a31830a61a598669277be8884a983f0c8bc3cb2564b Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.110136 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-gm2pz"] Mar 13 14:19:25 crc kubenswrapper[4898]: W0313 14:19:25.119693 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcddd3df9_e505_4f25_988d_8cba87eaefbe.slice/crio-e6844355e00ec9e8aa5161a17239acb337d4e49d9f07d0b8c54a762df1aef8dd WatchSource:0}: Error finding container e6844355e00ec9e8aa5161a17239acb337d4e49d9f07d0b8c54a762df1aef8dd: Status 404 returned error can't find the container with id e6844355e00ec9e8aa5161a17239acb337d4e49d9f07d0b8c54a762df1aef8dd Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.174879 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.283526 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-x9tcr"] Mar 13 14:19:25 crc kubenswrapper[4898]: W0313 14:19:25.298145 4898 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00d79476_a8c0_4bad_81ae_6b50afea8601.slice/crio-feb144c4ab58e7daafcb4fa68f1d3e330823c649be75d07c3fb9eb354883c85d WatchSource:0}: Error finding container feb144c4ab58e7daafcb4fa68f1d3e330823c649be75d07c3fb9eb354883c85d: Status 404 returned error can't find the container with id feb144c4ab58e7daafcb4fa68f1d3e330823c649be75d07c3fb9eb354883c85d Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.353525 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.470880 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.484389 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e544d1f-357e-4751-88bb-5108430b52cb-config\") pod \"9e544d1f-357e-4751-88bb-5108430b52cb\" (UID: \"9e544d1f-357e-4751-88bb-5108430b52cb\") " Mar 13 14:19:25 crc kubenswrapper[4898]: W0313 14:19:25.484484 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod902753c9_2101_4509_9283_55070ac3787e.slice/crio-d03fba021bd551e23b93891ec34c8044d3000501b33c6bdfc21ec4cabe76a104 WatchSource:0}: Error finding container d03fba021bd551e23b93891ec34c8044d3000501b33c6bdfc21ec4cabe76a104: Status 404 returned error can't find the container with id d03fba021bd551e23b93891ec34c8044d3000501b33c6bdfc21ec4cabe76a104 Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.484523 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e544d1f-357e-4751-88bb-5108430b52cb-dns-svc\") pod \"9e544d1f-357e-4751-88bb-5108430b52cb\" (UID: \"9e544d1f-357e-4751-88bb-5108430b52cb\") " Mar 13 14:19:25 crc 
kubenswrapper[4898]: I0313 14:19:25.484638 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sn79l\" (UniqueName: \"kubernetes.io/projected/9e544d1f-357e-4751-88bb-5108430b52cb-kube-api-access-sn79l\") pod \"9e544d1f-357e-4751-88bb-5108430b52cb\" (UID: \"9e544d1f-357e-4751-88bb-5108430b52cb\") " Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.488350 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e544d1f-357e-4751-88bb-5108430b52cb-kube-api-access-sn79l" (OuterVolumeSpecName: "kube-api-access-sn79l") pod "9e544d1f-357e-4751-88bb-5108430b52cb" (UID: "9e544d1f-357e-4751-88bb-5108430b52cb"). InnerVolumeSpecName "kube-api-access-sn79l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.540134 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e544d1f-357e-4751-88bb-5108430b52cb-config" (OuterVolumeSpecName: "config") pod "9e544d1f-357e-4751-88bb-5108430b52cb" (UID: "9e544d1f-357e-4751-88bb-5108430b52cb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.543923 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e544d1f-357e-4751-88bb-5108430b52cb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9e544d1f-357e-4751-88bb-5108430b52cb" (UID: "9e544d1f-357e-4751-88bb-5108430b52cb"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.586863 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e544d1f-357e-4751-88bb-5108430b52cb-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.586912 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e544d1f-357e-4751-88bb-5108430b52cb-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.586925 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sn79l\" (UniqueName: \"kubernetes.io/projected/9e544d1f-357e-4751-88bb-5108430b52cb-kube-api-access-sn79l\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.754978 4898 generic.go:334] "Generic (PLEG): container finished" podID="cddd3df9-e505-4f25-988d-8cba87eaefbe" containerID="31ee06ecd554c7204a7ea9b6ed5158bbdc38532b41e7043447da0dc27f024036" exitCode=0 Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.755071 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz" event={"ID":"cddd3df9-e505-4f25-988d-8cba87eaefbe","Type":"ContainerDied","Data":"31ee06ecd554c7204a7ea9b6ed5158bbdc38532b41e7043447da0dc27f024036"} Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.755125 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz" event={"ID":"cddd3df9-e505-4f25-988d-8cba87eaefbe","Type":"ContainerStarted","Data":"e6844355e00ec9e8aa5161a17239acb337d4e49d9f07d0b8c54a762df1aef8dd"} Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.759949 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-8mxxb" 
event={"ID":"515cda05-1d7b-4252-94fc-056b38ec502a","Type":"ContainerStarted","Data":"ccd672988c388ccd2ea73f95bc83004ef499dd26465d5aa436d1c9ff89369cce"} Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.760037 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-8mxxb" event={"ID":"515cda05-1d7b-4252-94fc-056b38ec502a","Type":"ContainerStarted","Data":"9c740788f5ec1b4344a55a31830a61a598669277be8884a983f0c8bc3cb2564b"} Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.762309 4898 generic.go:334] "Generic (PLEG): container finished" podID="00d79476-a8c0-4bad-81ae-6b50afea8601" containerID="994a2410c9adb8d8fd6c4f99ea789f0c24f7ab61b1131418ac256a7ab707b9be" exitCode=0 Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.762434 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr" event={"ID":"00d79476-a8c0-4bad-81ae-6b50afea8601","Type":"ContainerDied","Data":"994a2410c9adb8d8fd6c4f99ea789f0c24f7ab61b1131418ac256a7ab707b9be"} Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.762482 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr" event={"ID":"00d79476-a8c0-4bad-81ae-6b50afea8601","Type":"ContainerStarted","Data":"feb144c4ab58e7daafcb4fa68f1d3e330823c649be75d07c3fb9eb354883c85d"} Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.765529 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"902753c9-2101-4509-9283-55070ac3787e","Type":"ContainerStarted","Data":"d03fba021bd551e23b93891ec34c8044d3000501b33c6bdfc21ec4cabe76a104"} Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.768891 4898 generic.go:334] "Generic (PLEG): container finished" podID="9e544d1f-357e-4751-88bb-5108430b52cb" containerID="11c41f241d98f2324ec56eb2b8e01ca41655cf9c167ff144591e45aba4793e63" exitCode=0 Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.769934 4898 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" event={"ID":"9e544d1f-357e-4751-88bb-5108430b52cb","Type":"ContainerDied","Data":"11c41f241d98f2324ec56eb2b8e01ca41655cf9c167ff144591e45aba4793e63"} Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.769999 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" event={"ID":"9e544d1f-357e-4751-88bb-5108430b52cb","Type":"ContainerDied","Data":"f4e0bf3960a9198c7dbd49808ca83e6770f6cbaf7ea545029dc2173d4eb03419"} Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.770025 4898 scope.go:117] "RemoveContainer" containerID="11c41f241d98f2324ec56eb2b8e01ca41655cf9c167ff144591e45aba4793e63" Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.770223 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-hk9w4" Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.771491 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" podUID="c17db307-7a8a-4585-9696-a9ef96b6ba0b" containerName="dnsmasq-dns" containerID="cri-o://16a8ad8751e35be84c16e519cb59d98833bb09657571eebe84f43f80ba363a11" gracePeriod=10 Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.959324 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-8mxxb" podStartSLOduration=1.959293324 podStartE2EDuration="1.959293324s" podCreationTimestamp="2026-03-13 14:19:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:19:25.902508727 +0000 UTC m=+1400.904096966" watchObservedRunningTime="2026-03-13 14:19:25.959293324 +0000 UTC m=+1400.960881563" Mar 13 14:19:25 crc kubenswrapper[4898]: I0313 14:19:25.991099 4898 scope.go:117] "RemoveContainer" 
containerID="721483b50e427e53b8e6dda1265ac58f303bb7f78b9d071fabeacfc95fd162d8" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.002985 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hk9w4"] Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.011034 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hk9w4"] Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.077363 4898 scope.go:117] "RemoveContainer" containerID="11c41f241d98f2324ec56eb2b8e01ca41655cf9c167ff144591e45aba4793e63" Mar 13 14:19:26 crc kubenswrapper[4898]: E0313 14:19:26.078217 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11c41f241d98f2324ec56eb2b8e01ca41655cf9c167ff144591e45aba4793e63\": container with ID starting with 11c41f241d98f2324ec56eb2b8e01ca41655cf9c167ff144591e45aba4793e63 not found: ID does not exist" containerID="11c41f241d98f2324ec56eb2b8e01ca41655cf9c167ff144591e45aba4793e63" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.078260 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11c41f241d98f2324ec56eb2b8e01ca41655cf9c167ff144591e45aba4793e63"} err="failed to get container status \"11c41f241d98f2324ec56eb2b8e01ca41655cf9c167ff144591e45aba4793e63\": rpc error: code = NotFound desc = could not find container \"11c41f241d98f2324ec56eb2b8e01ca41655cf9c167ff144591e45aba4793e63\": container with ID starting with 11c41f241d98f2324ec56eb2b8e01ca41655cf9c167ff144591e45aba4793e63 not found: ID does not exist" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.078286 4898 scope.go:117] "RemoveContainer" containerID="721483b50e427e53b8e6dda1265ac58f303bb7f78b9d071fabeacfc95fd162d8" Mar 13 14:19:26 crc kubenswrapper[4898]: E0313 14:19:26.078701 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"721483b50e427e53b8e6dda1265ac58f303bb7f78b9d071fabeacfc95fd162d8\": container with ID starting with 721483b50e427e53b8e6dda1265ac58f303bb7f78b9d071fabeacfc95fd162d8 not found: ID does not exist" containerID="721483b50e427e53b8e6dda1265ac58f303bb7f78b9d071fabeacfc95fd162d8" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.078727 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"721483b50e427e53b8e6dda1265ac58f303bb7f78b9d071fabeacfc95fd162d8"} err="failed to get container status \"721483b50e427e53b8e6dda1265ac58f303bb7f78b9d071fabeacfc95fd162d8\": rpc error: code = NotFound desc = could not find container \"721483b50e427e53b8e6dda1265ac58f303bb7f78b9d071fabeacfc95fd162d8\": container with ID starting with 721483b50e427e53b8e6dda1265ac58f303bb7f78b9d071fabeacfc95fd162d8 not found: ID does not exist" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.293411 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.405469 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c17db307-7a8a-4585-9696-a9ef96b6ba0b-config\") pod \"c17db307-7a8a-4585-9696-a9ef96b6ba0b\" (UID: \"c17db307-7a8a-4585-9696-a9ef96b6ba0b\") " Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.405572 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5ll2\" (UniqueName: \"kubernetes.io/projected/c17db307-7a8a-4585-9696-a9ef96b6ba0b-kube-api-access-v5ll2\") pod \"c17db307-7a8a-4585-9696-a9ef96b6ba0b\" (UID: \"c17db307-7a8a-4585-9696-a9ef96b6ba0b\") " Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.405676 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c17db307-7a8a-4585-9696-a9ef96b6ba0b-dns-svc\") pod \"c17db307-7a8a-4585-9696-a9ef96b6ba0b\" (UID: \"c17db307-7a8a-4585-9696-a9ef96b6ba0b\") " Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.409865 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c17db307-7a8a-4585-9696-a9ef96b6ba0b-kube-api-access-v5ll2" (OuterVolumeSpecName: "kube-api-access-v5ll2") pod "c17db307-7a8a-4585-9696-a9ef96b6ba0b" (UID: "c17db307-7a8a-4585-9696-a9ef96b6ba0b"). InnerVolumeSpecName "kube-api-access-v5ll2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.452144 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c17db307-7a8a-4585-9696-a9ef96b6ba0b-config" (OuterVolumeSpecName: "config") pod "c17db307-7a8a-4585-9696-a9ef96b6ba0b" (UID: "c17db307-7a8a-4585-9696-a9ef96b6ba0b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.452194 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c17db307-7a8a-4585-9696-a9ef96b6ba0b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c17db307-7a8a-4585-9696-a9ef96b6ba0b" (UID: "c17db307-7a8a-4585-9696-a9ef96b6ba0b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.507986 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5ll2\" (UniqueName: \"kubernetes.io/projected/c17db307-7a8a-4585-9696-a9ef96b6ba0b-kube-api-access-v5ll2\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.508330 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c17db307-7a8a-4585-9696-a9ef96b6ba0b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.508345 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c17db307-7a8a-4585-9696-a9ef96b6ba0b-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.789060 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz" event={"ID":"cddd3df9-e505-4f25-988d-8cba87eaefbe","Type":"ContainerStarted","Data":"7b56acf87143f75eb70ec0471ca713a46b1d9ecd5c78e828b6dff4507ed29234"} Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.789145 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.800006 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr" 
event={"ID":"00d79476-a8c0-4bad-81ae-6b50afea8601","Type":"ContainerStarted","Data":"47f7b7782ee0db0b141a3f7ddac8cf1c7a3089fed6ebef010a3382327d07522d"} Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.800883 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.803261 4898 generic.go:334] "Generic (PLEG): container finished" podID="c17db307-7a8a-4585-9696-a9ef96b6ba0b" containerID="16a8ad8751e35be84c16e519cb59d98833bb09657571eebe84f43f80ba363a11" exitCode=0 Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.803305 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.803356 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" event={"ID":"c17db307-7a8a-4585-9696-a9ef96b6ba0b","Type":"ContainerDied","Data":"16a8ad8751e35be84c16e519cb59d98833bb09657571eebe84f43f80ba363a11"} Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.803399 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-nmlp6" event={"ID":"c17db307-7a8a-4585-9696-a9ef96b6ba0b","Type":"ContainerDied","Data":"98527ac55b34245c21c2fc19c06bf03fb117bb3cf7f7b539444d59d7d2dad50b"} Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.803420 4898 scope.go:117] "RemoveContainer" containerID="16a8ad8751e35be84c16e519cb59d98833bb09657571eebe84f43f80ba363a11" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.815171 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz" podStartSLOduration=2.815149755 podStartE2EDuration="2.815149755s" podCreationTimestamp="2026-03-13 14:19:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-13 14:19:26.807729612 +0000 UTC m=+1401.809317881" watchObservedRunningTime="2026-03-13 14:19:26.815149755 +0000 UTC m=+1401.816737994" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.834928 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr" podStartSLOduration=2.834891948 podStartE2EDuration="2.834891948s" podCreationTimestamp="2026-03-13 14:19:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:19:26.833642646 +0000 UTC m=+1401.835230885" watchObservedRunningTime="2026-03-13 14:19:26.834891948 +0000 UTC m=+1401.836480187" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.863073 4898 scope.go:117] "RemoveContainer" containerID="74dc3b0f08d38afbb847328da234ec143498d235c31e1d23da8c5f86f5ed4459" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.863749 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-nmlp6"] Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.872726 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-nmlp6"] Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.883056 4898 scope.go:117] "RemoveContainer" containerID="16a8ad8751e35be84c16e519cb59d98833bb09657571eebe84f43f80ba363a11" Mar 13 14:19:26 crc kubenswrapper[4898]: E0313 14:19:26.883362 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16a8ad8751e35be84c16e519cb59d98833bb09657571eebe84f43f80ba363a11\": container with ID starting with 16a8ad8751e35be84c16e519cb59d98833bb09657571eebe84f43f80ba363a11 not found: ID does not exist" containerID="16a8ad8751e35be84c16e519cb59d98833bb09657571eebe84f43f80ba363a11" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.883401 4898 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"16a8ad8751e35be84c16e519cb59d98833bb09657571eebe84f43f80ba363a11"} err="failed to get container status \"16a8ad8751e35be84c16e519cb59d98833bb09657571eebe84f43f80ba363a11\": rpc error: code = NotFound desc = could not find container \"16a8ad8751e35be84c16e519cb59d98833bb09657571eebe84f43f80ba363a11\": container with ID starting with 16a8ad8751e35be84c16e519cb59d98833bb09657571eebe84f43f80ba363a11 not found: ID does not exist" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.883425 4898 scope.go:117] "RemoveContainer" containerID="74dc3b0f08d38afbb847328da234ec143498d235c31e1d23da8c5f86f5ed4459" Mar 13 14:19:26 crc kubenswrapper[4898]: E0313 14:19:26.883698 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74dc3b0f08d38afbb847328da234ec143498d235c31e1d23da8c5f86f5ed4459\": container with ID starting with 74dc3b0f08d38afbb847328da234ec143498d235c31e1d23da8c5f86f5ed4459 not found: ID does not exist" containerID="74dc3b0f08d38afbb847328da234ec143498d235c31e1d23da8c5f86f5ed4459" Mar 13 14:19:26 crc kubenswrapper[4898]: I0313 14:19:26.883792 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74dc3b0f08d38afbb847328da234ec143498d235c31e1d23da8c5f86f5ed4459"} err="failed to get container status \"74dc3b0f08d38afbb847328da234ec143498d235c31e1d23da8c5f86f5ed4459\": rpc error: code = NotFound desc = could not find container \"74dc3b0f08d38afbb847328da234ec143498d235c31e1d23da8c5f86f5ed4459\": container with ID starting with 74dc3b0f08d38afbb847328da234ec143498d235c31e1d23da8c5f86f5ed4459 not found: ID does not exist" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.749776 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e544d1f-357e-4751-88bb-5108430b52cb" path="/var/lib/kubelet/pods/9e544d1f-357e-4751-88bb-5108430b52cb/volumes" Mar 13 14:19:27 crc 
kubenswrapper[4898]: I0313 14:19:27.750934 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c17db307-7a8a-4585-9696-a9ef96b6ba0b" path="/var/lib/kubelet/pods/c17db307-7a8a-4585-9696-a9ef96b6ba0b/volumes" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.803033 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.823786 4898 generic.go:334] "Generic (PLEG): container finished" podID="f526abbc-e646-48b4-afa8-7f95f4a607a0" containerID="17895f997a596c3c89bff64ca8cbbbc2b8be5cd3c6e6642232e8f78d56b48759" exitCode=0 Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.823852 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f526abbc-e646-48b4-afa8-7f95f4a607a0","Type":"ContainerDied","Data":"17895f997a596c3c89bff64ca8cbbbc2b8be5cd3c6e6642232e8f78d56b48759"} Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.823877 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f526abbc-e646-48b4-afa8-7f95f4a607a0","Type":"ContainerDied","Data":"e971de8bfe5dba96efe9215fb1b5480439450e667cc052958c51747ab17b2279"} Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.823892 4898 scope.go:117] "RemoveContainer" containerID="17895f997a596c3c89bff64ca8cbbbc2b8be5cd3c6e6642232e8f78d56b48759" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.824072 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.829998 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"902753c9-2101-4509-9283-55070ac3787e","Type":"ContainerStarted","Data":"1bdae4bc751b2513b85e4ba091ea769293b3833999c8d6cb0b7db96c8bc2c834"} Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.830231 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"902753c9-2101-4509-9283-55070ac3787e","Type":"ContainerStarted","Data":"5488e748631ff0ff74dc6fc92126c866c6ea137b2852a96488c1bb4a5888ebfa"} Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.830735 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.836020 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68mzv\" (UniqueName: \"kubernetes.io/projected/f526abbc-e646-48b4-afa8-7f95f4a607a0-kube-api-access-68mzv\") pod \"f526abbc-e646-48b4-afa8-7f95f4a607a0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.836092 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f526abbc-e646-48b4-afa8-7f95f4a607a0-config\") pod \"f526abbc-e646-48b4-afa8-7f95f4a607a0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.836261 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f526abbc-e646-48b4-afa8-7f95f4a607a0-web-config\") pod \"f526abbc-e646-48b4-afa8-7f95f4a607a0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.836326 4898 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f526abbc-e646-48b4-afa8-7f95f4a607a0-config-out\") pod \"f526abbc-e646-48b4-afa8-7f95f4a607a0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.836395 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f526abbc-e646-48b4-afa8-7f95f4a607a0-thanos-prometheus-http-client-file\") pod \"f526abbc-e646-48b4-afa8-7f95f4a607a0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.837766 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/f526abbc-e646-48b4-afa8-7f95f4a607a0-prometheus-metric-storage-rulefiles-1\") pod \"f526abbc-e646-48b4-afa8-7f95f4a607a0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.837798 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f526abbc-e646-48b4-afa8-7f95f4a607a0-tls-assets\") pod \"f526abbc-e646-48b4-afa8-7f95f4a607a0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.837953 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c\") pod \"f526abbc-e646-48b4-afa8-7f95f4a607a0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.837989 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/f526abbc-e646-48b4-afa8-7f95f4a607a0-prometheus-metric-storage-rulefiles-0\") pod \"f526abbc-e646-48b4-afa8-7f95f4a607a0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.838018 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/f526abbc-e646-48b4-afa8-7f95f4a607a0-prometheus-metric-storage-rulefiles-2\") pod \"f526abbc-e646-48b4-afa8-7f95f4a607a0\" (UID: \"f526abbc-e646-48b4-afa8-7f95f4a607a0\") " Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.840256 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f526abbc-e646-48b4-afa8-7f95f4a607a0-kube-api-access-68mzv" (OuterVolumeSpecName: "kube-api-access-68mzv") pod "f526abbc-e646-48b4-afa8-7f95f4a607a0" (UID: "f526abbc-e646-48b4-afa8-7f95f4a607a0"). InnerVolumeSpecName "kube-api-access-68mzv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.845013 4898 scope.go:117] "RemoveContainer" containerID="17895f997a596c3c89bff64ca8cbbbc2b8be5cd3c6e6642232e8f78d56b48759" Mar 13 14:19:27 crc kubenswrapper[4898]: E0313 14:19:27.847422 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17895f997a596c3c89bff64ca8cbbbc2b8be5cd3c6e6642232e8f78d56b48759\": container with ID starting with 17895f997a596c3c89bff64ca8cbbbc2b8be5cd3c6e6642232e8f78d56b48759 not found: ID does not exist" containerID="17895f997a596c3c89bff64ca8cbbbc2b8be5cd3c6e6642232e8f78d56b48759" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.847463 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17895f997a596c3c89bff64ca8cbbbc2b8be5cd3c6e6642232e8f78d56b48759"} err="failed to get container status \"17895f997a596c3c89bff64ca8cbbbc2b8be5cd3c6e6642232e8f78d56b48759\": rpc error: code = NotFound desc = could not find container \"17895f997a596c3c89bff64ca8cbbbc2b8be5cd3c6e6642232e8f78d56b48759\": container with ID starting with 17895f997a596c3c89bff64ca8cbbbc2b8be5cd3c6e6642232e8f78d56b48759 not found: ID does not exist" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.848743 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68mzv\" (UniqueName: \"kubernetes.io/projected/f526abbc-e646-48b4-afa8-7f95f4a607a0-kube-api-access-68mzv\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.851407 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f526abbc-e646-48b4-afa8-7f95f4a607a0-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "f526abbc-e646-48b4-afa8-7f95f4a607a0" (UID: "f526abbc-e646-48b4-afa8-7f95f4a607a0"). 
InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.853164 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f526abbc-e646-48b4-afa8-7f95f4a607a0-web-config" (OuterVolumeSpecName: "web-config") pod "f526abbc-e646-48b4-afa8-7f95f4a607a0" (UID: "f526abbc-e646-48b4-afa8-7f95f4a607a0"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.854741 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f526abbc-e646-48b4-afa8-7f95f4a607a0-config-out" (OuterVolumeSpecName: "config-out") pod "f526abbc-e646-48b4-afa8-7f95f4a607a0" (UID: "f526abbc-e646-48b4-afa8-7f95f4a607a0"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.855463 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f526abbc-e646-48b4-afa8-7f95f4a607a0-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "f526abbc-e646-48b4-afa8-7f95f4a607a0" (UID: "f526abbc-e646-48b4-afa8-7f95f4a607a0"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.856383 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f526abbc-e646-48b4-afa8-7f95f4a607a0-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "f526abbc-e646-48b4-afa8-7f95f4a607a0" (UID: "f526abbc-e646-48b4-afa8-7f95f4a607a0"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.856471 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f526abbc-e646-48b4-afa8-7f95f4a607a0-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "f526abbc-e646-48b4-afa8-7f95f4a607a0" (UID: "f526abbc-e646-48b4-afa8-7f95f4a607a0"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.872261 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f526abbc-e646-48b4-afa8-7f95f4a607a0-config" (OuterVolumeSpecName: "config") pod "f526abbc-e646-48b4-afa8-7f95f4a607a0" (UID: "f526abbc-e646-48b4-afa8-7f95f4a607a0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.876298 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f526abbc-e646-48b4-afa8-7f95f4a607a0-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "f526abbc-e646-48b4-afa8-7f95f4a607a0" (UID: "f526abbc-e646-48b4-afa8-7f95f4a607a0"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.885611 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "f526abbc-e646-48b4-afa8-7f95f4a607a0" (UID: "f526abbc-e646-48b4-afa8-7f95f4a607a0"). InnerVolumeSpecName "pvc-537e992e-0c7e-4e28-8105-b535a72a793c". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.888596 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.491753353 podStartE2EDuration="3.888457141s" podCreationTimestamp="2026-03-13 14:19:24 +0000 UTC" firstStartedPulling="2026-03-13 14:19:25.488972651 +0000 UTC m=+1400.490560890" lastFinishedPulling="2026-03-13 14:19:26.885676439 +0000 UTC m=+1401.887264678" observedRunningTime="2026-03-13 14:19:27.865300879 +0000 UTC m=+1402.866889118" watchObservedRunningTime="2026-03-13 14:19:27.888457141 +0000 UTC m=+1402.890045380" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.951496 4898 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f526abbc-e646-48b4-afa8-7f95f4a607a0-config-out\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.951529 4898 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f526abbc-e646-48b4-afa8-7f95f4a607a0-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.951559 4898 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/f526abbc-e646-48b4-afa8-7f95f4a607a0-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.951571 4898 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f526abbc-e646-48b4-afa8-7f95f4a607a0-tls-assets\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.951594 4898 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-537e992e-0c7e-4e28-8105-b535a72a793c\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c\") on node \"crc\" " Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.951605 4898 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f526abbc-e646-48b4-afa8-7f95f4a607a0-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.951615 4898 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/f526abbc-e646-48b4-afa8-7f95f4a607a0-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.951624 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f526abbc-e646-48b4-afa8-7f95f4a607a0-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.951634 4898 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f526abbc-e646-48b4-afa8-7f95f4a607a0-web-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.976087 4898 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 13 14:19:27 crc kubenswrapper[4898]: I0313 14:19:27.976257 4898 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-537e992e-0c7e-4e28-8105-b535a72a793c" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c") on node "crc" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.053742 4898 reconciler_common.go:293] "Volume detached for volume \"pvc-537e992e-0c7e-4e28-8105-b535a72a793c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.220975 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.224957 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.254874 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 14:19:28 crc kubenswrapper[4898]: E0313 14:19:28.258431 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c17db307-7a8a-4585-9696-a9ef96b6ba0b" containerName="init" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.258458 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c17db307-7a8a-4585-9696-a9ef96b6ba0b" containerName="init" Mar 13 14:19:28 crc kubenswrapper[4898]: E0313 14:19:28.258467 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e544d1f-357e-4751-88bb-5108430b52cb" containerName="dnsmasq-dns" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.258474 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e544d1f-357e-4751-88bb-5108430b52cb" containerName="dnsmasq-dns" Mar 13 14:19:28 crc kubenswrapper[4898]: E0313 14:19:28.258501 4898 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c17db307-7a8a-4585-9696-a9ef96b6ba0b" containerName="dnsmasq-dns" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.258508 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c17db307-7a8a-4585-9696-a9ef96b6ba0b" containerName="dnsmasq-dns" Mar 13 14:19:28 crc kubenswrapper[4898]: E0313 14:19:28.258523 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e544d1f-357e-4751-88bb-5108430b52cb" containerName="init" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.258529 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e544d1f-357e-4751-88bb-5108430b52cb" containerName="init" Mar 13 14:19:28 crc kubenswrapper[4898]: E0313 14:19:28.258542 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f526abbc-e646-48b4-afa8-7f95f4a607a0" containerName="init-config-reloader" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.258548 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f526abbc-e646-48b4-afa8-7f95f4a607a0" containerName="init-config-reloader" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.258727 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e544d1f-357e-4751-88bb-5108430b52cb" containerName="dnsmasq-dns" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.258747 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f526abbc-e646-48b4-afa8-7f95f4a607a0" containerName="init-config-reloader" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.258757 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="c17db307-7a8a-4585-9696-a9ef96b6ba0b" containerName="dnsmasq-dns" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.260487 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.268158 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.268237 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-g7dw2" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.268258 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.268791 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.268946 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.269049 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.268870 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.274183 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.290956 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.361003 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: 
\"kubernetes.io/configmap/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.361379 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jxl2\" (UniqueName: \"kubernetes.io/projected/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-kube-api-access-5jxl2\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.361415 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.361444 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-537e992e-0c7e-4e28-8105-b535a72a793c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.361474 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.361620 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.361650 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-config\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.361674 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.361726 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.361751 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " 
pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.463522 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.463569 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-config\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.463590 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.463620 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.463642 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: 
\"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.463731 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.463759 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jxl2\" (UniqueName: \"kubernetes.io/projected/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-kube-api-access-5jxl2\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.463782 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.463804 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-537e992e-0c7e-4e28-8105-b535a72a793c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.463827 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-config-out\") pod 
\"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.464733 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.465063 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.465214 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.467970 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.468016 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-537e992e-0c7e-4e28-8105-b535a72a793c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7123a2111c2c1fcd673a2fa4cbaef2c14fcdb159a9a269edbe99c5cdea18ee2d/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.470580 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.471043 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-config\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.471940 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.472006 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-web-config\") pod 
\"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.472290 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.487643 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jxl2\" (UniqueName: \"kubernetes.io/projected/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-kube-api-access-5jxl2\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.514016 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-537e992e-0c7e-4e28-8105-b535a72a793c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c\") pod \"prometheus-metric-storage-0\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.578476 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.618447 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 13 14:19:28 crc kubenswrapper[4898]: I0313 14:19:28.618676 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 13 14:19:29 crc kubenswrapper[4898]: W0313 14:19:29.171591 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e6f6f0d_db24_4fdb_a872_ce2c527a791b.slice/crio-0b0a8b36ebbdd732871a36ff41874dd5aa3cc4a1c6906bb4c6b8440e7e4c245d WatchSource:0}: Error finding container 0b0a8b36ebbdd732871a36ff41874dd5aa3cc4a1c6906bb4c6b8440e7e4c245d: Status 404 returned error can't find the container with id 0b0a8b36ebbdd732871a36ff41874dd5aa3cc4a1c6906bb4c6b8440e7e4c245d Mar 13 14:19:29 crc kubenswrapper[4898]: I0313 14:19:29.172343 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 14:19:29 crc kubenswrapper[4898]: I0313 14:19:29.751396 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f526abbc-e646-48b4-afa8-7f95f4a607a0" path="/var/lib/kubelet/pods/f526abbc-e646-48b4-afa8-7f95f4a607a0/volumes" Mar 13 14:19:29 crc kubenswrapper[4898]: I0313 14:19:29.846317 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 13 14:19:29 crc kubenswrapper[4898]: I0313 14:19:29.846363 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 13 14:19:29 crc kubenswrapper[4898]: I0313 14:19:29.852239 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"1e6f6f0d-db24-4fdb-a872-ce2c527a791b","Type":"ContainerStarted","Data":"0b0a8b36ebbdd732871a36ff41874dd5aa3cc4a1c6906bb4c6b8440e7e4c245d"} Mar 13 14:19:29 crc kubenswrapper[4898]: I0313 14:19:29.937825 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 13 14:19:30 crc kubenswrapper[4898]: I0313 14:19:30.937852 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 13 14:19:31 crc kubenswrapper[4898]: I0313 14:19:31.261077 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 13 14:19:31 crc kubenswrapper[4898]: I0313 14:19:31.396401 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.423367 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-gm2pz"] Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.423607 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz" podUID="cddd3df9-e505-4f25-988d-8cba87eaefbe" containerName="dnsmasq-dns" containerID="cri-o://7b56acf87143f75eb70ec0471ca713a46b1d9ecd5c78e828b6dff4507ed29234" gracePeriod=10 Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.428997 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.460589 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-baf6-account-create-update-xhptm"] Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.461836 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-baf6-account-create-update-xhptm" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.467163 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-db-secret" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.472343 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-n7qmc"] Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.473751 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-n7qmc" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.488844 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-n7qmc"] Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.497945 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-baf6-account-create-update-xhptm"] Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.519038 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.583108 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81f4ee1a-c4d2-415d-9021-6503f03f8441-operator-scripts\") pod \"mysqld-exporter-baf6-account-create-update-xhptm\" (UID: \"81f4ee1a-c4d2-415d-9021-6503f03f8441\") " pod="openstack/mysqld-exporter-baf6-account-create-update-xhptm" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.583229 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45215dff-dfeb-4b68-bc5c-d36aba0ea6b8-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-n7qmc\" (UID: 
\"45215dff-dfeb-4b68-bc5c-d36aba0ea6b8\") " pod="openstack/mysqld-exporter-openstack-db-create-n7qmc" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.583303 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b8hp\" (UniqueName: \"kubernetes.io/projected/81f4ee1a-c4d2-415d-9021-6503f03f8441-kube-api-access-9b8hp\") pod \"mysqld-exporter-baf6-account-create-update-xhptm\" (UID: \"81f4ee1a-c4d2-415d-9021-6503f03f8441\") " pod="openstack/mysqld-exporter-baf6-account-create-update-xhptm" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.583333 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qgsl\" (UniqueName: \"kubernetes.io/projected/45215dff-dfeb-4b68-bc5c-d36aba0ea6b8-kube-api-access-8qgsl\") pod \"mysqld-exporter-openstack-db-create-n7qmc\" (UID: \"45215dff-dfeb-4b68-bc5c-d36aba0ea6b8\") " pod="openstack/mysqld-exporter-openstack-db-create-n7qmc" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.585976 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-2sp5q"] Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.589285 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-2sp5q" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.637017 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-2sp5q"] Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.684731 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-dns-svc\") pod \"dnsmasq-dns-698758b865-2sp5q\" (UID: \"a37db268-4fcb-45a7-a7bf-fae19a514257\") " pod="openstack/dnsmasq-dns-698758b865-2sp5q" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.684797 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-config\") pod \"dnsmasq-dns-698758b865-2sp5q\" (UID: \"a37db268-4fcb-45a7-a7bf-fae19a514257\") " pod="openstack/dnsmasq-dns-698758b865-2sp5q" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.684859 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45215dff-dfeb-4b68-bc5c-d36aba0ea6b8-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-n7qmc\" (UID: \"45215dff-dfeb-4b68-bc5c-d36aba0ea6b8\") " pod="openstack/mysqld-exporter-openstack-db-create-n7qmc" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.684907 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-2sp5q\" (UID: \"a37db268-4fcb-45a7-a7bf-fae19a514257\") " pod="openstack/dnsmasq-dns-698758b865-2sp5q" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.684938 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-9b8hp\" (UniqueName: \"kubernetes.io/projected/81f4ee1a-c4d2-415d-9021-6503f03f8441-kube-api-access-9b8hp\") pod \"mysqld-exporter-baf6-account-create-update-xhptm\" (UID: \"81f4ee1a-c4d2-415d-9021-6503f03f8441\") " pod="openstack/mysqld-exporter-baf6-account-create-update-xhptm" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.684964 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qgsl\" (UniqueName: \"kubernetes.io/projected/45215dff-dfeb-4b68-bc5c-d36aba0ea6b8-kube-api-access-8qgsl\") pod \"mysqld-exporter-openstack-db-create-n7qmc\" (UID: \"45215dff-dfeb-4b68-bc5c-d36aba0ea6b8\") " pod="openstack/mysqld-exporter-openstack-db-create-n7qmc" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.685011 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-2sp5q\" (UID: \"a37db268-4fcb-45a7-a7bf-fae19a514257\") " pod="openstack/dnsmasq-dns-698758b865-2sp5q" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.685047 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81f4ee1a-c4d2-415d-9021-6503f03f8441-operator-scripts\") pod \"mysqld-exporter-baf6-account-create-update-xhptm\" (UID: \"81f4ee1a-c4d2-415d-9021-6503f03f8441\") " pod="openstack/mysqld-exporter-baf6-account-create-update-xhptm" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.685069 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw842\" (UniqueName: \"kubernetes.io/projected/a37db268-4fcb-45a7-a7bf-fae19a514257-kube-api-access-bw842\") pod \"dnsmasq-dns-698758b865-2sp5q\" (UID: \"a37db268-4fcb-45a7-a7bf-fae19a514257\") " pod="openstack/dnsmasq-dns-698758b865-2sp5q" Mar 13 
14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.685780 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45215dff-dfeb-4b68-bc5c-d36aba0ea6b8-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-n7qmc\" (UID: \"45215dff-dfeb-4b68-bc5c-d36aba0ea6b8\") " pod="openstack/mysqld-exporter-openstack-db-create-n7qmc" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.686651 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81f4ee1a-c4d2-415d-9021-6503f03f8441-operator-scripts\") pod \"mysqld-exporter-baf6-account-create-update-xhptm\" (UID: \"81f4ee1a-c4d2-415d-9021-6503f03f8441\") " pod="openstack/mysqld-exporter-baf6-account-create-update-xhptm" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.707605 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qgsl\" (UniqueName: \"kubernetes.io/projected/45215dff-dfeb-4b68-bc5c-d36aba0ea6b8-kube-api-access-8qgsl\") pod \"mysqld-exporter-openstack-db-create-n7qmc\" (UID: \"45215dff-dfeb-4b68-bc5c-d36aba0ea6b8\") " pod="openstack/mysqld-exporter-openstack-db-create-n7qmc" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.718476 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b8hp\" (UniqueName: \"kubernetes.io/projected/81f4ee1a-c4d2-415d-9021-6503f03f8441-kube-api-access-9b8hp\") pod \"mysqld-exporter-baf6-account-create-update-xhptm\" (UID: \"81f4ee1a-c4d2-415d-9021-6503f03f8441\") " pod="openstack/mysqld-exporter-baf6-account-create-update-xhptm" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.786533 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-dns-svc\") pod \"dnsmasq-dns-698758b865-2sp5q\" (UID: 
\"a37db268-4fcb-45a7-a7bf-fae19a514257\") " pod="openstack/dnsmasq-dns-698758b865-2sp5q" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.786607 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-config\") pod \"dnsmasq-dns-698758b865-2sp5q\" (UID: \"a37db268-4fcb-45a7-a7bf-fae19a514257\") " pod="openstack/dnsmasq-dns-698758b865-2sp5q" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.786690 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-2sp5q\" (UID: \"a37db268-4fcb-45a7-a7bf-fae19a514257\") " pod="openstack/dnsmasq-dns-698758b865-2sp5q" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.786796 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-2sp5q\" (UID: \"a37db268-4fcb-45a7-a7bf-fae19a514257\") " pod="openstack/dnsmasq-dns-698758b865-2sp5q" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.786844 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bw842\" (UniqueName: \"kubernetes.io/projected/a37db268-4fcb-45a7-a7bf-fae19a514257-kube-api-access-bw842\") pod \"dnsmasq-dns-698758b865-2sp5q\" (UID: \"a37db268-4fcb-45a7-a7bf-fae19a514257\") " pod="openstack/dnsmasq-dns-698758b865-2sp5q" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.787629 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-dns-svc\") pod \"dnsmasq-dns-698758b865-2sp5q\" (UID: \"a37db268-4fcb-45a7-a7bf-fae19a514257\") " 
pod="openstack/dnsmasq-dns-698758b865-2sp5q" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.787644 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-config\") pod \"dnsmasq-dns-698758b865-2sp5q\" (UID: \"a37db268-4fcb-45a7-a7bf-fae19a514257\") " pod="openstack/dnsmasq-dns-698758b865-2sp5q" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.787892 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-2sp5q\" (UID: \"a37db268-4fcb-45a7-a7bf-fae19a514257\") " pod="openstack/dnsmasq-dns-698758b865-2sp5q" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.788068 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-2sp5q\" (UID: \"a37db268-4fcb-45a7-a7bf-fae19a514257\") " pod="openstack/dnsmasq-dns-698758b865-2sp5q" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.796932 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-baf6-account-create-update-xhptm" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.807704 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw842\" (UniqueName: \"kubernetes.io/projected/a37db268-4fcb-45a7-a7bf-fae19a514257-kube-api-access-bw842\") pod \"dnsmasq-dns-698758b865-2sp5q\" (UID: \"a37db268-4fcb-45a7-a7bf-fae19a514257\") " pod="openstack/dnsmasq-dns-698758b865-2sp5q" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.833386 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-n7qmc" Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.918837 4898 generic.go:334] "Generic (PLEG): container finished" podID="cddd3df9-e505-4f25-988d-8cba87eaefbe" containerID="7b56acf87143f75eb70ec0471ca713a46b1d9ecd5c78e828b6dff4507ed29234" exitCode=0 Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.919144 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz" event={"ID":"cddd3df9-e505-4f25-988d-8cba87eaefbe","Type":"ContainerDied","Data":"7b56acf87143f75eb70ec0471ca713a46b1d9ecd5c78e828b6dff4507ed29234"} Mar 13 14:19:32 crc kubenswrapper[4898]: I0313 14:19:32.959276 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-2sp5q" Mar 13 14:19:33 crc kubenswrapper[4898]: W0313 14:19:33.354499 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81f4ee1a_c4d2_415d_9021_6503f03f8441.slice/crio-fc5174c2bf5424ca7939a87e54c12b86416a2de52f15aa0500110576232c34fd WatchSource:0}: Error finding container fc5174c2bf5424ca7939a87e54c12b86416a2de52f15aa0500110576232c34fd: Status 404 returned error can't find the container with id fc5174c2bf5424ca7939a87e54c12b86416a2de52f15aa0500110576232c34fd Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.354689 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-baf6-account-create-update-xhptm"] Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.505028 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-n7qmc"] Mar 13 14:19:33 crc kubenswrapper[4898]: W0313 14:19:33.507168 4898 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45215dff_dfeb_4b68_bc5c_d36aba0ea6b8.slice/crio-950bfa55031ccf3b1124b837e6461f7251be7748323abc050c86691a891901f7 WatchSource:0}: Error finding container 950bfa55031ccf3b1124b837e6461f7251be7748323abc050c86691a891901f7: Status 404 returned error can't find the container with id 950bfa55031ccf3b1124b837e6461f7251be7748323abc050c86691a891901f7 Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.549663 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-2sp5q"] Mar 13 14:19:33 crc kubenswrapper[4898]: W0313 14:19:33.553214 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda37db268_4fcb_45a7_a7bf_fae19a514257.slice/crio-da5d3ec372689f6af3cb9e875471efd89602096089c49f7ef2acb131bc222cb5 WatchSource:0}: Error finding container da5d3ec372689f6af3cb9e875471efd89602096089c49f7ef2acb131bc222cb5: Status 404 returned error can't find the container with id da5d3ec372689f6af3cb9e875471efd89602096089c49f7ef2acb131bc222cb5 Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.564161 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.570031 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.574965 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.581223 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.581465 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-22jvh" Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.591272 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.636489 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.716761 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c67178b7-226e-4a46-a7b2-f53e47faeb2b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c67178b7-226e-4a46-a7b2-f53e47faeb2b\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.717015 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75f7z\" (UniqueName: \"kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-kube-api-access-75f7z\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.717185 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/794bd82b-e289-4b31-b0cf-f1285452e783-lock\") pod \"swift-storage-0\" (UID: 
\"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.717451 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794bd82b-e289-4b31-b0cf-f1285452e783-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.717555 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-etc-swift\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.717738 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/794bd82b-e289-4b31-b0cf-f1285452e783-cache\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.821472 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c67178b7-226e-4a46-a7b2-f53e47faeb2b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c67178b7-226e-4a46-a7b2-f53e47faeb2b\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.821527 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75f7z\" (UniqueName: \"kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-kube-api-access-75f7z\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:33 
crc kubenswrapper[4898]: I0313 14:19:33.821567 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/794bd82b-e289-4b31-b0cf-f1285452e783-lock\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.821623 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794bd82b-e289-4b31-b0cf-f1285452e783-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.821656 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-etc-swift\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.821701 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/794bd82b-e289-4b31-b0cf-f1285452e783-cache\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.822514 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/794bd82b-e289-4b31-b0cf-f1285452e783-lock\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:33 crc kubenswrapper[4898]: E0313 14:19:33.822956 4898 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 14:19:33 crc kubenswrapper[4898]: E0313 14:19:33.822982 4898 projected.go:194] Error 
preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 14:19:33 crc kubenswrapper[4898]: E0313 14:19:33.823031 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-etc-swift podName:794bd82b-e289-4b31-b0cf-f1285452e783 nodeName:}" failed. No retries permitted until 2026-03-13 14:19:34.323013767 +0000 UTC m=+1409.324602076 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-etc-swift") pod "swift-storage-0" (UID: "794bd82b-e289-4b31-b0cf-f1285452e783") : configmap "swift-ring-files" not found Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.823545 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/794bd82b-e289-4b31-b0cf-f1285452e783-cache\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.836800 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794bd82b-e289-4b31-b0cf-f1285452e783-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.856080 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75f7z\" (UniqueName: \"kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-kube-api-access-75f7z\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.884053 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. 
Skipping MountDevice... Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.884103 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c67178b7-226e-4a46-a7b2-f53e47faeb2b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c67178b7-226e-4a46-a7b2-f53e47faeb2b\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9562655cbd9f5053ff4fcbaf6bf6208908fded8dd99047d18c74ed262e26381a/globalmount\"" pod="openstack/swift-storage-0" Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.937303 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-2sp5q" event={"ID":"a37db268-4fcb-45a7-a7bf-fae19a514257","Type":"ContainerStarted","Data":"da5d3ec372689f6af3cb9e875471efd89602096089c49f7ef2acb131bc222cb5"} Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.938921 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-n7qmc" event={"ID":"45215dff-dfeb-4b68-bc5c-d36aba0ea6b8","Type":"ContainerStarted","Data":"950bfa55031ccf3b1124b837e6461f7251be7748323abc050c86691a891901f7"} Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.940246 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-baf6-account-create-update-xhptm" event={"ID":"81f4ee1a-c4d2-415d-9021-6503f03f8441","Type":"ContainerStarted","Data":"fc5174c2bf5424ca7939a87e54c12b86416a2de52f15aa0500110576232c34fd"} Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.941622 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1e6f6f0d-db24-4fdb-a872-ce2c527a791b","Type":"ContainerStarted","Data":"285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b"} Mar 13 14:19:33 crc kubenswrapper[4898]: I0313 14:19:33.959735 4898 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"pvc-c67178b7-226e-4a46-a7b2-f53e47faeb2b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c67178b7-226e-4a46-a7b2-f53e47faeb2b\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.135556 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-m9wx7"] Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.137174 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.139159 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.139340 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.139635 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.168760 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-m9wx7"] Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.183097 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-ztbp9"] Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.185590 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.200534 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-m9wx7"] Mar 13 14:19:34 crc kubenswrapper[4898]: E0313 14:19:34.201461 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-q4bwg ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/swift-ring-rebalance-m9wx7" podUID="28c184e1-ac85-4c3b-b138-3b728eb97ca3" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.212706 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-ztbp9"] Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.233617 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/28c184e1-ac85-4c3b-b138-3b728eb97ca3-ring-data-devices\") pod \"swift-ring-rebalance-m9wx7\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.233690 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28c184e1-ac85-4c3b-b138-3b728eb97ca3-combined-ca-bundle\") pod \"swift-ring-rebalance-m9wx7\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.233975 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28c184e1-ac85-4c3b-b138-3b728eb97ca3-scripts\") pod \"swift-ring-rebalance-m9wx7\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " 
pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.234063 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4bwg\" (UniqueName: \"kubernetes.io/projected/28c184e1-ac85-4c3b-b138-3b728eb97ca3-kube-api-access-q4bwg\") pod \"swift-ring-rebalance-m9wx7\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.234187 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/28c184e1-ac85-4c3b-b138-3b728eb97ca3-swiftconf\") pod \"swift-ring-rebalance-m9wx7\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.234278 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/28c184e1-ac85-4c3b-b138-3b728eb97ca3-dispersionconf\") pod \"swift-ring-rebalance-m9wx7\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.234393 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/28c184e1-ac85-4c3b-b138-3b728eb97ca3-etc-swift\") pod \"swift-ring-rebalance-m9wx7\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.336151 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/28c184e1-ac85-4c3b-b138-3b728eb97ca3-swiftconf\") pod \"swift-ring-rebalance-m9wx7\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " 
pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.336226 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/28c184e1-ac85-4c3b-b138-3b728eb97ca3-dispersionconf\") pod \"swift-ring-rebalance-m9wx7\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.336276 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-swiftconf\") pod \"swift-ring-rebalance-ztbp9\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.336294 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/28c184e1-ac85-4c3b-b138-3b728eb97ca3-etc-swift\") pod \"swift-ring-rebalance-m9wx7\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.336329 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-scripts\") pod \"swift-ring-rebalance-ztbp9\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.336361 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-etc-swift\") pod \"swift-ring-rebalance-ztbp9\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc 
kubenswrapper[4898]: I0313 14:19:34.336379 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-combined-ca-bundle\") pod \"swift-ring-rebalance-ztbp9\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.336418 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/28c184e1-ac85-4c3b-b138-3b728eb97ca3-ring-data-devices\") pod \"swift-ring-rebalance-m9wx7\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.336442 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28c184e1-ac85-4c3b-b138-3b728eb97ca3-combined-ca-bundle\") pod \"swift-ring-rebalance-m9wx7\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.336467 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-etc-swift\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.336495 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmzv9\" (UniqueName: \"kubernetes.io/projected/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-kube-api-access-gmzv9\") pod \"swift-ring-rebalance-ztbp9\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 
14:19:34.336569 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28c184e1-ac85-4c3b-b138-3b728eb97ca3-scripts\") pod \"swift-ring-rebalance-m9wx7\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.336596 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-ring-data-devices\") pod \"swift-ring-rebalance-ztbp9\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.336624 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4bwg\" (UniqueName: \"kubernetes.io/projected/28c184e1-ac85-4c3b-b138-3b728eb97ca3-kube-api-access-q4bwg\") pod \"swift-ring-rebalance-m9wx7\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.336658 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-dispersionconf\") pod \"swift-ring-rebalance-ztbp9\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.337135 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/28c184e1-ac85-4c3b-b138-3b728eb97ca3-etc-swift\") pod \"swift-ring-rebalance-m9wx7\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: E0313 14:19:34.337265 4898 projected.go:288] Couldn't get 
configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 14:19:34 crc kubenswrapper[4898]: E0313 14:19:34.337288 4898 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 14:19:34 crc kubenswrapper[4898]: E0313 14:19:34.337335 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-etc-swift podName:794bd82b-e289-4b31-b0cf-f1285452e783 nodeName:}" failed. No retries permitted until 2026-03-13 14:19:35.337316954 +0000 UTC m=+1410.338905273 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-etc-swift") pod "swift-storage-0" (UID: "794bd82b-e289-4b31-b0cf-f1285452e783") : configmap "swift-ring-files" not found Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.338098 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28c184e1-ac85-4c3b-b138-3b728eb97ca3-scripts\") pod \"swift-ring-rebalance-m9wx7\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.338226 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/28c184e1-ac85-4c3b-b138-3b728eb97ca3-ring-data-devices\") pod \"swift-ring-rebalance-m9wx7\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.341414 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28c184e1-ac85-4c3b-b138-3b728eb97ca3-combined-ca-bundle\") pod \"swift-ring-rebalance-m9wx7\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " 
pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.341777 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/28c184e1-ac85-4c3b-b138-3b728eb97ca3-swiftconf\") pod \"swift-ring-rebalance-m9wx7\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.342070 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/28c184e1-ac85-4c3b-b138-3b728eb97ca3-dispersionconf\") pod \"swift-ring-rebalance-m9wx7\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.357212 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4bwg\" (UniqueName: \"kubernetes.io/projected/28c184e1-ac85-4c3b-b138-3b728eb97ca3-kube-api-access-q4bwg\") pod \"swift-ring-rebalance-m9wx7\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.439442 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-swiftconf\") pod \"swift-ring-rebalance-ztbp9\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.439510 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-scripts\") pod \"swift-ring-rebalance-ztbp9\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.439547 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-etc-swift\") pod \"swift-ring-rebalance-ztbp9\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.439570 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-combined-ca-bundle\") pod \"swift-ring-rebalance-ztbp9\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.439630 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmzv9\" (UniqueName: \"kubernetes.io/projected/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-kube-api-access-gmzv9\") pod \"swift-ring-rebalance-ztbp9\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.439695 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-ring-data-devices\") pod \"swift-ring-rebalance-ztbp9\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.439727 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-dispersionconf\") pod \"swift-ring-rebalance-ztbp9\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.442770 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-etc-swift\") pod \"swift-ring-rebalance-ztbp9\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.443884 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-scripts\") pod \"swift-ring-rebalance-ztbp9\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.444383 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-ring-data-devices\") pod \"swift-ring-rebalance-ztbp9\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.445713 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-dispersionconf\") pod \"swift-ring-rebalance-ztbp9\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.451212 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-combined-ca-bundle\") pod \"swift-ring-rebalance-ztbp9\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.452069 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-swiftconf\") pod 
\"swift-ring-rebalance-ztbp9\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.471672 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmzv9\" (UniqueName: \"kubernetes.io/projected/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-kube-api-access-gmzv9\") pod \"swift-ring-rebalance-ztbp9\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.543503 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.719166 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.759999 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.856363 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdcst\" (UniqueName: \"kubernetes.io/projected/cddd3df9-e505-4f25-988d-8cba87eaefbe-kube-api-access-cdcst\") pod \"cddd3df9-e505-4f25-988d-8cba87eaefbe\" (UID: \"cddd3df9-e505-4f25-988d-8cba87eaefbe\") " Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.856649 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cddd3df9-e505-4f25-988d-8cba87eaefbe-config\") pod \"cddd3df9-e505-4f25-988d-8cba87eaefbe\" (UID: \"cddd3df9-e505-4f25-988d-8cba87eaefbe\") " Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.856757 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/cddd3df9-e505-4f25-988d-8cba87eaefbe-ovsdbserver-nb\") pod \"cddd3df9-e505-4f25-988d-8cba87eaefbe\" (UID: \"cddd3df9-e505-4f25-988d-8cba87eaefbe\") " Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.857123 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cddd3df9-e505-4f25-988d-8cba87eaefbe-dns-svc\") pod \"cddd3df9-e505-4f25-988d-8cba87eaefbe\" (UID: \"cddd3df9-e505-4f25-988d-8cba87eaefbe\") " Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.871303 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cddd3df9-e505-4f25-988d-8cba87eaefbe-kube-api-access-cdcst" (OuterVolumeSpecName: "kube-api-access-cdcst") pod "cddd3df9-e505-4f25-988d-8cba87eaefbe" (UID: "cddd3df9-e505-4f25-988d-8cba87eaefbe"). InnerVolumeSpecName "kube-api-access-cdcst". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.912977 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cddd3df9-e505-4f25-988d-8cba87eaefbe-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cddd3df9-e505-4f25-988d-8cba87eaefbe" (UID: "cddd3df9-e505-4f25-988d-8cba87eaefbe"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.932720 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cddd3df9-e505-4f25-988d-8cba87eaefbe-config" (OuterVolumeSpecName: "config") pod "cddd3df9-e505-4f25-988d-8cba87eaefbe" (UID: "cddd3df9-e505-4f25-988d-8cba87eaefbe"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.949307 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cddd3df9-e505-4f25-988d-8cba87eaefbe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cddd3df9-e505-4f25-988d-8cba87eaefbe" (UID: "cddd3df9-e505-4f25-988d-8cba87eaefbe"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.957455 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz" event={"ID":"cddd3df9-e505-4f25-988d-8cba87eaefbe","Type":"ContainerDied","Data":"e6844355e00ec9e8aa5161a17239acb337d4e49d9f07d0b8c54a762df1aef8dd"} Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.957524 4898 scope.go:117] "RemoveContainer" containerID="7b56acf87143f75eb70ec0471ca713a46b1d9ecd5c78e828b6dff4507ed29234" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.957692 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.962013 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdcst\" (UniqueName: \"kubernetes.io/projected/cddd3df9-e505-4f25-988d-8cba87eaefbe-kube-api-access-cdcst\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.962097 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cddd3df9-e505-4f25-988d-8cba87eaefbe-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.962236 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cddd3df9-e505-4f25-988d-8cba87eaefbe-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.962361 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cddd3df9-e505-4f25-988d-8cba87eaefbe-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.962372 4898 generic.go:334] "Generic (PLEG): container finished" podID="a37db268-4fcb-45a7-a7bf-fae19a514257" containerID="de23e3ccb82eedd2170e1cd3b17cd796af8186141a4e3a6a0df25cf87c1ac689" exitCode=0 Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.962392 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-2sp5q" event={"ID":"a37db268-4fcb-45a7-a7bf-fae19a514257","Type":"ContainerDied","Data":"de23e3ccb82eedd2170e1cd3b17cd796af8186141a4e3a6a0df25cf87c1ac689"} Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.971587 4898 generic.go:334] "Generic (PLEG): container finished" podID="45215dff-dfeb-4b68-bc5c-d36aba0ea6b8" containerID="190741a9e70699bd53ad4219ca7d5f504afce181f13fef8f81fc17d9e1a70095" exitCode=0 Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 
14:19:34.971667 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-n7qmc" event={"ID":"45215dff-dfeb-4b68-bc5c-d36aba0ea6b8","Type":"ContainerDied","Data":"190741a9e70699bd53ad4219ca7d5f504afce181f13fef8f81fc17d9e1a70095"} Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.973408 4898 generic.go:334] "Generic (PLEG): container finished" podID="81f4ee1a-c4d2-415d-9021-6503f03f8441" containerID="e6b8ff442a61a4fcf1565b308b6597fe09fb7264763f211d7540bb2a249e6c54" exitCode=0 Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.973477 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:34 crc kubenswrapper[4898]: I0313 14:19:34.974213 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-baf6-account-create-update-xhptm" event={"ID":"81f4ee1a-c4d2-415d-9021-6503f03f8441","Type":"ContainerDied","Data":"e6b8ff442a61a4fcf1565b308b6597fe09fb7264763f211d7540bb2a249e6c54"} Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.082918 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.092860 4898 scope.go:117] "RemoveContainer" containerID="31ee06ecd554c7204a7ea9b6ed5158bbdc38532b41e7043447da0dc27f024036" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.131938 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-ztbp9"] Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.141413 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-gm2pz"] Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.151642 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-gm2pz"] Mar 13 14:19:35 crc kubenswrapper[4898]: W0313 14:19:35.154293 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a6f0bfb_5db5_440c_a93f_0d6fe159401d.slice/crio-7e54bc8fd0c558fc45ec5afd29c72815086cd68050f8ecbef97610bde93e64ce WatchSource:0}: Error finding container 7e54bc8fd0c558fc45ec5afd29c72815086cd68050f8ecbef97610bde93e64ce: Status 404 returned error can't find the container with id 7e54bc8fd0c558fc45ec5afd29c72815086cd68050f8ecbef97610bde93e64ce Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.169360 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/28c184e1-ac85-4c3b-b138-3b728eb97ca3-dispersionconf\") pod \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.169486 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28c184e1-ac85-4c3b-b138-3b728eb97ca3-combined-ca-bundle\") pod \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " Mar 13 
14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.169593 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/28c184e1-ac85-4c3b-b138-3b728eb97ca3-ring-data-devices\") pod \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.169661 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/28c184e1-ac85-4c3b-b138-3b728eb97ca3-swiftconf\") pod \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.169752 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/28c184e1-ac85-4c3b-b138-3b728eb97ca3-etc-swift\") pod \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.169890 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4bwg\" (UniqueName: \"kubernetes.io/projected/28c184e1-ac85-4c3b-b138-3b728eb97ca3-kube-api-access-q4bwg\") pod \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.170022 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28c184e1-ac85-4c3b-b138-3b728eb97ca3-scripts\") pod \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\" (UID: \"28c184e1-ac85-4c3b-b138-3b728eb97ca3\") " Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.170626 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28c184e1-ac85-4c3b-b138-3b728eb97ca3-scripts" (OuterVolumeSpecName: "scripts") pod 
"28c184e1-ac85-4c3b-b138-3b728eb97ca3" (UID: "28c184e1-ac85-4c3b-b138-3b728eb97ca3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.170662 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28c184e1-ac85-4c3b-b138-3b728eb97ca3-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "28c184e1-ac85-4c3b-b138-3b728eb97ca3" (UID: "28c184e1-ac85-4c3b-b138-3b728eb97ca3"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.172566 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28c184e1-ac85-4c3b-b138-3b728eb97ca3-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "28c184e1-ac85-4c3b-b138-3b728eb97ca3" (UID: "28c184e1-ac85-4c3b-b138-3b728eb97ca3"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.174455 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28c184e1-ac85-4c3b-b138-3b728eb97ca3-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "28c184e1-ac85-4c3b-b138-3b728eb97ca3" (UID: "28c184e1-ac85-4c3b-b138-3b728eb97ca3"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.175108 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28c184e1-ac85-4c3b-b138-3b728eb97ca3-kube-api-access-q4bwg" (OuterVolumeSpecName: "kube-api-access-q4bwg") pod "28c184e1-ac85-4c3b-b138-3b728eb97ca3" (UID: "28c184e1-ac85-4c3b-b138-3b728eb97ca3"). InnerVolumeSpecName "kube-api-access-q4bwg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.175576 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28c184e1-ac85-4c3b-b138-3b728eb97ca3-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "28c184e1-ac85-4c3b-b138-3b728eb97ca3" (UID: "28c184e1-ac85-4c3b-b138-3b728eb97ca3"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.176289 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28c184e1-ac85-4c3b-b138-3b728eb97ca3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28c184e1-ac85-4c3b-b138-3b728eb97ca3" (UID: "28c184e1-ac85-4c3b-b138-3b728eb97ca3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.272038 4898 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/28c184e1-ac85-4c3b-b138-3b728eb97ca3-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.272075 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28c184e1-ac85-4c3b-b138-3b728eb97ca3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.272091 4898 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/28c184e1-ac85-4c3b-b138-3b728eb97ca3-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.272103 4898 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/28c184e1-ac85-4c3b-b138-3b728eb97ca3-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 13 
14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.272114 4898 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/28c184e1-ac85-4c3b-b138-3b728eb97ca3-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.272126 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4bwg\" (UniqueName: \"kubernetes.io/projected/28c184e1-ac85-4c3b-b138-3b728eb97ca3-kube-api-access-q4bwg\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.272136 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28c184e1-ac85-4c3b-b138-3b728eb97ca3-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.373326 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-cdnq7"] Mar 13 14:19:35 crc kubenswrapper[4898]: E0313 14:19:35.373853 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cddd3df9-e505-4f25-988d-8cba87eaefbe" containerName="init" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.373876 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="cddd3df9-e505-4f25-988d-8cba87eaefbe" containerName="init" Mar 13 14:19:35 crc kubenswrapper[4898]: E0313 14:19:35.373950 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cddd3df9-e505-4f25-988d-8cba87eaefbe" containerName="dnsmasq-dns" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.373969 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="cddd3df9-e505-4f25-988d-8cba87eaefbe" containerName="dnsmasq-dns" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.374225 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="cddd3df9-e505-4f25-988d-8cba87eaefbe" containerName="dnsmasq-dns" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.374302 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-etc-swift\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:35 crc kubenswrapper[4898]: E0313 14:19:35.374522 4898 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 14:19:35 crc kubenswrapper[4898]: E0313 14:19:35.374558 4898 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 14:19:35 crc kubenswrapper[4898]: E0313 14:19:35.374622 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-etc-swift podName:794bd82b-e289-4b31-b0cf-f1285452e783 nodeName:}" failed. No retries permitted until 2026-03-13 14:19:37.374602394 +0000 UTC m=+1412.376190633 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-etc-swift") pod "swift-storage-0" (UID: "794bd82b-e289-4b31-b0cf-f1285452e783") : configmap "swift-ring-files" not found Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.375131 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-cdnq7" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.383842 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-cdnq7"] Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.476077 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59bdafe7-9c43-4acc-a212-864bdf38d5b4-operator-scripts\") pod \"glance-db-create-cdnq7\" (UID: \"59bdafe7-9c43-4acc-a212-864bdf38d5b4\") " pod="openstack/glance-db-create-cdnq7" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.476202 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68d6j\" (UniqueName: \"kubernetes.io/projected/59bdafe7-9c43-4acc-a212-864bdf38d5b4-kube-api-access-68d6j\") pod \"glance-db-create-cdnq7\" (UID: \"59bdafe7-9c43-4acc-a212-864bdf38d5b4\") " pod="openstack/glance-db-create-cdnq7" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.481836 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-621e-account-create-update-dksd9"] Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.483602 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-621e-account-create-update-dksd9" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.485627 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.492283 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-621e-account-create-update-dksd9"] Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.578506 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwcdp\" (UniqueName: \"kubernetes.io/projected/a8c46fcc-fd9b-4073-99e6-28aadcdd823e-kube-api-access-rwcdp\") pod \"glance-621e-account-create-update-dksd9\" (UID: \"a8c46fcc-fd9b-4073-99e6-28aadcdd823e\") " pod="openstack/glance-621e-account-create-update-dksd9" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.578559 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59bdafe7-9c43-4acc-a212-864bdf38d5b4-operator-scripts\") pod \"glance-db-create-cdnq7\" (UID: \"59bdafe7-9c43-4acc-a212-864bdf38d5b4\") " pod="openstack/glance-db-create-cdnq7" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.578678 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68d6j\" (UniqueName: \"kubernetes.io/projected/59bdafe7-9c43-4acc-a212-864bdf38d5b4-kube-api-access-68d6j\") pod \"glance-db-create-cdnq7\" (UID: \"59bdafe7-9c43-4acc-a212-864bdf38d5b4\") " pod="openstack/glance-db-create-cdnq7" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.578706 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8c46fcc-fd9b-4073-99e6-28aadcdd823e-operator-scripts\") pod \"glance-621e-account-create-update-dksd9\" (UID: 
\"a8c46fcc-fd9b-4073-99e6-28aadcdd823e\") " pod="openstack/glance-621e-account-create-update-dksd9" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.579235 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59bdafe7-9c43-4acc-a212-864bdf38d5b4-operator-scripts\") pod \"glance-db-create-cdnq7\" (UID: \"59bdafe7-9c43-4acc-a212-864bdf38d5b4\") " pod="openstack/glance-db-create-cdnq7" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.597090 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68d6j\" (UniqueName: \"kubernetes.io/projected/59bdafe7-9c43-4acc-a212-864bdf38d5b4-kube-api-access-68d6j\") pod \"glance-db-create-cdnq7\" (UID: \"59bdafe7-9c43-4acc-a212-864bdf38d5b4\") " pod="openstack/glance-db-create-cdnq7" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.681019 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwcdp\" (UniqueName: \"kubernetes.io/projected/a8c46fcc-fd9b-4073-99e6-28aadcdd823e-kube-api-access-rwcdp\") pod \"glance-621e-account-create-update-dksd9\" (UID: \"a8c46fcc-fd9b-4073-99e6-28aadcdd823e\") " pod="openstack/glance-621e-account-create-update-dksd9" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.681178 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8c46fcc-fd9b-4073-99e6-28aadcdd823e-operator-scripts\") pod \"glance-621e-account-create-update-dksd9\" (UID: \"a8c46fcc-fd9b-4073-99e6-28aadcdd823e\") " pod="openstack/glance-621e-account-create-update-dksd9" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.681945 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8c46fcc-fd9b-4073-99e6-28aadcdd823e-operator-scripts\") pod \"glance-621e-account-create-update-dksd9\" 
(UID: \"a8c46fcc-fd9b-4073-99e6-28aadcdd823e\") " pod="openstack/glance-621e-account-create-update-dksd9" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.698929 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwcdp\" (UniqueName: \"kubernetes.io/projected/a8c46fcc-fd9b-4073-99e6-28aadcdd823e-kube-api-access-rwcdp\") pod \"glance-621e-account-create-update-dksd9\" (UID: \"a8c46fcc-fd9b-4073-99e6-28aadcdd823e\") " pod="openstack/glance-621e-account-create-update-dksd9" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.709090 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-cdnq7" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.763354 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cddd3df9-e505-4f25-988d-8cba87eaefbe" path="/var/lib/kubelet/pods/cddd3df9-e505-4f25-988d-8cba87eaefbe/volumes" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.803762 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-621e-account-create-update-dksd9" Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.996224 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-ztbp9" event={"ID":"4a6f0bfb-5db5-440c-a93f-0d6fe159401d","Type":"ContainerStarted","Data":"7e54bc8fd0c558fc45ec5afd29c72815086cd68050f8ecbef97610bde93e64ce"} Mar 13 14:19:35 crc kubenswrapper[4898]: I0313 14:19:35.999213 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-2sp5q" event={"ID":"a37db268-4fcb-45a7-a7bf-fae19a514257","Type":"ContainerStarted","Data":"50431c87d4fda7b6d1207e4343981bd67d4f748124f89954c845f5c4fb0d25f7"} Mar 13 14:19:36 crc kubenswrapper[4898]: I0313 14:19:35.999420 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-m9wx7" Mar 13 14:19:36 crc kubenswrapper[4898]: I0313 14:19:35.999589 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-2sp5q" Mar 13 14:19:36 crc kubenswrapper[4898]: I0313 14:19:36.020693 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-2sp5q" podStartSLOduration=4.020668958 podStartE2EDuration="4.020668958s" podCreationTimestamp="2026-03-13 14:19:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:19:36.014108898 +0000 UTC m=+1411.015697157" watchObservedRunningTime="2026-03-13 14:19:36.020668958 +0000 UTC m=+1411.022257197" Mar 13 14:19:36 crc kubenswrapper[4898]: I0313 14:19:36.079734 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-m9wx7"] Mar 13 14:19:36 crc kubenswrapper[4898]: I0313 14:19:36.094919 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-m9wx7"] Mar 13 14:19:36 crc kubenswrapper[4898]: I0313 14:19:36.202259 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-cdnq7"] Mar 13 14:19:36 crc kubenswrapper[4898]: I0313 14:19:36.411989 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-621e-account-create-update-dksd9"] Mar 13 14:19:36 crc kubenswrapper[4898]: I0313 14:19:36.794383 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-baf6-account-create-update-xhptm" Mar 13 14:19:36 crc kubenswrapper[4898]: I0313 14:19:36.798124 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-n7qmc" Mar 13 14:19:36 crc kubenswrapper[4898]: I0313 14:19:36.955837 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45215dff-dfeb-4b68-bc5c-d36aba0ea6b8-operator-scripts\") pod \"45215dff-dfeb-4b68-bc5c-d36aba0ea6b8\" (UID: \"45215dff-dfeb-4b68-bc5c-d36aba0ea6b8\") " Mar 13 14:19:36 crc kubenswrapper[4898]: I0313 14:19:36.956448 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9b8hp\" (UniqueName: \"kubernetes.io/projected/81f4ee1a-c4d2-415d-9021-6503f03f8441-kube-api-access-9b8hp\") pod \"81f4ee1a-c4d2-415d-9021-6503f03f8441\" (UID: \"81f4ee1a-c4d2-415d-9021-6503f03f8441\") " Mar 13 14:19:36 crc kubenswrapper[4898]: I0313 14:19:36.956542 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qgsl\" (UniqueName: \"kubernetes.io/projected/45215dff-dfeb-4b68-bc5c-d36aba0ea6b8-kube-api-access-8qgsl\") pod \"45215dff-dfeb-4b68-bc5c-d36aba0ea6b8\" (UID: \"45215dff-dfeb-4b68-bc5c-d36aba0ea6b8\") " Mar 13 14:19:36 crc kubenswrapper[4898]: I0313 14:19:36.956602 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81f4ee1a-c4d2-415d-9021-6503f03f8441-operator-scripts\") pod \"81f4ee1a-c4d2-415d-9021-6503f03f8441\" (UID: \"81f4ee1a-c4d2-415d-9021-6503f03f8441\") " Mar 13 14:19:36 crc kubenswrapper[4898]: I0313 14:19:36.957509 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81f4ee1a-c4d2-415d-9021-6503f03f8441-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "81f4ee1a-c4d2-415d-9021-6503f03f8441" (UID: "81f4ee1a-c4d2-415d-9021-6503f03f8441"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:36 crc kubenswrapper[4898]: I0313 14:19:36.957615 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45215dff-dfeb-4b68-bc5c-d36aba0ea6b8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "45215dff-dfeb-4b68-bc5c-d36aba0ea6b8" (UID: "45215dff-dfeb-4b68-bc5c-d36aba0ea6b8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:36 crc kubenswrapper[4898]: I0313 14:19:36.958100 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81f4ee1a-c4d2-415d-9021-6503f03f8441-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:36 crc kubenswrapper[4898]: I0313 14:19:36.958125 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45215dff-dfeb-4b68-bc5c-d36aba0ea6b8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:36 crc kubenswrapper[4898]: I0313 14:19:36.962196 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81f4ee1a-c4d2-415d-9021-6503f03f8441-kube-api-access-9b8hp" (OuterVolumeSpecName: "kube-api-access-9b8hp") pod "81f4ee1a-c4d2-415d-9021-6503f03f8441" (UID: "81f4ee1a-c4d2-415d-9021-6503f03f8441"). InnerVolumeSpecName "kube-api-access-9b8hp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:19:36 crc kubenswrapper[4898]: I0313 14:19:36.971500 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45215dff-dfeb-4b68-bc5c-d36aba0ea6b8-kube-api-access-8qgsl" (OuterVolumeSpecName: "kube-api-access-8qgsl") pod "45215dff-dfeb-4b68-bc5c-d36aba0ea6b8" (UID: "45215dff-dfeb-4b68-bc5c-d36aba0ea6b8"). InnerVolumeSpecName "kube-api-access-8qgsl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.008731 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-55n8q"] Mar 13 14:19:37 crc kubenswrapper[4898]: E0313 14:19:37.009468 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45215dff-dfeb-4b68-bc5c-d36aba0ea6b8" containerName="mariadb-database-create" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.009598 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="45215dff-dfeb-4b68-bc5c-d36aba0ea6b8" containerName="mariadb-database-create" Mar 13 14:19:37 crc kubenswrapper[4898]: E0313 14:19:37.009681 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81f4ee1a-c4d2-415d-9021-6503f03f8441" containerName="mariadb-account-create-update" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.009773 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="81f4ee1a-c4d2-415d-9021-6503f03f8441" containerName="mariadb-account-create-update" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.010107 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="45215dff-dfeb-4b68-bc5c-d36aba0ea6b8" containerName="mariadb-database-create" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.010227 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="81f4ee1a-c4d2-415d-9021-6503f03f8441" containerName="mariadb-account-create-update" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.011695 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-55n8q" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.015863 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.021243 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-55n8q"] Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.028965 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-n7qmc" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.028994 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-n7qmc" event={"ID":"45215dff-dfeb-4b68-bc5c-d36aba0ea6b8","Type":"ContainerDied","Data":"950bfa55031ccf3b1124b837e6461f7251be7748323abc050c86691a891901f7"} Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.032491 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="950bfa55031ccf3b1124b837e6461f7251be7748323abc050c86691a891901f7" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.040682 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-baf6-account-create-update-xhptm" event={"ID":"81f4ee1a-c4d2-415d-9021-6503f03f8441","Type":"ContainerDied","Data":"fc5174c2bf5424ca7939a87e54c12b86416a2de52f15aa0500110576232c34fd"} Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.040731 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc5174c2bf5424ca7939a87e54c12b86416a2de52f15aa0500110576232c34fd" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.040800 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-baf6-account-create-update-xhptm" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.047669 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-621e-account-create-update-dksd9" event={"ID":"a8c46fcc-fd9b-4073-99e6-28aadcdd823e","Type":"ContainerStarted","Data":"1b7f5b79b9cbc006ae8ac33cf3d709c72ff92b310ff2867c772246e7be6d5aff"} Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.047741 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-621e-account-create-update-dksd9" event={"ID":"a8c46fcc-fd9b-4073-99e6-28aadcdd823e","Type":"ContainerStarted","Data":"00edd84a4cb86280e20ee727db08d5b81851d9132863c79d667d12f0d3c65999"} Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.052082 4898 generic.go:334] "Generic (PLEG): container finished" podID="59bdafe7-9c43-4acc-a212-864bdf38d5b4" containerID="9199e9b5bfad44aa55bebdeb17820a815d88bcb2d174de10793bf2e6e2845fc2" exitCode=0 Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.052234 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-cdnq7" event={"ID":"59bdafe7-9c43-4acc-a212-864bdf38d5b4","Type":"ContainerDied","Data":"9199e9b5bfad44aa55bebdeb17820a815d88bcb2d174de10793bf2e6e2845fc2"} Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.052325 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-cdnq7" event={"ID":"59bdafe7-9c43-4acc-a212-864bdf38d5b4","Type":"ContainerStarted","Data":"341d71b9d52042a949c7679712659dfd1f85d596c533f795b07d24d74bb3c431"} Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.063658 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-621e-account-create-update-dksd9" podStartSLOduration=2.063631705 podStartE2EDuration="2.063631705s" podCreationTimestamp="2026-03-13 14:19:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:19:37.06306678 +0000 UTC m=+1412.064655029" watchObservedRunningTime="2026-03-13 14:19:37.063631705 +0000 UTC m=+1412.065219954" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.064476 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9b8hp\" (UniqueName: \"kubernetes.io/projected/81f4ee1a-c4d2-415d-9021-6503f03f8441-kube-api-access-9b8hp\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.064509 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qgsl\" (UniqueName: \"kubernetes.io/projected/45215dff-dfeb-4b68-bc5c-d36aba0ea6b8-kube-api-access-8qgsl\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.166421 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91512eed-d544-4f70-b8ba-eda9f6b1bfef-operator-scripts\") pod \"root-account-create-update-55n8q\" (UID: \"91512eed-d544-4f70-b8ba-eda9f6b1bfef\") " pod="openstack/root-account-create-update-55n8q" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.166623 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8dnp\" (UniqueName: \"kubernetes.io/projected/91512eed-d544-4f70-b8ba-eda9f6b1bfef-kube-api-access-z8dnp\") pod \"root-account-create-update-55n8q\" (UID: \"91512eed-d544-4f70-b8ba-eda9f6b1bfef\") " pod="openstack/root-account-create-update-55n8q" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.268455 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8dnp\" (UniqueName: \"kubernetes.io/projected/91512eed-d544-4f70-b8ba-eda9f6b1bfef-kube-api-access-z8dnp\") pod \"root-account-create-update-55n8q\" (UID: \"91512eed-d544-4f70-b8ba-eda9f6b1bfef\") " 
pod="openstack/root-account-create-update-55n8q" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.268617 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91512eed-d544-4f70-b8ba-eda9f6b1bfef-operator-scripts\") pod \"root-account-create-update-55n8q\" (UID: \"91512eed-d544-4f70-b8ba-eda9f6b1bfef\") " pod="openstack/root-account-create-update-55n8q" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.269337 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91512eed-d544-4f70-b8ba-eda9f6b1bfef-operator-scripts\") pod \"root-account-create-update-55n8q\" (UID: \"91512eed-d544-4f70-b8ba-eda9f6b1bfef\") " pod="openstack/root-account-create-update-55n8q" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.285599 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8dnp\" (UniqueName: \"kubernetes.io/projected/91512eed-d544-4f70-b8ba-eda9f6b1bfef-kube-api-access-z8dnp\") pod \"root-account-create-update-55n8q\" (UID: \"91512eed-d544-4f70-b8ba-eda9f6b1bfef\") " pod="openstack/root-account-create-update-55n8q" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.391959 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-55n8q" Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.474356 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-etc-swift\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:37 crc kubenswrapper[4898]: E0313 14:19:37.474540 4898 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 14:19:37 crc kubenswrapper[4898]: E0313 14:19:37.474581 4898 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 14:19:37 crc kubenswrapper[4898]: E0313 14:19:37.474653 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-etc-swift podName:794bd82b-e289-4b31-b0cf-f1285452e783 nodeName:}" failed. No retries permitted until 2026-03-13 14:19:41.474632515 +0000 UTC m=+1416.476220754 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-etc-swift") pod "swift-storage-0" (UID: "794bd82b-e289-4b31-b0cf-f1285452e783") : configmap "swift-ring-files" not found Mar 13 14:19:37 crc kubenswrapper[4898]: I0313 14:19:37.757426 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28c184e1-ac85-4c3b-b138-3b728eb97ca3" path="/var/lib/kubelet/pods/28c184e1-ac85-4c3b-b138-3b728eb97ca3/volumes" Mar 13 14:19:38 crc kubenswrapper[4898]: I0313 14:19:38.063727 4898 generic.go:334] "Generic (PLEG): container finished" podID="a8c46fcc-fd9b-4073-99e6-28aadcdd823e" containerID="1b7f5b79b9cbc006ae8ac33cf3d709c72ff92b310ff2867c772246e7be6d5aff" exitCode=0 Mar 13 14:19:38 crc kubenswrapper[4898]: I0313 14:19:38.063768 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-621e-account-create-update-dksd9" event={"ID":"a8c46fcc-fd9b-4073-99e6-28aadcdd823e","Type":"ContainerDied","Data":"1b7f5b79b9cbc006ae8ac33cf3d709c72ff92b310ff2867c772246e7be6d5aff"} Mar 13 14:19:39 crc kubenswrapper[4898]: I0313 14:19:39.458890 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-cdnq7" Mar 13 14:19:39 crc kubenswrapper[4898]: I0313 14:19:39.516108 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68d6j\" (UniqueName: \"kubernetes.io/projected/59bdafe7-9c43-4acc-a212-864bdf38d5b4-kube-api-access-68d6j\") pod \"59bdafe7-9c43-4acc-a212-864bdf38d5b4\" (UID: \"59bdafe7-9c43-4acc-a212-864bdf38d5b4\") " Mar 13 14:19:39 crc kubenswrapper[4898]: I0313 14:19:39.516242 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59bdafe7-9c43-4acc-a212-864bdf38d5b4-operator-scripts\") pod \"59bdafe7-9c43-4acc-a212-864bdf38d5b4\" (UID: \"59bdafe7-9c43-4acc-a212-864bdf38d5b4\") " Mar 13 14:19:39 crc kubenswrapper[4898]: I0313 14:19:39.517320 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59bdafe7-9c43-4acc-a212-864bdf38d5b4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "59bdafe7-9c43-4acc-a212-864bdf38d5b4" (UID: "59bdafe7-9c43-4acc-a212-864bdf38d5b4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:39 crc kubenswrapper[4898]: I0313 14:19:39.519117 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-621e-account-create-update-dksd9" Mar 13 14:19:39 crc kubenswrapper[4898]: I0313 14:19:39.522792 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59bdafe7-9c43-4acc-a212-864bdf38d5b4-kube-api-access-68d6j" (OuterVolumeSpecName: "kube-api-access-68d6j") pod "59bdafe7-9c43-4acc-a212-864bdf38d5b4" (UID: "59bdafe7-9c43-4acc-a212-864bdf38d5b4"). InnerVolumeSpecName "kube-api-access-68d6j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:19:39 crc kubenswrapper[4898]: I0313 14:19:39.534636 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7fd796d7df-gm2pz" podUID="cddd3df9-e505-4f25-988d-8cba87eaefbe" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.149:5353: i/o timeout" Mar 13 14:19:39 crc kubenswrapper[4898]: I0313 14:19:39.621282 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwcdp\" (UniqueName: \"kubernetes.io/projected/a8c46fcc-fd9b-4073-99e6-28aadcdd823e-kube-api-access-rwcdp\") pod \"a8c46fcc-fd9b-4073-99e6-28aadcdd823e\" (UID: \"a8c46fcc-fd9b-4073-99e6-28aadcdd823e\") " Mar 13 14:19:39 crc kubenswrapper[4898]: I0313 14:19:39.621936 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8c46fcc-fd9b-4073-99e6-28aadcdd823e-operator-scripts\") pod \"a8c46fcc-fd9b-4073-99e6-28aadcdd823e\" (UID: \"a8c46fcc-fd9b-4073-99e6-28aadcdd823e\") " Mar 13 14:19:39 crc kubenswrapper[4898]: I0313 14:19:39.622418 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59bdafe7-9c43-4acc-a212-864bdf38d5b4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:39 crc kubenswrapper[4898]: I0313 14:19:39.622438 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68d6j\" (UniqueName: \"kubernetes.io/projected/59bdafe7-9c43-4acc-a212-864bdf38d5b4-kube-api-access-68d6j\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:39 crc kubenswrapper[4898]: I0313 14:19:39.623227 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8c46fcc-fd9b-4073-99e6-28aadcdd823e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a8c46fcc-fd9b-4073-99e6-28aadcdd823e" (UID: "a8c46fcc-fd9b-4073-99e6-28aadcdd823e"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:39 crc kubenswrapper[4898]: I0313 14:19:39.628070 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8c46fcc-fd9b-4073-99e6-28aadcdd823e-kube-api-access-rwcdp" (OuterVolumeSpecName: "kube-api-access-rwcdp") pod "a8c46fcc-fd9b-4073-99e6-28aadcdd823e" (UID: "a8c46fcc-fd9b-4073-99e6-28aadcdd823e"). InnerVolumeSpecName "kube-api-access-rwcdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:19:39 crc kubenswrapper[4898]: I0313 14:19:39.724928 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8c46fcc-fd9b-4073-99e6-28aadcdd823e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:39 crc kubenswrapper[4898]: I0313 14:19:39.725153 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwcdp\" (UniqueName: \"kubernetes.io/projected/a8c46fcc-fd9b-4073-99e6-28aadcdd823e-kube-api-access-rwcdp\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:39 crc kubenswrapper[4898]: I0313 14:19:39.846268 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-55n8q"] Mar 13 14:19:40 crc kubenswrapper[4898]: I0313 14:19:40.085247 4898 generic.go:334] "Generic (PLEG): container finished" podID="1e6f6f0d-db24-4fdb-a872-ce2c527a791b" containerID="285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b" exitCode=0 Mar 13 14:19:40 crc kubenswrapper[4898]: I0313 14:19:40.085325 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1e6f6f0d-db24-4fdb-a872-ce2c527a791b","Type":"ContainerDied","Data":"285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b"} Mar 13 14:19:40 crc kubenswrapper[4898]: I0313 14:19:40.090097 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-db-create-cdnq7" event={"ID":"59bdafe7-9c43-4acc-a212-864bdf38d5b4","Type":"ContainerDied","Data":"341d71b9d52042a949c7679712659dfd1f85d596c533f795b07d24d74bb3c431"} Mar 13 14:19:40 crc kubenswrapper[4898]: I0313 14:19:40.090132 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="341d71b9d52042a949c7679712659dfd1f85d596c533f795b07d24d74bb3c431" Mar 13 14:19:40 crc kubenswrapper[4898]: I0313 14:19:40.090192 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-cdnq7" Mar 13 14:19:40 crc kubenswrapper[4898]: I0313 14:19:40.095469 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-ztbp9" event={"ID":"4a6f0bfb-5db5-440c-a93f-0d6fe159401d","Type":"ContainerStarted","Data":"9e25ca1915d093420431c75152fe45db09d916d8afabcd6133622b3bfdcf8934"} Mar 13 14:19:40 crc kubenswrapper[4898]: I0313 14:19:40.099908 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-621e-account-create-update-dksd9" event={"ID":"a8c46fcc-fd9b-4073-99e6-28aadcdd823e","Type":"ContainerDied","Data":"00edd84a4cb86280e20ee727db08d5b81851d9132863c79d667d12f0d3c65999"} Mar 13 14:19:40 crc kubenswrapper[4898]: I0313 14:19:40.099939 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00edd84a4cb86280e20ee727db08d5b81851d9132863c79d667d12f0d3c65999" Mar 13 14:19:40 crc kubenswrapper[4898]: I0313 14:19:40.099982 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-621e-account-create-update-dksd9" Mar 13 14:19:40 crc kubenswrapper[4898]: I0313 14:19:40.103118 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-55n8q" event={"ID":"91512eed-d544-4f70-b8ba-eda9f6b1bfef","Type":"ContainerStarted","Data":"33dd2d6e0ac7d2f137fe32246deb8758bfab8a7e6e24808a6205586e1001969c"} Mar 13 14:19:40 crc kubenswrapper[4898]: I0313 14:19:40.103323 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-55n8q" event={"ID":"91512eed-d544-4f70-b8ba-eda9f6b1bfef","Type":"ContainerStarted","Data":"fd546e50d07fdb189d91bb5d0791e846219e0ffc737e0423be745421015de34e"} Mar 13 14:19:40 crc kubenswrapper[4898]: I0313 14:19:40.136848 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-ztbp9" podStartSLOduration=1.952848254 podStartE2EDuration="6.136830408s" podCreationTimestamp="2026-03-13 14:19:34 +0000 UTC" firstStartedPulling="2026-03-13 14:19:35.16066316 +0000 UTC m=+1410.162251399" lastFinishedPulling="2026-03-13 14:19:39.344645294 +0000 UTC m=+1414.346233553" observedRunningTime="2026-03-13 14:19:40.127920396 +0000 UTC m=+1415.129508635" watchObservedRunningTime="2026-03-13 14:19:40.136830408 +0000 UTC m=+1415.138418637" Mar 13 14:19:40 crc kubenswrapper[4898]: I0313 14:19:40.146398 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-55n8q" podStartSLOduration=4.146382046 podStartE2EDuration="4.146382046s" podCreationTimestamp="2026-03-13 14:19:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:19:40.140218766 +0000 UTC m=+1415.141807005" watchObservedRunningTime="2026-03-13 14:19:40.146382046 +0000 UTC m=+1415.147970285" Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.114813 4898 generic.go:334] 
"Generic (PLEG): container finished" podID="91512eed-d544-4f70-b8ba-eda9f6b1bfef" containerID="33dd2d6e0ac7d2f137fe32246deb8758bfab8a7e6e24808a6205586e1001969c" exitCode=0 Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.115981 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-55n8q" event={"ID":"91512eed-d544-4f70-b8ba-eda9f6b1bfef","Type":"ContainerDied","Data":"33dd2d6e0ac7d2f137fe32246deb8758bfab8a7e6e24808a6205586e1001969c"} Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.172795 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-ppqg7"] Mar 13 14:19:41 crc kubenswrapper[4898]: E0313 14:19:41.173628 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59bdafe7-9c43-4acc-a212-864bdf38d5b4" containerName="mariadb-database-create" Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.173655 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="59bdafe7-9c43-4acc-a212-864bdf38d5b4" containerName="mariadb-database-create" Mar 13 14:19:41 crc kubenswrapper[4898]: E0313 14:19:41.173733 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8c46fcc-fd9b-4073-99e6-28aadcdd823e" containerName="mariadb-account-create-update" Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.173745 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8c46fcc-fd9b-4073-99e6-28aadcdd823e" containerName="mariadb-account-create-update" Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.174227 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="59bdafe7-9c43-4acc-a212-864bdf38d5b4" containerName="mariadb-database-create" Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.174262 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8c46fcc-fd9b-4073-99e6-28aadcdd823e" containerName="mariadb-account-create-update" Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.175352 4898 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-ppqg7" Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.178004 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-ppqg7"] Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.253973 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-d2b7-account-create-update-ggzw8"] Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.255335 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d2b7-account-create-update-ggzw8" Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.255332 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc61df36-ac68-4cf0-9456-140bccb5435c-operator-scripts\") pod \"keystone-db-create-ppqg7\" (UID: \"bc61df36-ac68-4cf0-9456-140bccb5435c\") " pod="openstack/keystone-db-create-ppqg7" Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.255491 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vlfz\" (UniqueName: \"kubernetes.io/projected/bc61df36-ac68-4cf0-9456-140bccb5435c-kube-api-access-5vlfz\") pod \"keystone-db-create-ppqg7\" (UID: \"bc61df36-ac68-4cf0-9456-140bccb5435c\") " pod="openstack/keystone-db-create-ppqg7" Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.258201 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.262650 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d2b7-account-create-update-ggzw8"] Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.355750 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-zzflk"] Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 
14:19:41.357019 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vlfz\" (UniqueName: \"kubernetes.io/projected/bc61df36-ac68-4cf0-9456-140bccb5435c-kube-api-access-5vlfz\") pod \"keystone-db-create-ppqg7\" (UID: \"bc61df36-ac68-4cf0-9456-140bccb5435c\") " pod="openstack/keystone-db-create-ppqg7" Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.357074 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrszf\" (UniqueName: \"kubernetes.io/projected/32a060a9-dd52-4192-bc48-b9ea7a918458-kube-api-access-wrszf\") pod \"keystone-d2b7-account-create-update-ggzw8\" (UID: \"32a060a9-dd52-4192-bc48-b9ea7a918458\") " pod="openstack/keystone-d2b7-account-create-update-ggzw8" Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.357223 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32a060a9-dd52-4192-bc48-b9ea7a918458-operator-scripts\") pod \"keystone-d2b7-account-create-update-ggzw8\" (UID: \"32a060a9-dd52-4192-bc48-b9ea7a918458\") " pod="openstack/keystone-d2b7-account-create-update-ggzw8" Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.357258 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc61df36-ac68-4cf0-9456-140bccb5435c-operator-scripts\") pod \"keystone-db-create-ppqg7\" (UID: \"bc61df36-ac68-4cf0-9456-140bccb5435c\") " pod="openstack/keystone-db-create-ppqg7" Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.357337 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-zzflk" Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.357890 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc61df36-ac68-4cf0-9456-140bccb5435c-operator-scripts\") pod \"keystone-db-create-ppqg7\" (UID: \"bc61df36-ac68-4cf0-9456-140bccb5435c\") " pod="openstack/keystone-db-create-ppqg7" Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.366129 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-zzflk"] Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.396451 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vlfz\" (UniqueName: \"kubernetes.io/projected/bc61df36-ac68-4cf0-9456-140bccb5435c-kube-api-access-5vlfz\") pod \"keystone-db-create-ppqg7\" (UID: \"bc61df36-ac68-4cf0-9456-140bccb5435c\") " pod="openstack/keystone-db-create-ppqg7" Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.456036 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-35b4-account-create-update-7rdfs"] Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.457551 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-35b4-account-create-update-7rdfs" Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.459306 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32a060a9-dd52-4192-bc48-b9ea7a918458-operator-scripts\") pod \"keystone-d2b7-account-create-update-ggzw8\" (UID: \"32a060a9-dd52-4192-bc48-b9ea7a918458\") " pod="openstack/keystone-d2b7-account-create-update-ggzw8" Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.459402 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jv5f\" (UniqueName: \"kubernetes.io/projected/f58c984f-f43f-42dc-90a5-aebbe79a47a5-kube-api-access-9jv5f\") pod \"placement-db-create-zzflk\" (UID: \"f58c984f-f43f-42dc-90a5-aebbe79a47a5\") " pod="openstack/placement-db-create-zzflk" Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.459444 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrszf\" (UniqueName: \"kubernetes.io/projected/32a060a9-dd52-4192-bc48-b9ea7a918458-kube-api-access-wrszf\") pod \"keystone-d2b7-account-create-update-ggzw8\" (UID: \"32a060a9-dd52-4192-bc48-b9ea7a918458\") " pod="openstack/keystone-d2b7-account-create-update-ggzw8" Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.459545 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f58c984f-f43f-42dc-90a5-aebbe79a47a5-operator-scripts\") pod \"placement-db-create-zzflk\" (UID: \"f58c984f-f43f-42dc-90a5-aebbe79a47a5\") " pod="openstack/placement-db-create-zzflk" Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.460383 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/32a060a9-dd52-4192-bc48-b9ea7a918458-operator-scripts\") pod \"keystone-d2b7-account-create-update-ggzw8\" (UID: \"32a060a9-dd52-4192-bc48-b9ea7a918458\") " pod="openstack/keystone-d2b7-account-create-update-ggzw8" Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.461851 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.482031 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-35b4-account-create-update-7rdfs"] Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.483135 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrszf\" (UniqueName: \"kubernetes.io/projected/32a060a9-dd52-4192-bc48-b9ea7a918458-kube-api-access-wrszf\") pod \"keystone-d2b7-account-create-update-ggzw8\" (UID: \"32a060a9-dd52-4192-bc48-b9ea7a918458\") " pod="openstack/keystone-d2b7-account-create-update-ggzw8" Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.499539 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-ppqg7" Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.561280 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bf799d3-e4d4-439d-b3da-d5467064f6f1-operator-scripts\") pod \"placement-35b4-account-create-update-7rdfs\" (UID: \"0bf799d3-e4d4-439d-b3da-d5467064f6f1\") " pod="openstack/placement-35b4-account-create-update-7rdfs" Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.561395 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-etc-swift\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.561472 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f58c984f-f43f-42dc-90a5-aebbe79a47a5-operator-scripts\") pod \"placement-db-create-zzflk\" (UID: \"f58c984f-f43f-42dc-90a5-aebbe79a47a5\") " pod="openstack/placement-db-create-zzflk" Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.561700 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jv5f\" (UniqueName: \"kubernetes.io/projected/f58c984f-f43f-42dc-90a5-aebbe79a47a5-kube-api-access-9jv5f\") pod \"placement-db-create-zzflk\" (UID: \"f58c984f-f43f-42dc-90a5-aebbe79a47a5\") " pod="openstack/placement-db-create-zzflk" Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.561783 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksxl8\" (UniqueName: \"kubernetes.io/projected/0bf799d3-e4d4-439d-b3da-d5467064f6f1-kube-api-access-ksxl8\") pod \"placement-35b4-account-create-update-7rdfs\" (UID: 
\"0bf799d3-e4d4-439d-b3da-d5467064f6f1\") " pod="openstack/placement-35b4-account-create-update-7rdfs" Mar 13 14:19:41 crc kubenswrapper[4898]: E0313 14:19:41.562013 4898 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 14:19:41 crc kubenswrapper[4898]: E0313 14:19:41.562036 4898 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 14:19:41 crc kubenswrapper[4898]: E0313 14:19:41.562082 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-etc-swift podName:794bd82b-e289-4b31-b0cf-f1285452e783 nodeName:}" failed. No retries permitted until 2026-03-13 14:19:49.562065218 +0000 UTC m=+1424.563653457 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-etc-swift") pod "swift-storage-0" (UID: "794bd82b-e289-4b31-b0cf-f1285452e783") : configmap "swift-ring-files" not found Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.563083 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f58c984f-f43f-42dc-90a5-aebbe79a47a5-operator-scripts\") pod \"placement-db-create-zzflk\" (UID: \"f58c984f-f43f-42dc-90a5-aebbe79a47a5\") " pod="openstack/placement-db-create-zzflk" Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.577587 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-d2b7-account-create-update-ggzw8" Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.586418 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jv5f\" (UniqueName: \"kubernetes.io/projected/f58c984f-f43f-42dc-90a5-aebbe79a47a5-kube-api-access-9jv5f\") pod \"placement-db-create-zzflk\" (UID: \"f58c984f-f43f-42dc-90a5-aebbe79a47a5\") " pod="openstack/placement-db-create-zzflk" Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.664063 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksxl8\" (UniqueName: \"kubernetes.io/projected/0bf799d3-e4d4-439d-b3da-d5467064f6f1-kube-api-access-ksxl8\") pod \"placement-35b4-account-create-update-7rdfs\" (UID: \"0bf799d3-e4d4-439d-b3da-d5467064f6f1\") " pod="openstack/placement-35b4-account-create-update-7rdfs" Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.664132 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bf799d3-e4d4-439d-b3da-d5467064f6f1-operator-scripts\") pod \"placement-35b4-account-create-update-7rdfs\" (UID: \"0bf799d3-e4d4-439d-b3da-d5467064f6f1\") " pod="openstack/placement-35b4-account-create-update-7rdfs" Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.665808 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bf799d3-e4d4-439d-b3da-d5467064f6f1-operator-scripts\") pod \"placement-35b4-account-create-update-7rdfs\" (UID: \"0bf799d3-e4d4-439d-b3da-d5467064f6f1\") " pod="openstack/placement-35b4-account-create-update-7rdfs" Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.683617 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-zzflk" Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.689924 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksxl8\" (UniqueName: \"kubernetes.io/projected/0bf799d3-e4d4-439d-b3da-d5467064f6f1-kube-api-access-ksxl8\") pod \"placement-35b4-account-create-update-7rdfs\" (UID: \"0bf799d3-e4d4-439d-b3da-d5467064f6f1\") " pod="openstack/placement-35b4-account-create-update-7rdfs" Mar 13 14:19:41 crc kubenswrapper[4898]: I0313 14:19:41.700091 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-35b4-account-create-update-7rdfs" Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.036754 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-ppqg7"] Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.132535 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-ppqg7" event={"ID":"bc61df36-ac68-4cf0-9456-140bccb5435c","Type":"ContainerStarted","Data":"1ce06125124c6bb74ef67a1bb1a6e818c24f60300a2a80ac7fc268e205f2e9db"} Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.171447 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d2b7-account-create-update-ggzw8"] Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.295736 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-zzflk"] Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.408211 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-35b4-account-create-update-7rdfs"] Mar 13 14:19:42 crc kubenswrapper[4898]: W0313 14:19:42.424935 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0bf799d3_e4d4_439d_b3da_d5467064f6f1.slice/crio-25447370a3f4e28f8e8ccf8af1d7f9221a24319685b5d3199e4478e305eb1ec0 
WatchSource:0}: Error finding container 25447370a3f4e28f8e8ccf8af1d7f9221a24319685b5d3199e4478e305eb1ec0: Status 404 returned error can't find the container with id 25447370a3f4e28f8e8ccf8af1d7f9221a24319685b5d3199e4478e305eb1ec0 Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.736711 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-n82b8"] Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.738873 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-n82b8" Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.750708 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-55n8q" Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.768613 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-n82b8"] Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.790336 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91512eed-d544-4f70-b8ba-eda9f6b1bfef-operator-scripts\") pod \"91512eed-d544-4f70-b8ba-eda9f6b1bfef\" (UID: \"91512eed-d544-4f70-b8ba-eda9f6b1bfef\") " Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.790496 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8dnp\" (UniqueName: \"kubernetes.io/projected/91512eed-d544-4f70-b8ba-eda9f6b1bfef-kube-api-access-z8dnp\") pod \"91512eed-d544-4f70-b8ba-eda9f6b1bfef\" (UID: \"91512eed-d544-4f70-b8ba-eda9f6b1bfef\") " Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.790911 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrtkm\" (UniqueName: 
\"kubernetes.io/projected/ba5ed93a-91b4-4942-a32c-ab02a536e3d4-kube-api-access-zrtkm\") pod \"mysqld-exporter-openstack-cell1-db-create-n82b8\" (UID: \"ba5ed93a-91b4-4942-a32c-ab02a536e3d4\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-n82b8" Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.790961 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba5ed93a-91b4-4942-a32c-ab02a536e3d4-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-n82b8\" (UID: \"ba5ed93a-91b4-4942-a32c-ab02a536e3d4\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-n82b8" Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.791102 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91512eed-d544-4f70-b8ba-eda9f6b1bfef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "91512eed-d544-4f70-b8ba-eda9f6b1bfef" (UID: "91512eed-d544-4f70-b8ba-eda9f6b1bfef"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.802146 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91512eed-d544-4f70-b8ba-eda9f6b1bfef-kube-api-access-z8dnp" (OuterVolumeSpecName: "kube-api-access-z8dnp") pod "91512eed-d544-4f70-b8ba-eda9f6b1bfef" (UID: "91512eed-d544-4f70-b8ba-eda9f6b1bfef"). InnerVolumeSpecName "kube-api-access-z8dnp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.895268 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrtkm\" (UniqueName: \"kubernetes.io/projected/ba5ed93a-91b4-4942-a32c-ab02a536e3d4-kube-api-access-zrtkm\") pod \"mysqld-exporter-openstack-cell1-db-create-n82b8\" (UID: \"ba5ed93a-91b4-4942-a32c-ab02a536e3d4\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-n82b8" Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.895336 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba5ed93a-91b4-4942-a32c-ab02a536e3d4-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-n82b8\" (UID: \"ba5ed93a-91b4-4942-a32c-ab02a536e3d4\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-n82b8" Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.895457 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91512eed-d544-4f70-b8ba-eda9f6b1bfef-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.895470 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8dnp\" (UniqueName: \"kubernetes.io/projected/91512eed-d544-4f70-b8ba-eda9f6b1bfef-kube-api-access-z8dnp\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.896219 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba5ed93a-91b4-4942-a32c-ab02a536e3d4-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-n82b8\" (UID: \"ba5ed93a-91b4-4942-a32c-ab02a536e3d4\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-n82b8" Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.915793 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrtkm\" (UniqueName: \"kubernetes.io/projected/ba5ed93a-91b4-4942-a32c-ab02a536e3d4-kube-api-access-zrtkm\") pod \"mysqld-exporter-openstack-cell1-db-create-n82b8\" (UID: \"ba5ed93a-91b4-4942-a32c-ab02a536e3d4\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-n82b8" Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.932348 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-ce03-account-create-update-425qr"] Mar 13 14:19:42 crc kubenswrapper[4898]: E0313 14:19:42.937495 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91512eed-d544-4f70-b8ba-eda9f6b1bfef" containerName="mariadb-account-create-update" Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.937528 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="91512eed-d544-4f70-b8ba-eda9f6b1bfef" containerName="mariadb-account-create-update" Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.937886 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="91512eed-d544-4f70-b8ba-eda9f6b1bfef" containerName="mariadb-account-create-update" Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.941210 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-ce03-account-create-update-425qr" Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.945324 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-cell1-db-secret" Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.961357 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-2sp5q" Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.967844 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-ce03-account-create-update-425qr"] Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.998315 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/586ccc66-1989-46e5-98ad-b70c7e88e6bc-operator-scripts\") pod \"mysqld-exporter-ce03-account-create-update-425qr\" (UID: \"586ccc66-1989-46e5-98ad-b70c7e88e6bc\") " pod="openstack/mysqld-exporter-ce03-account-create-update-425qr" Mar 13 14:19:42 crc kubenswrapper[4898]: I0313 14:19:42.998497 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2zbj\" (UniqueName: \"kubernetes.io/projected/586ccc66-1989-46e5-98ad-b70c7e88e6bc-kube-api-access-h2zbj\") pod \"mysqld-exporter-ce03-account-create-update-425qr\" (UID: \"586ccc66-1989-46e5-98ad-b70c7e88e6bc\") " pod="openstack/mysqld-exporter-ce03-account-create-update-425qr" Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.018759 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-n82b8" Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.028023 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-x9tcr"] Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.028388 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr" podUID="00d79476-a8c0-4bad-81ae-6b50afea8601" containerName="dnsmasq-dns" containerID="cri-o://47f7b7782ee0db0b141a3f7ddac8cf1c7a3089fed6ebef010a3382327d07522d" gracePeriod=10 Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.100961 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/586ccc66-1989-46e5-98ad-b70c7e88e6bc-operator-scripts\") pod \"mysqld-exporter-ce03-account-create-update-425qr\" (UID: \"586ccc66-1989-46e5-98ad-b70c7e88e6bc\") " pod="openstack/mysqld-exporter-ce03-account-create-update-425qr" Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.103207 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2zbj\" (UniqueName: \"kubernetes.io/projected/586ccc66-1989-46e5-98ad-b70c7e88e6bc-kube-api-access-h2zbj\") pod \"mysqld-exporter-ce03-account-create-update-425qr\" (UID: \"586ccc66-1989-46e5-98ad-b70c7e88e6bc\") " pod="openstack/mysqld-exporter-ce03-account-create-update-425qr" Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.104671 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/586ccc66-1989-46e5-98ad-b70c7e88e6bc-operator-scripts\") pod \"mysqld-exporter-ce03-account-create-update-425qr\" (UID: \"586ccc66-1989-46e5-98ad-b70c7e88e6bc\") " pod="openstack/mysqld-exporter-ce03-account-create-update-425qr" Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.126570 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2zbj\" (UniqueName: \"kubernetes.io/projected/586ccc66-1989-46e5-98ad-b70c7e88e6bc-kube-api-access-h2zbj\") pod \"mysqld-exporter-ce03-account-create-update-425qr\" (UID: \"586ccc66-1989-46e5-98ad-b70c7e88e6bc\") " pod="openstack/mysqld-exporter-ce03-account-create-update-425qr" Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.165343 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-55n8q" event={"ID":"91512eed-d544-4f70-b8ba-eda9f6b1bfef","Type":"ContainerDied","Data":"fd546e50d07fdb189d91bb5d0791e846219e0ffc737e0423be745421015de34e"} Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.166401 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd546e50d07fdb189d91bb5d0791e846219e0ffc737e0423be745421015de34e" Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.166039 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-55n8q" Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.188016 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-35b4-account-create-update-7rdfs" event={"ID":"0bf799d3-e4d4-439d-b3da-d5467064f6f1","Type":"ContainerStarted","Data":"df7deb06b863e0990f2d81a2c25739f29fd7b5122c0d32c2314963f2204b89f3"} Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.188074 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-35b4-account-create-update-7rdfs" event={"ID":"0bf799d3-e4d4-439d-b3da-d5467064f6f1","Type":"ContainerStarted","Data":"25447370a3f4e28f8e8ccf8af1d7f9221a24319685b5d3199e4478e305eb1ec0"} Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.225062 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-35b4-account-create-update-7rdfs" podStartSLOduration=2.225041311 podStartE2EDuration="2.225041311s" podCreationTimestamp="2026-03-13 14:19:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:19:43.224141048 +0000 UTC m=+1418.225729287" watchObservedRunningTime="2026-03-13 14:19:43.225041311 +0000 UTC m=+1418.226629550" Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.225270 4898 generic.go:334] "Generic (PLEG): container finished" podID="ee084354-4d32-4d3c-96a4-1e4e7eef5d85" containerID="319d11416db34d4c2bde21b35bf9b79fc6c55b22cfe14271a9be5dde11f3c078" exitCode=0 Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.225365 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"ee084354-4d32-4d3c-96a4-1e4e7eef5d85","Type":"ContainerDied","Data":"319d11416db34d4c2bde21b35bf9b79fc6c55b22cfe14271a9be5dde11f3c078"} Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.236640 4898 generic.go:334] "Generic (PLEG): container finished" 
podID="bc61df36-ac68-4cf0-9456-140bccb5435c" containerID="e22a5e1114b923439d9f143c3b64b35dbca324bd3cf616ce49fd0caf5c66d873" exitCode=0 Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.236735 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-ppqg7" event={"ID":"bc61df36-ac68-4cf0-9456-140bccb5435c","Type":"ContainerDied","Data":"e22a5e1114b923439d9f143c3b64b35dbca324bd3cf616ce49fd0caf5c66d873"} Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.244561 4898 generic.go:334] "Generic (PLEG): container finished" podID="32a060a9-dd52-4192-bc48-b9ea7a918458" containerID="592e2145b8382848d105acb5e5275b8dd688df9ab9d3d5caa34a389dd2742086" exitCode=0 Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.244676 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d2b7-account-create-update-ggzw8" event={"ID":"32a060a9-dd52-4192-bc48-b9ea7a918458","Type":"ContainerDied","Data":"592e2145b8382848d105acb5e5275b8dd688df9ab9d3d5caa34a389dd2742086"} Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.244880 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d2b7-account-create-update-ggzw8" event={"ID":"32a060a9-dd52-4192-bc48-b9ea7a918458","Type":"ContainerStarted","Data":"2ae64cef16b6acabaca5f7ce48033aaffd92aad88a639e6b6b3fcc38fb562bda"} Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.252243 4898 generic.go:334] "Generic (PLEG): container finished" podID="818e3f41-30c4-4a49-b490-0d868fc2b2b8" containerID="6ac94c751f27a4d12d02923377c883f4669b7b2f835e8c6d8eb98e37f2b620ef" exitCode=0 Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.252475 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"818e3f41-30c4-4a49-b490-0d868fc2b2b8","Type":"ContainerDied","Data":"6ac94c751f27a4d12d02923377c883f4669b7b2f835e8c6d8eb98e37f2b620ef"} Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.288289 4898 generic.go:334] 
"Generic (PLEG): container finished" podID="d56bd826-4f42-409d-ae41-9bfc70d1e038" containerID="cb002d235371a7e7beebe07dd448307d31c6dae66e8fbd1dd6c0c499e634cca9" exitCode=0 Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.288364 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d56bd826-4f42-409d-ae41-9bfc70d1e038","Type":"ContainerDied","Data":"cb002d235371a7e7beebe07dd448307d31c6dae66e8fbd1dd6c0c499e634cca9"} Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.303179 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-zzflk" event={"ID":"f58c984f-f43f-42dc-90a5-aebbe79a47a5","Type":"ContainerStarted","Data":"f7ee82c0bf1917642e2854f8aafb4c28fc338ae0b377633b5ccd89fd1a0294f4"} Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.303233 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-zzflk" event={"ID":"f58c984f-f43f-42dc-90a5-aebbe79a47a5","Type":"ContainerStarted","Data":"639177af0c66d1dccc94daa2cc6a580d4f6472dd79a9bbc6e239f21d1c8817da"} Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.340424 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-ce03-account-create-update-425qr" Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.538566 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-zzflk" podStartSLOduration=2.538546536 podStartE2EDuration="2.538546536s" podCreationTimestamp="2026-03-13 14:19:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:19:43.443617567 +0000 UTC m=+1418.445205806" watchObservedRunningTime="2026-03-13 14:19:43.538546536 +0000 UTC m=+1418.540134775" Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.609456 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-55n8q"] Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.660405 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-55n8q"] Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.778248 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91512eed-d544-4f70-b8ba-eda9f6b1bfef" path="/var/lib/kubelet/pods/91512eed-d544-4f70-b8ba-eda9f6b1bfef/volumes" Mar 13 14:19:43 crc kubenswrapper[4898]: I0313 14:19:43.830853 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-n82b8"] Mar 13 14:19:43 crc kubenswrapper[4898]: W0313 14:19:43.896454 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba5ed93a_91b4_4942_a32c_ab02a536e3d4.slice/crio-5ca508ecb94469968a2e06dd2dc51d5ad3bb989409259ece9ea2999f68912648 WatchSource:0}: Error finding container 5ca508ecb94469968a2e06dd2dc51d5ad3bb989409259ece9ea2999f68912648: Status 404 returned error can't find the container with id 5ca508ecb94469968a2e06dd2dc51d5ad3bb989409259ece9ea2999f68912648 Mar 13 14:19:43 crc 
kubenswrapper[4898]: I0313 14:19:43.982045 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.152863 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-ovsdbserver-nb\") pod \"00d79476-a8c0-4bad-81ae-6b50afea8601\" (UID: \"00d79476-a8c0-4bad-81ae-6b50afea8601\") " Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.152923 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-dns-svc\") pod \"00d79476-a8c0-4bad-81ae-6b50afea8601\" (UID: \"00d79476-a8c0-4bad-81ae-6b50afea8601\") " Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.153025 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-config\") pod \"00d79476-a8c0-4bad-81ae-6b50afea8601\" (UID: \"00d79476-a8c0-4bad-81ae-6b50afea8601\") " Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.153210 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-ovsdbserver-sb\") pod \"00d79476-a8c0-4bad-81ae-6b50afea8601\" (UID: \"00d79476-a8c0-4bad-81ae-6b50afea8601\") " Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.153243 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dz9f\" (UniqueName: \"kubernetes.io/projected/00d79476-a8c0-4bad-81ae-6b50afea8601-kube-api-access-9dz9f\") pod \"00d79476-a8c0-4bad-81ae-6b50afea8601\" (UID: \"00d79476-a8c0-4bad-81ae-6b50afea8601\") " Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.159472 4898 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00d79476-a8c0-4bad-81ae-6b50afea8601-kube-api-access-9dz9f" (OuterVolumeSpecName: "kube-api-access-9dz9f") pod "00d79476-a8c0-4bad-81ae-6b50afea8601" (UID: "00d79476-a8c0-4bad-81ae-6b50afea8601"). InnerVolumeSpecName "kube-api-access-9dz9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.228790 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "00d79476-a8c0-4bad-81ae-6b50afea8601" (UID: "00d79476-a8c0-4bad-81ae-6b50afea8601"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.234319 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "00d79476-a8c0-4bad-81ae-6b50afea8601" (UID: "00d79476-a8c0-4bad-81ae-6b50afea8601"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.242747 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-config" (OuterVolumeSpecName: "config") pod "00d79476-a8c0-4bad-81ae-6b50afea8601" (UID: "00d79476-a8c0-4bad-81ae-6b50afea8601"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.243863 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "00d79476-a8c0-4bad-81ae-6b50afea8601" (UID: "00d79476-a8c0-4bad-81ae-6b50afea8601"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.255123 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.255157 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.255169 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dz9f\" (UniqueName: \"kubernetes.io/projected/00d79476-a8c0-4bad-81ae-6b50afea8601-kube-api-access-9dz9f\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.255178 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.255187 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00d79476-a8c0-4bad-81ae-6b50afea8601-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.262014 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-ce03-account-create-update-425qr"] Mar 13 
14:19:44 crc kubenswrapper[4898]: W0313 14:19:44.268198 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod586ccc66_1989_46e5_98ad_b70c7e88e6bc.slice/crio-b38fcfa140b37179564ee614932bc1713c8394aa057a914d2d0914aa1268a307 WatchSource:0}: Error finding container b38fcfa140b37179564ee614932bc1713c8394aa057a914d2d0914aa1268a307: Status 404 returned error can't find the container with id b38fcfa140b37179564ee614932bc1713c8394aa057a914d2d0914aa1268a307 Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.316594 4898 generic.go:334] "Generic (PLEG): container finished" podID="0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" containerID="8e18090ad1757c0b15ba6a519121358ec8fea5c9816c6426d3dd165832b431af" exitCode=0 Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.316666 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b","Type":"ContainerDied","Data":"8e18090ad1757c0b15ba6a519121358ec8fea5c9816c6426d3dd165832b431af"} Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.332763 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-ce03-account-create-update-425qr" event={"ID":"586ccc66-1989-46e5-98ad-b70c7e88e6bc","Type":"ContainerStarted","Data":"b38fcfa140b37179564ee614932bc1713c8394aa057a914d2d0914aa1268a307"} Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.336017 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-n82b8" event={"ID":"ba5ed93a-91b4-4942-a32c-ab02a536e3d4","Type":"ContainerStarted","Data":"b9cd470c8d031dbf9ec18b998e1bb21c765853e9617c7315b08f8356dad9258f"} Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.336065 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-n82b8" 
event={"ID":"ba5ed93a-91b4-4942-a32c-ab02a536e3d4","Type":"ContainerStarted","Data":"5ca508ecb94469968a2e06dd2dc51d5ad3bb989409259ece9ea2999f68912648"} Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.349419 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"ee084354-4d32-4d3c-96a4-1e4e7eef5d85","Type":"ContainerStarted","Data":"fdd228971531e06c4cfdc0dd4d0052c10c0646d03035ab33629bba605b7a9d8b"} Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.350278 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-2" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.352955 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d56bd826-4f42-409d-ae41-9bfc70d1e038","Type":"ContainerStarted","Data":"d377b62f42012aae1789077dde2b4c09f8f770f73f941f01fe11eb21f5b88378"} Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.353611 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.357822 4898 generic.go:334] "Generic (PLEG): container finished" podID="f58c984f-f43f-42dc-90a5-aebbe79a47a5" containerID="f7ee82c0bf1917642e2854f8aafb4c28fc338ae0b377633b5ccd89fd1a0294f4" exitCode=0 Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.357876 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-zzflk" event={"ID":"f58c984f-f43f-42dc-90a5-aebbe79a47a5","Type":"ContainerDied","Data":"f7ee82c0bf1917642e2854f8aafb4c28fc338ae0b377633b5ccd89fd1a0294f4"} Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.372688 4898 generic.go:334] "Generic (PLEG): container finished" podID="00d79476-a8c0-4bad-81ae-6b50afea8601" containerID="47f7b7782ee0db0b141a3f7ddac8cf1c7a3089fed6ebef010a3382327d07522d" exitCode=0 Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.372798 4898 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr" event={"ID":"00d79476-a8c0-4bad-81ae-6b50afea8601","Type":"ContainerDied","Data":"47f7b7782ee0db0b141a3f7ddac8cf1c7a3089fed6ebef010a3382327d07522d"} Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.372830 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr" event={"ID":"00d79476-a8c0-4bad-81ae-6b50afea8601","Type":"ContainerDied","Data":"feb144c4ab58e7daafcb4fa68f1d3e330823c649be75d07c3fb9eb354883c85d"} Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.372846 4898 scope.go:117] "RemoveContainer" containerID="47f7b7782ee0db0b141a3f7ddac8cf1c7a3089fed6ebef010a3382327d07522d" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.373005 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-x9tcr" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.389410 4898 generic.go:334] "Generic (PLEG): container finished" podID="0bf799d3-e4d4-439d-b3da-d5467064f6f1" containerID="df7deb06b863e0990f2d81a2c25739f29fd7b5122c0d32c2314963f2204b89f3" exitCode=0 Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.389493 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-35b4-account-create-update-7rdfs" event={"ID":"0bf799d3-e4d4-439d-b3da-d5467064f6f1","Type":"ContainerDied","Data":"df7deb06b863e0990f2d81a2c25739f29fd7b5122c0d32c2314963f2204b89f3"} Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.410358 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-openstack-cell1-db-create-n82b8" podStartSLOduration=2.410333421 podStartE2EDuration="2.410333421s" podCreationTimestamp="2026-03-13 14:19:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:19:44.404311484 +0000 UTC 
m=+1419.405899723" watchObservedRunningTime="2026-03-13 14:19:44.410333421 +0000 UTC m=+1419.411921660" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.421081 4898 scope.go:117] "RemoveContainer" containerID="994a2410c9adb8d8fd6c4f99ea789f0c24f7ab61b1131418ac256a7ab707b9be" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.437340 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"818e3f41-30c4-4a49-b490-0d868fc2b2b8","Type":"ContainerStarted","Data":"122b0ca68068adfd3963153faa26d42ce9ae7ae836229a17a2096dab37be0af4"} Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.437533 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-1" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.476488 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=43.112626882 podStartE2EDuration="59.476465541s" podCreationTimestamp="2026-03-13 14:18:45 +0000 UTC" firstStartedPulling="2026-03-13 14:18:52.691946578 +0000 UTC m=+1367.693534817" lastFinishedPulling="2026-03-13 14:19:09.055785237 +0000 UTC m=+1384.057373476" observedRunningTime="2026-03-13 14:19:44.461447971 +0000 UTC m=+1419.463036220" watchObservedRunningTime="2026-03-13 14:19:44.476465541 +0000 UTC m=+1419.478053780" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.523502 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-2" podStartSLOduration=43.269996896 podStartE2EDuration="59.523484694s" podCreationTimestamp="2026-03-13 14:18:45 +0000 UTC" firstStartedPulling="2026-03-13 14:18:52.691355543 +0000 UTC m=+1367.692943782" lastFinishedPulling="2026-03-13 14:19:08.944843331 +0000 UTC m=+1383.946431580" observedRunningTime="2026-03-13 14:19:44.512434177 +0000 UTC m=+1419.514022426" watchObservedRunningTime="2026-03-13 14:19:44.523484694 +0000 UTC m=+1419.525072933" Mar 13 
14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.525987 4898 scope.go:117] "RemoveContainer" containerID="47f7b7782ee0db0b141a3f7ddac8cf1c7a3089fed6ebef010a3382327d07522d" Mar 13 14:19:44 crc kubenswrapper[4898]: E0313 14:19:44.527129 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47f7b7782ee0db0b141a3f7ddac8cf1c7a3089fed6ebef010a3382327d07522d\": container with ID starting with 47f7b7782ee0db0b141a3f7ddac8cf1c7a3089fed6ebef010a3382327d07522d not found: ID does not exist" containerID="47f7b7782ee0db0b141a3f7ddac8cf1c7a3089fed6ebef010a3382327d07522d" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.527162 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47f7b7782ee0db0b141a3f7ddac8cf1c7a3089fed6ebef010a3382327d07522d"} err="failed to get container status \"47f7b7782ee0db0b141a3f7ddac8cf1c7a3089fed6ebef010a3382327d07522d\": rpc error: code = NotFound desc = could not find container \"47f7b7782ee0db0b141a3f7ddac8cf1c7a3089fed6ebef010a3382327d07522d\": container with ID starting with 47f7b7782ee0db0b141a3f7ddac8cf1c7a3089fed6ebef010a3382327d07522d not found: ID does not exist" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.527186 4898 scope.go:117] "RemoveContainer" containerID="994a2410c9adb8d8fd6c4f99ea789f0c24f7ab61b1131418ac256a7ab707b9be" Mar 13 14:19:44 crc kubenswrapper[4898]: E0313 14:19:44.528059 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"994a2410c9adb8d8fd6c4f99ea789f0c24f7ab61b1131418ac256a7ab707b9be\": container with ID starting with 994a2410c9adb8d8fd6c4f99ea789f0c24f7ab61b1131418ac256a7ab707b9be not found: ID does not exist" containerID="994a2410c9adb8d8fd6c4f99ea789f0c24f7ab61b1131418ac256a7ab707b9be" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.528091 4898 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"994a2410c9adb8d8fd6c4f99ea789f0c24f7ab61b1131418ac256a7ab707b9be"} err="failed to get container status \"994a2410c9adb8d8fd6c4f99ea789f0c24f7ab61b1131418ac256a7ab707b9be\": rpc error: code = NotFound desc = could not find container \"994a2410c9adb8d8fd6c4f99ea789f0c24f7ab61b1131418ac256a7ab707b9be\": container with ID starting with 994a2410c9adb8d8fd6c4f99ea789f0c24f7ab61b1131418ac256a7ab707b9be not found: ID does not exist" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.662338 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-x9tcr"] Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.706640 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-x9tcr"] Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.712060 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-1" podStartSLOduration=43.431047135 podStartE2EDuration="59.712032508s" podCreationTimestamp="2026-03-13 14:18:45 +0000 UTC" firstStartedPulling="2026-03-13 14:18:52.743451858 +0000 UTC m=+1367.745040097" lastFinishedPulling="2026-03-13 14:19:09.024437231 +0000 UTC m=+1384.026025470" observedRunningTime="2026-03-13 14:19:44.622735036 +0000 UTC m=+1419.624323295" watchObservedRunningTime="2026-03-13 14:19:44.712032508 +0000 UTC m=+1419.713620757" Mar 13 14:19:44 crc kubenswrapper[4898]: I0313 14:19:44.896547 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.129714 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d2b7-account-create-update-ggzw8" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.146465 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-ppqg7" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.206125 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc61df36-ac68-4cf0-9456-140bccb5435c-operator-scripts\") pod \"bc61df36-ac68-4cf0-9456-140bccb5435c\" (UID: \"bc61df36-ac68-4cf0-9456-140bccb5435c\") " Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.206478 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrszf\" (UniqueName: \"kubernetes.io/projected/32a060a9-dd52-4192-bc48-b9ea7a918458-kube-api-access-wrszf\") pod \"32a060a9-dd52-4192-bc48-b9ea7a918458\" (UID: \"32a060a9-dd52-4192-bc48-b9ea7a918458\") " Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.206512 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32a060a9-dd52-4192-bc48-b9ea7a918458-operator-scripts\") pod \"32a060a9-dd52-4192-bc48-b9ea7a918458\" (UID: \"32a060a9-dd52-4192-bc48-b9ea7a918458\") " Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.206551 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vlfz\" (UniqueName: \"kubernetes.io/projected/bc61df36-ac68-4cf0-9456-140bccb5435c-kube-api-access-5vlfz\") pod \"bc61df36-ac68-4cf0-9456-140bccb5435c\" (UID: \"bc61df36-ac68-4cf0-9456-140bccb5435c\") " Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.206863 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc61df36-ac68-4cf0-9456-140bccb5435c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bc61df36-ac68-4cf0-9456-140bccb5435c" (UID: "bc61df36-ac68-4cf0-9456-140bccb5435c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.207211 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32a060a9-dd52-4192-bc48-b9ea7a918458-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "32a060a9-dd52-4192-bc48-b9ea7a918458" (UID: "32a060a9-dd52-4192-bc48-b9ea7a918458"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.207324 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bc61df36-ac68-4cf0-9456-140bccb5435c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.213716 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32a060a9-dd52-4192-bc48-b9ea7a918458-kube-api-access-wrszf" (OuterVolumeSpecName: "kube-api-access-wrszf") pod "32a060a9-dd52-4192-bc48-b9ea7a918458" (UID: "32a060a9-dd52-4192-bc48-b9ea7a918458"). InnerVolumeSpecName "kube-api-access-wrszf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.216308 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc61df36-ac68-4cf0-9456-140bccb5435c-kube-api-access-5vlfz" (OuterVolumeSpecName: "kube-api-access-5vlfz") pod "bc61df36-ac68-4cf0-9456-140bccb5435c" (UID: "bc61df36-ac68-4cf0-9456-140bccb5435c"). InnerVolumeSpecName "kube-api-access-5vlfz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.309799 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrszf\" (UniqueName: \"kubernetes.io/projected/32a060a9-dd52-4192-bc48-b9ea7a918458-kube-api-access-wrszf\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.309834 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32a060a9-dd52-4192-bc48-b9ea7a918458-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.309844 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vlfz\" (UniqueName: \"kubernetes.io/projected/bc61df36-ac68-4cf0-9456-140bccb5435c-kube-api-access-5vlfz\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.448949 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-ppqg7" event={"ID":"bc61df36-ac68-4cf0-9456-140bccb5435c","Type":"ContainerDied","Data":"1ce06125124c6bb74ef67a1bb1a6e818c24f60300a2a80ac7fc268e205f2e9db"} Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.448988 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ce06125124c6bb74ef67a1bb1a6e818c24f60300a2a80ac7fc268e205f2e9db" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.449036 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-ppqg7" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.454348 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-d2b7-account-create-update-ggzw8" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.454369 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d2b7-account-create-update-ggzw8" event={"ID":"32a060a9-dd52-4192-bc48-b9ea7a918458","Type":"ContainerDied","Data":"2ae64cef16b6acabaca5f7ce48033aaffd92aad88a639e6b6b3fcc38fb562bda"} Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.454406 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ae64cef16b6acabaca5f7ce48033aaffd92aad88a639e6b6b3fcc38fb562bda" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.458438 4898 generic.go:334] "Generic (PLEG): container finished" podID="586ccc66-1989-46e5-98ad-b70c7e88e6bc" containerID="66d662834083b3a8826084dc54618cf384de1f9336d6d06012f43689e8e15545" exitCode=0 Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.458492 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-ce03-account-create-update-425qr" event={"ID":"586ccc66-1989-46e5-98ad-b70c7e88e6bc","Type":"ContainerDied","Data":"66d662834083b3a8826084dc54618cf384de1f9336d6d06012f43689e8e15545"} Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.464540 4898 generic.go:334] "Generic (PLEG): container finished" podID="ba5ed93a-91b4-4942-a32c-ab02a536e3d4" containerID="b9cd470c8d031dbf9ec18b998e1bb21c765853e9617c7315b08f8356dad9258f" exitCode=0 Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.464644 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-n82b8" event={"ID":"ba5ed93a-91b4-4942-a32c-ab02a536e3d4","Type":"ContainerDied","Data":"b9cd470c8d031dbf9ec18b998e1bb21c765853e9617c7315b08f8356dad9258f"} Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.471000 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b","Type":"ContainerStarted","Data":"1483af6a8a31cc8d457a629a4f59e975d8c4af4c9fe53636f469b1c72aabc655"} Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.555737 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=44.194072891 podStartE2EDuration="1m0.555718863s" podCreationTimestamp="2026-03-13 14:18:45 +0000 UTC" firstStartedPulling="2026-03-13 14:18:52.67049546 +0000 UTC m=+1367.672083699" lastFinishedPulling="2026-03-13 14:19:09.032141432 +0000 UTC m=+1384.033729671" observedRunningTime="2026-03-13 14:19:45.544414459 +0000 UTC m=+1420.546002698" watchObservedRunningTime="2026-03-13 14:19:45.555718863 +0000 UTC m=+1420.557307102" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.777722 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00d79476-a8c0-4bad-81ae-6b50afea8601" path="/var/lib/kubelet/pods/00d79476-a8c0-4bad-81ae-6b50afea8601/volumes" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.790476 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-x75zk"] Mar 13 14:19:45 crc kubenswrapper[4898]: E0313 14:19:45.790977 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32a060a9-dd52-4192-bc48-b9ea7a918458" containerName="mariadb-account-create-update" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.790993 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="32a060a9-dd52-4192-bc48-b9ea7a918458" containerName="mariadb-account-create-update" Mar 13 14:19:45 crc kubenswrapper[4898]: E0313 14:19:45.791002 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc61df36-ac68-4cf0-9456-140bccb5435c" containerName="mariadb-database-create" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.791008 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc61df36-ac68-4cf0-9456-140bccb5435c" 
containerName="mariadb-database-create" Mar 13 14:19:45 crc kubenswrapper[4898]: E0313 14:19:45.791027 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00d79476-a8c0-4bad-81ae-6b50afea8601" containerName="dnsmasq-dns" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.791033 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="00d79476-a8c0-4bad-81ae-6b50afea8601" containerName="dnsmasq-dns" Mar 13 14:19:45 crc kubenswrapper[4898]: E0313 14:19:45.791049 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00d79476-a8c0-4bad-81ae-6b50afea8601" containerName="init" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.791055 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="00d79476-a8c0-4bad-81ae-6b50afea8601" containerName="init" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.791254 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc61df36-ac68-4cf0-9456-140bccb5435c" containerName="mariadb-database-create" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.791273 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="00d79476-a8c0-4bad-81ae-6b50afea8601" containerName="dnsmasq-dns" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.791284 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="32a060a9-dd52-4192-bc48-b9ea7a918458" containerName="mariadb-account-create-update" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.791995 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-x75zk" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.795226 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.795414 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-t4f8x" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.806441 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-x75zk"] Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.927138 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-db-sync-config-data\") pod \"glance-db-sync-x75zk\" (UID: \"74d4aeca-15ec-4f63-87e0-20daa6f3e70f\") " pod="openstack/glance-db-sync-x75zk" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.927218 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-config-data\") pod \"glance-db-sync-x75zk\" (UID: \"74d4aeca-15ec-4f63-87e0-20daa6f3e70f\") " pod="openstack/glance-db-sync-x75zk" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.927351 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-combined-ca-bundle\") pod \"glance-db-sync-x75zk\" (UID: \"74d4aeca-15ec-4f63-87e0-20daa6f3e70f\") " pod="openstack/glance-db-sync-x75zk" Mar 13 14:19:45 crc kubenswrapper[4898]: I0313 14:19:45.927410 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rvbp\" (UniqueName: 
\"kubernetes.io/projected/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-kube-api-access-8rvbp\") pod \"glance-db-sync-x75zk\" (UID: \"74d4aeca-15ec-4f63-87e0-20daa6f3e70f\") " pod="openstack/glance-db-sync-x75zk" Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.029116 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-combined-ca-bundle\") pod \"glance-db-sync-x75zk\" (UID: \"74d4aeca-15ec-4f63-87e0-20daa6f3e70f\") " pod="openstack/glance-db-sync-x75zk" Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.029470 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rvbp\" (UniqueName: \"kubernetes.io/projected/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-kube-api-access-8rvbp\") pod \"glance-db-sync-x75zk\" (UID: \"74d4aeca-15ec-4f63-87e0-20daa6f3e70f\") " pod="openstack/glance-db-sync-x75zk" Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.029601 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-db-sync-config-data\") pod \"glance-db-sync-x75zk\" (UID: \"74d4aeca-15ec-4f63-87e0-20daa6f3e70f\") " pod="openstack/glance-db-sync-x75zk" Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.029631 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-config-data\") pod \"glance-db-sync-x75zk\" (UID: \"74d4aeca-15ec-4f63-87e0-20daa6f3e70f\") " pod="openstack/glance-db-sync-x75zk" Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.034483 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-config-data\") pod \"glance-db-sync-x75zk\" (UID: 
\"74d4aeca-15ec-4f63-87e0-20daa6f3e70f\") " pod="openstack/glance-db-sync-x75zk" Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.036047 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-combined-ca-bundle\") pod \"glance-db-sync-x75zk\" (UID: \"74d4aeca-15ec-4f63-87e0-20daa6f3e70f\") " pod="openstack/glance-db-sync-x75zk" Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.046510 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-db-sync-config-data\") pod \"glance-db-sync-x75zk\" (UID: \"74d4aeca-15ec-4f63-87e0-20daa6f3e70f\") " pod="openstack/glance-db-sync-x75zk" Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.160586 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rvbp\" (UniqueName: \"kubernetes.io/projected/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-kube-api-access-8rvbp\") pod \"glance-db-sync-x75zk\" (UID: \"74d4aeca-15ec-4f63-87e0-20daa6f3e70f\") " pod="openstack/glance-db-sync-x75zk" Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.279598 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-35b4-account-create-update-7rdfs" Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.412531 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-x75zk" Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.438081 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksxl8\" (UniqueName: \"kubernetes.io/projected/0bf799d3-e4d4-439d-b3da-d5467064f6f1-kube-api-access-ksxl8\") pod \"0bf799d3-e4d4-439d-b3da-d5467064f6f1\" (UID: \"0bf799d3-e4d4-439d-b3da-d5467064f6f1\") " Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.438332 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bf799d3-e4d4-439d-b3da-d5467064f6f1-operator-scripts\") pod \"0bf799d3-e4d4-439d-b3da-d5467064f6f1\" (UID: \"0bf799d3-e4d4-439d-b3da-d5467064f6f1\") " Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.438927 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bf799d3-e4d4-439d-b3da-d5467064f6f1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0bf799d3-e4d4-439d-b3da-d5467064f6f1" (UID: "0bf799d3-e4d4-439d-b3da-d5467064f6f1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.442432 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bf799d3-e4d4-439d-b3da-d5467064f6f1-kube-api-access-ksxl8" (OuterVolumeSpecName: "kube-api-access-ksxl8") pod "0bf799d3-e4d4-439d-b3da-d5467064f6f1" (UID: "0bf799d3-e4d4-439d-b3da-d5467064f6f1"). InnerVolumeSpecName "kube-api-access-ksxl8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.499557 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-35b4-account-create-update-7rdfs" Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.499752 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-35b4-account-create-update-7rdfs" event={"ID":"0bf799d3-e4d4-439d-b3da-d5467064f6f1","Type":"ContainerDied","Data":"25447370a3f4e28f8e8ccf8af1d7f9221a24319685b5d3199e4478e305eb1ec0"} Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.500065 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25447370a3f4e28f8e8ccf8af1d7f9221a24319685b5d3199e4478e305eb1ec0" Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.540270 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bf799d3-e4d4-439d-b3da-d5467064f6f1-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.540494 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksxl8\" (UniqueName: \"kubernetes.io/projected/0bf799d3-e4d4-439d-b3da-d5467064f6f1-kube-api-access-ksxl8\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.708379 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-zzflk" Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.853820 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f58c984f-f43f-42dc-90a5-aebbe79a47a5-operator-scripts\") pod \"f58c984f-f43f-42dc-90a5-aebbe79a47a5\" (UID: \"f58c984f-f43f-42dc-90a5-aebbe79a47a5\") " Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.854322 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f58c984f-f43f-42dc-90a5-aebbe79a47a5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f58c984f-f43f-42dc-90a5-aebbe79a47a5" (UID: "f58c984f-f43f-42dc-90a5-aebbe79a47a5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.854467 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jv5f\" (UniqueName: \"kubernetes.io/projected/f58c984f-f43f-42dc-90a5-aebbe79a47a5-kube-api-access-9jv5f\") pod \"f58c984f-f43f-42dc-90a5-aebbe79a47a5\" (UID: \"f58c984f-f43f-42dc-90a5-aebbe79a47a5\") " Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.856939 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f58c984f-f43f-42dc-90a5-aebbe79a47a5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.858562 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f58c984f-f43f-42dc-90a5-aebbe79a47a5-kube-api-access-9jv5f" (OuterVolumeSpecName: "kube-api-access-9jv5f") pod "f58c984f-f43f-42dc-90a5-aebbe79a47a5" (UID: "f58c984f-f43f-42dc-90a5-aebbe79a47a5"). InnerVolumeSpecName "kube-api-access-9jv5f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:19:46 crc kubenswrapper[4898]: I0313 14:19:46.959429 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jv5f\" (UniqueName: \"kubernetes.io/projected/f58c984f-f43f-42dc-90a5-aebbe79a47a5-kube-api-access-9jv5f\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.063748 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-kwhth"] Mar 13 14:19:47 crc kubenswrapper[4898]: E0313 14:19:47.064435 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bf799d3-e4d4-439d-b3da-d5467064f6f1" containerName="mariadb-account-create-update" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.064448 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bf799d3-e4d4-439d-b3da-d5467064f6f1" containerName="mariadb-account-create-update" Mar 13 14:19:47 crc kubenswrapper[4898]: E0313 14:19:47.064460 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f58c984f-f43f-42dc-90a5-aebbe79a47a5" containerName="mariadb-database-create" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.064465 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f58c984f-f43f-42dc-90a5-aebbe79a47a5" containerName="mariadb-database-create" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.064659 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bf799d3-e4d4-439d-b3da-d5467064f6f1" containerName="mariadb-account-create-update" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.064680 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f58c984f-f43f-42dc-90a5-aebbe79a47a5" containerName="mariadb-database-create" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.065396 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-kwhth" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.070271 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.084409 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-kwhth"] Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.101111 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-ce03-account-create-update-425qr" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.122624 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-n82b8" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.129496 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.165307 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2tg7\" (UniqueName: \"kubernetes.io/projected/f555bcf8-c516-44eb-aaf3-446734ea39c2-kube-api-access-q2tg7\") pod \"root-account-create-update-kwhth\" (UID: \"f555bcf8-c516-44eb-aaf3-446734ea39c2\") " pod="openstack/root-account-create-update-kwhth" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.165353 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f555bcf8-c516-44eb-aaf3-446734ea39c2-operator-scripts\") pod \"root-account-create-update-kwhth\" (UID: \"f555bcf8-c516-44eb-aaf3-446734ea39c2\") " pod="openstack/root-account-create-update-kwhth" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.266981 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-h2zbj\" (UniqueName: \"kubernetes.io/projected/586ccc66-1989-46e5-98ad-b70c7e88e6bc-kube-api-access-h2zbj\") pod \"586ccc66-1989-46e5-98ad-b70c7e88e6bc\" (UID: \"586ccc66-1989-46e5-98ad-b70c7e88e6bc\") " Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.267032 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrtkm\" (UniqueName: \"kubernetes.io/projected/ba5ed93a-91b4-4942-a32c-ab02a536e3d4-kube-api-access-zrtkm\") pod \"ba5ed93a-91b4-4942-a32c-ab02a536e3d4\" (UID: \"ba5ed93a-91b4-4942-a32c-ab02a536e3d4\") " Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.267173 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba5ed93a-91b4-4942-a32c-ab02a536e3d4-operator-scripts\") pod \"ba5ed93a-91b4-4942-a32c-ab02a536e3d4\" (UID: \"ba5ed93a-91b4-4942-a32c-ab02a536e3d4\") " Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.267290 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/586ccc66-1989-46e5-98ad-b70c7e88e6bc-operator-scripts\") pod \"586ccc66-1989-46e5-98ad-b70c7e88e6bc\" (UID: \"586ccc66-1989-46e5-98ad-b70c7e88e6bc\") " Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.268009 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/586ccc66-1989-46e5-98ad-b70c7e88e6bc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "586ccc66-1989-46e5-98ad-b70c7e88e6bc" (UID: "586ccc66-1989-46e5-98ad-b70c7e88e6bc"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.268080 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba5ed93a-91b4-4942-a32c-ab02a536e3d4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ba5ed93a-91b4-4942-a32c-ab02a536e3d4" (UID: "ba5ed93a-91b4-4942-a32c-ab02a536e3d4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.268477 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2tg7\" (UniqueName: \"kubernetes.io/projected/f555bcf8-c516-44eb-aaf3-446734ea39c2-kube-api-access-q2tg7\") pod \"root-account-create-update-kwhth\" (UID: \"f555bcf8-c516-44eb-aaf3-446734ea39c2\") " pod="openstack/root-account-create-update-kwhth" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.268881 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f555bcf8-c516-44eb-aaf3-446734ea39c2-operator-scripts\") pod \"root-account-create-update-kwhth\" (UID: \"f555bcf8-c516-44eb-aaf3-446734ea39c2\") " pod="openstack/root-account-create-update-kwhth" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.269403 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f555bcf8-c516-44eb-aaf3-446734ea39c2-operator-scripts\") pod \"root-account-create-update-kwhth\" (UID: \"f555bcf8-c516-44eb-aaf3-446734ea39c2\") " pod="openstack/root-account-create-update-kwhth" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.269418 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba5ed93a-91b4-4942-a32c-ab02a536e3d4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:47 
crc kubenswrapper[4898]: I0313 14:19:47.269821 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/586ccc66-1989-46e5-98ad-b70c7e88e6bc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.270368 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/586ccc66-1989-46e5-98ad-b70c7e88e6bc-kube-api-access-h2zbj" (OuterVolumeSpecName: "kube-api-access-h2zbj") pod "586ccc66-1989-46e5-98ad-b70c7e88e6bc" (UID: "586ccc66-1989-46e5-98ad-b70c7e88e6bc"). InnerVolumeSpecName "kube-api-access-h2zbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.272219 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba5ed93a-91b4-4942-a32c-ab02a536e3d4-kube-api-access-zrtkm" (OuterVolumeSpecName: "kube-api-access-zrtkm") pod "ba5ed93a-91b4-4942-a32c-ab02a536e3d4" (UID: "ba5ed93a-91b4-4942-a32c-ab02a536e3d4"). InnerVolumeSpecName "kube-api-access-zrtkm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.287747 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2tg7\" (UniqueName: \"kubernetes.io/projected/f555bcf8-c516-44eb-aaf3-446734ea39c2-kube-api-access-q2tg7\") pod \"root-account-create-update-kwhth\" (UID: \"f555bcf8-c516-44eb-aaf3-446734ea39c2\") " pod="openstack/root-account-create-update-kwhth" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.371365 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2zbj\" (UniqueName: \"kubernetes.io/projected/586ccc66-1989-46e5-98ad-b70c7e88e6bc-kube-api-access-h2zbj\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.371398 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrtkm\" (UniqueName: \"kubernetes.io/projected/ba5ed93a-91b4-4942-a32c-ab02a536e3d4-kube-api-access-zrtkm\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.397554 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-kwhth" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.517928 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-x75zk"] Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.522272 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-zzflk" event={"ID":"f58c984f-f43f-42dc-90a5-aebbe79a47a5","Type":"ContainerDied","Data":"639177af0c66d1dccc94daa2cc6a580d4f6472dd79a9bbc6e239f21d1c8817da"} Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.522318 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-zzflk" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.522334 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="639177af0c66d1dccc94daa2cc6a580d4f6472dd79a9bbc6e239f21d1c8817da" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.529373 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-ce03-account-create-update-425qr" event={"ID":"586ccc66-1989-46e5-98ad-b70c7e88e6bc","Type":"ContainerDied","Data":"b38fcfa140b37179564ee614932bc1713c8394aa057a914d2d0914aa1268a307"} Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.529407 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b38fcfa140b37179564ee614932bc1713c8394aa057a914d2d0914aa1268a307" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.529520 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-ce03-account-create-update-425qr" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.542542 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-n82b8" event={"ID":"ba5ed93a-91b4-4942-a32c-ab02a536e3d4","Type":"ContainerDied","Data":"5ca508ecb94469968a2e06dd2dc51d5ad3bb989409259ece9ea2999f68912648"} Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.542650 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ca508ecb94469968a2e06dd2dc51d5ad3bb989409259ece9ea2999f68912648" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.542770 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-n82b8" Mar 13 14:19:47 crc kubenswrapper[4898]: I0313 14:19:47.959819 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-kwhth"] Mar 13 14:19:48 crc kubenswrapper[4898]: I0313 14:19:48.557995 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-kwhth" event={"ID":"f555bcf8-c516-44eb-aaf3-446734ea39c2","Type":"ContainerStarted","Data":"ae2833d21b214266cf912f562e6ce0013ae2d27327887e2c914fb19e35f69b2e"} Mar 13 14:19:48 crc kubenswrapper[4898]: I0313 14:19:48.558344 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-kwhth" event={"ID":"f555bcf8-c516-44eb-aaf3-446734ea39c2","Type":"ContainerStarted","Data":"44806f086dd8f53a481f443ac6d3d5090de65c2b1eec8674ee6ce1b1966c8da0"} Mar 13 14:19:48 crc kubenswrapper[4898]: I0313 14:19:48.563535 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-x75zk" event={"ID":"74d4aeca-15ec-4f63-87e0-20daa6f3e70f","Type":"ContainerStarted","Data":"cfd74a4d5e0c74c0810797d2427e3c0958d75483bdf99da9d6e8e1c6bac07623"} Mar 13 14:19:48 crc kubenswrapper[4898]: I0313 14:19:48.581983 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-kwhth" podStartSLOduration=1.581965044 podStartE2EDuration="1.581965044s" podCreationTimestamp="2026-03-13 14:19:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:19:48.57410522 +0000 UTC m=+1423.575693459" watchObservedRunningTime="2026-03-13 14:19:48.581965044 +0000 UTC m=+1423.583553283" Mar 13 14:19:48 crc kubenswrapper[4898]: I0313 14:19:48.926849 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-6ddbb5776b-mx8sz" 
podUID="8f6fd2de-efa6-4d17-aa5e-4f44ced1f822" containerName="console" containerID="cri-o://5fefdf0ffc03648b76b936160e7ddb5fac4056b104b304ca9509ff6217a2c4fc" gracePeriod=15 Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.619136 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-etc-swift\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:19:49 crc kubenswrapper[4898]: E0313 14:19:49.619578 4898 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 14:19:49 crc kubenswrapper[4898]: E0313 14:19:49.619592 4898 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 14:19:49 crc kubenswrapper[4898]: E0313 14:19:49.619638 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-etc-swift podName:794bd82b-e289-4b31-b0cf-f1285452e783 nodeName:}" failed. No retries permitted until 2026-03-13 14:20:05.619621624 +0000 UTC m=+1440.621209863 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-etc-swift") pod "swift-storage-0" (UID: "794bd82b-e289-4b31-b0cf-f1285452e783") : configmap "swift-ring-files" not found Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.677972 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6ddbb5776b-mx8sz_8f6fd2de-efa6-4d17-aa5e-4f44ced1f822/console/0.log" Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.678042 4898 generic.go:334] "Generic (PLEG): container finished" podID="8f6fd2de-efa6-4d17-aa5e-4f44ced1f822" containerID="5fefdf0ffc03648b76b936160e7ddb5fac4056b104b304ca9509ff6217a2c4fc" exitCode=2 Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.678460 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6ddbb5776b-mx8sz" event={"ID":"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822","Type":"ContainerDied","Data":"5fefdf0ffc03648b76b936160e7ddb5fac4056b104b304ca9509ff6217a2c4fc"} Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.770339 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6ddbb5776b-mx8sz_8f6fd2de-efa6-4d17-aa5e-4f44ced1f822/console/0.log" Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.770443 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6ddbb5776b-mx8sz" Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.928701 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-trusted-ca-bundle\") pod \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.928849 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98jnr\" (UniqueName: \"kubernetes.io/projected/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-kube-api-access-98jnr\") pod \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.928872 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-service-ca\") pod \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.928917 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-console-oauth-config\") pod \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.928961 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-console-config\") pod \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.928994 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-console-serving-cert\") pod \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.929467 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "8f6fd2de-efa6-4d17-aa5e-4f44ced1f822" (UID: "8f6fd2de-efa6-4d17-aa5e-4f44ced1f822"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.929526 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-service-ca" (OuterVolumeSpecName: "service-ca") pod "8f6fd2de-efa6-4d17-aa5e-4f44ced1f822" (UID: "8f6fd2de-efa6-4d17-aa5e-4f44ced1f822"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.929839 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-console-config" (OuterVolumeSpecName: "console-config") pod "8f6fd2de-efa6-4d17-aa5e-4f44ced1f822" (UID: "8f6fd2de-efa6-4d17-aa5e-4f44ced1f822"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.929992 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-oauth-serving-cert\") pod \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\" (UID: \"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822\") " Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.930424 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "8f6fd2de-efa6-4d17-aa5e-4f44ced1f822" (UID: "8f6fd2de-efa6-4d17-aa5e-4f44ced1f822"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.930510 4898 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.930534 4898 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.930603 4898 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.930615 4898 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-console-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 
14:19:49.934989 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-kube-api-access-98jnr" (OuterVolumeSpecName: "kube-api-access-98jnr") pod "8f6fd2de-efa6-4d17-aa5e-4f44ced1f822" (UID: "8f6fd2de-efa6-4d17-aa5e-4f44ced1f822"). InnerVolumeSpecName "kube-api-access-98jnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.935321 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "8f6fd2de-efa6-4d17-aa5e-4f44ced1f822" (UID: "8f6fd2de-efa6-4d17-aa5e-4f44ced1f822"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:19:49 crc kubenswrapper[4898]: I0313 14:19:49.948070 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "8f6fd2de-efa6-4d17-aa5e-4f44ced1f822" (UID: "8f6fd2de-efa6-4d17-aa5e-4f44ced1f822"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:19:50 crc kubenswrapper[4898]: I0313 14:19:50.032193 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98jnr\" (UniqueName: \"kubernetes.io/projected/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-kube-api-access-98jnr\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:50 crc kubenswrapper[4898]: I0313 14:19:50.032234 4898 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:50 crc kubenswrapper[4898]: I0313 14:19:50.032248 4898 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:50 crc kubenswrapper[4898]: I0313 14:19:50.233627 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-j79bj" podUID="a506ef1a-354a-49c8-b63d-4db4b9ecdcfe" containerName="ovn-controller" probeResult="failure" output=< Mar 13 14:19:50 crc kubenswrapper[4898]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 13 14:19:50 crc kubenswrapper[4898]: > Mar 13 14:19:50 crc kubenswrapper[4898]: I0313 14:19:50.313303 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:19:50 crc kubenswrapper[4898]: I0313 14:19:50.691716 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6ddbb5776b-mx8sz_8f6fd2de-efa6-4d17-aa5e-4f44ced1f822/console/0.log" Mar 13 14:19:50 crc kubenswrapper[4898]: I0313 14:19:50.691852 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6ddbb5776b-mx8sz" 
event={"ID":"8f6fd2de-efa6-4d17-aa5e-4f44ced1f822","Type":"ContainerDied","Data":"25f0f90d86df62bf31d201c4dfd7dca1ef0e3998bd4fd076756d1a9449231afa"} Mar 13 14:19:50 crc kubenswrapper[4898]: I0313 14:19:50.691922 4898 scope.go:117] "RemoveContainer" containerID="5fefdf0ffc03648b76b936160e7ddb5fac4056b104b304ca9509ff6217a2c4fc" Mar 13 14:19:50 crc kubenswrapper[4898]: I0313 14:19:50.692129 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6ddbb5776b-mx8sz" Mar 13 14:19:50 crc kubenswrapper[4898]: I0313 14:19:50.694494 4898 generic.go:334] "Generic (PLEG): container finished" podID="4a6f0bfb-5db5-440c-a93f-0d6fe159401d" containerID="9e25ca1915d093420431c75152fe45db09d916d8afabcd6133622b3bfdcf8934" exitCode=0 Mar 13 14:19:50 crc kubenswrapper[4898]: I0313 14:19:50.694552 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-ztbp9" event={"ID":"4a6f0bfb-5db5-440c-a93f-0d6fe159401d","Type":"ContainerDied","Data":"9e25ca1915d093420431c75152fe45db09d916d8afabcd6133622b3bfdcf8934"} Mar 13 14:19:50 crc kubenswrapper[4898]: I0313 14:19:50.695838 4898 generic.go:334] "Generic (PLEG): container finished" podID="f555bcf8-c516-44eb-aaf3-446734ea39c2" containerID="ae2833d21b214266cf912f562e6ce0013ae2d27327887e2c914fb19e35f69b2e" exitCode=0 Mar 13 14:19:50 crc kubenswrapper[4898]: I0313 14:19:50.695871 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-kwhth" event={"ID":"f555bcf8-c516-44eb-aaf3-446734ea39c2","Type":"ContainerDied","Data":"ae2833d21b214266cf912f562e6ce0013ae2d27327887e2c914fb19e35f69b2e"} Mar 13 14:19:50 crc kubenswrapper[4898]: I0313 14:19:50.771418 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6ddbb5776b-mx8sz"] Mar 13 14:19:50 crc kubenswrapper[4898]: I0313 14:19:50.785809 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6ddbb5776b-mx8sz"] Mar 
13 14:19:51 crc kubenswrapper[4898]: I0313 14:19:51.755420 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f6fd2de-efa6-4d17-aa5e-4f44ced1f822" path="/var/lib/kubelet/pods/8f6fd2de-efa6-4d17-aa5e-4f44ced1f822/volumes" Mar 13 14:19:53 crc kubenswrapper[4898]: I0313 14:19:53.046048 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Mar 13 14:19:53 crc kubenswrapper[4898]: E0313 14:19:53.046779 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="586ccc66-1989-46e5-98ad-b70c7e88e6bc" containerName="mariadb-account-create-update" Mar 13 14:19:53 crc kubenswrapper[4898]: I0313 14:19:53.046798 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="586ccc66-1989-46e5-98ad-b70c7e88e6bc" containerName="mariadb-account-create-update" Mar 13 14:19:53 crc kubenswrapper[4898]: E0313 14:19:53.046840 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f6fd2de-efa6-4d17-aa5e-4f44ced1f822" containerName="console" Mar 13 14:19:53 crc kubenswrapper[4898]: I0313 14:19:53.046848 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f6fd2de-efa6-4d17-aa5e-4f44ced1f822" containerName="console" Mar 13 14:19:53 crc kubenswrapper[4898]: E0313 14:19:53.046862 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba5ed93a-91b4-4942-a32c-ab02a536e3d4" containerName="mariadb-database-create" Mar 13 14:19:53 crc kubenswrapper[4898]: I0313 14:19:53.046871 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba5ed93a-91b4-4942-a32c-ab02a536e3d4" containerName="mariadb-database-create" Mar 13 14:19:53 crc kubenswrapper[4898]: I0313 14:19:53.047364 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f6fd2de-efa6-4d17-aa5e-4f44ced1f822" containerName="console" Mar 13 14:19:53 crc kubenswrapper[4898]: I0313 14:19:53.047387 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba5ed93a-91b4-4942-a32c-ab02a536e3d4" 
containerName="mariadb-database-create" Mar 13 14:19:53 crc kubenswrapper[4898]: I0313 14:19:53.047416 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="586ccc66-1989-46e5-98ad-b70c7e88e6bc" containerName="mariadb-account-create-update" Mar 13 14:19:53 crc kubenswrapper[4898]: I0313 14:19:53.048138 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 13 14:19:53 crc kubenswrapper[4898]: I0313 14:19:53.050233 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Mar 13 14:19:53 crc kubenswrapper[4898]: I0313 14:19:53.067090 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 13 14:19:53 crc kubenswrapper[4898]: I0313 14:19:53.103480 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c27f029-bffd-4f8f-bb24-c1c9c245d38c-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"8c27f029-bffd-4f8f-bb24-c1c9c245d38c\") " pod="openstack/mysqld-exporter-0" Mar 13 14:19:53 crc kubenswrapper[4898]: I0313 14:19:53.103581 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8775h\" (UniqueName: \"kubernetes.io/projected/8c27f029-bffd-4f8f-bb24-c1c9c245d38c-kube-api-access-8775h\") pod \"mysqld-exporter-0\" (UID: \"8c27f029-bffd-4f8f-bb24-c1c9c245d38c\") " pod="openstack/mysqld-exporter-0" Mar 13 14:19:53 crc kubenswrapper[4898]: I0313 14:19:53.103715 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c27f029-bffd-4f8f-bb24-c1c9c245d38c-config-data\") pod \"mysqld-exporter-0\" (UID: \"8c27f029-bffd-4f8f-bb24-c1c9c245d38c\") " pod="openstack/mysqld-exporter-0" Mar 13 14:19:53 crc kubenswrapper[4898]: I0313 14:19:53.206009 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8775h\" (UniqueName: \"kubernetes.io/projected/8c27f029-bffd-4f8f-bb24-c1c9c245d38c-kube-api-access-8775h\") pod \"mysqld-exporter-0\" (UID: \"8c27f029-bffd-4f8f-bb24-c1c9c245d38c\") " pod="openstack/mysqld-exporter-0" Mar 13 14:19:53 crc kubenswrapper[4898]: I0313 14:19:53.206163 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c27f029-bffd-4f8f-bb24-c1c9c245d38c-config-data\") pod \"mysqld-exporter-0\" (UID: \"8c27f029-bffd-4f8f-bb24-c1c9c245d38c\") " pod="openstack/mysqld-exporter-0" Mar 13 14:19:53 crc kubenswrapper[4898]: I0313 14:19:53.206237 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c27f029-bffd-4f8f-bb24-c1c9c245d38c-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"8c27f029-bffd-4f8f-bb24-c1c9c245d38c\") " pod="openstack/mysqld-exporter-0" Mar 13 14:19:53 crc kubenswrapper[4898]: I0313 14:19:53.214930 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c27f029-bffd-4f8f-bb24-c1c9c245d38c-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"8c27f029-bffd-4f8f-bb24-c1c9c245d38c\") " pod="openstack/mysqld-exporter-0" Mar 13 14:19:53 crc kubenswrapper[4898]: I0313 14:19:53.215368 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c27f029-bffd-4f8f-bb24-c1c9c245d38c-config-data\") pod \"mysqld-exporter-0\" (UID: \"8c27f029-bffd-4f8f-bb24-c1c9c245d38c\") " pod="openstack/mysqld-exporter-0" Mar 13 14:19:53 crc kubenswrapper[4898]: I0313 14:19:53.224219 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8775h\" (UniqueName: 
\"kubernetes.io/projected/8c27f029-bffd-4f8f-bb24-c1c9c245d38c-kube-api-access-8775h\") pod \"mysqld-exporter-0\" (UID: \"8c27f029-bffd-4f8f-bb24-c1c9c245d38c\") " pod="openstack/mysqld-exporter-0" Mar 13 14:19:53 crc kubenswrapper[4898]: I0313 14:19:53.371052 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.212158 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-j79bj" podUID="a506ef1a-354a-49c8-b63d-4db4b9ecdcfe" containerName="ovn-controller" probeResult="failure" output=< Mar 13 14:19:55 crc kubenswrapper[4898]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 13 14:19:55 crc kubenswrapper[4898]: > Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.269052 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-r9tmf" Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.476118 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-j79bj-config-2f2rl"] Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.477695 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-j79bj-config-2f2rl" Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.479528 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.493851 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-j79bj-config-2f2rl"] Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.556190 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62mk8\" (UniqueName: \"kubernetes.io/projected/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-kube-api-access-62mk8\") pod \"ovn-controller-j79bj-config-2f2rl\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " pod="openstack/ovn-controller-j79bj-config-2f2rl" Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.556339 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-additional-scripts\") pod \"ovn-controller-j79bj-config-2f2rl\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " pod="openstack/ovn-controller-j79bj-config-2f2rl" Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.556495 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-var-run\") pod \"ovn-controller-j79bj-config-2f2rl\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " pod="openstack/ovn-controller-j79bj-config-2f2rl" Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.556530 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-var-log-ovn\") pod \"ovn-controller-j79bj-config-2f2rl\" (UID: 
\"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " pod="openstack/ovn-controller-j79bj-config-2f2rl" Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.556614 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-var-run-ovn\") pod \"ovn-controller-j79bj-config-2f2rl\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " pod="openstack/ovn-controller-j79bj-config-2f2rl" Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.556687 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-scripts\") pod \"ovn-controller-j79bj-config-2f2rl\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " pod="openstack/ovn-controller-j79bj-config-2f2rl" Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.658571 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62mk8\" (UniqueName: \"kubernetes.io/projected/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-kube-api-access-62mk8\") pod \"ovn-controller-j79bj-config-2f2rl\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " pod="openstack/ovn-controller-j79bj-config-2f2rl" Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.658623 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-additional-scripts\") pod \"ovn-controller-j79bj-config-2f2rl\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " pod="openstack/ovn-controller-j79bj-config-2f2rl" Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.658684 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-var-run\") pod 
\"ovn-controller-j79bj-config-2f2rl\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " pod="openstack/ovn-controller-j79bj-config-2f2rl" Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.658703 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-var-log-ovn\") pod \"ovn-controller-j79bj-config-2f2rl\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " pod="openstack/ovn-controller-j79bj-config-2f2rl" Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.658738 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-var-run-ovn\") pod \"ovn-controller-j79bj-config-2f2rl\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " pod="openstack/ovn-controller-j79bj-config-2f2rl" Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.658767 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-scripts\") pod \"ovn-controller-j79bj-config-2f2rl\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " pod="openstack/ovn-controller-j79bj-config-2f2rl" Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.659345 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-var-run\") pod \"ovn-controller-j79bj-config-2f2rl\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " pod="openstack/ovn-controller-j79bj-config-2f2rl" Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.660168 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-additional-scripts\") pod \"ovn-controller-j79bj-config-2f2rl\" (UID: 
\"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " pod="openstack/ovn-controller-j79bj-config-2f2rl" Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.660252 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-var-log-ovn\") pod \"ovn-controller-j79bj-config-2f2rl\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " pod="openstack/ovn-controller-j79bj-config-2f2rl" Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.660303 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-var-run-ovn\") pod \"ovn-controller-j79bj-config-2f2rl\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " pod="openstack/ovn-controller-j79bj-config-2f2rl" Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.660623 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-scripts\") pod \"ovn-controller-j79bj-config-2f2rl\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " pod="openstack/ovn-controller-j79bj-config-2f2rl" Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.696636 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62mk8\" (UniqueName: \"kubernetes.io/projected/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-kube-api-access-62mk8\") pod \"ovn-controller-j79bj-config-2f2rl\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " pod="openstack/ovn-controller-j79bj-config-2f2rl" Mar 13 14:19:55 crc kubenswrapper[4898]: I0313 14:19:55.802070 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-j79bj-config-2f2rl" Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.008009 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-kwhth" Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.014913 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.065689 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmzv9\" (UniqueName: \"kubernetes.io/projected/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-kube-api-access-gmzv9\") pod \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.066220 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f555bcf8-c516-44eb-aaf3-446734ea39c2-operator-scripts\") pod \"f555bcf8-c516-44eb-aaf3-446734ea39c2\" (UID: \"f555bcf8-c516-44eb-aaf3-446734ea39c2\") " Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.066254 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-swiftconf\") pod \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.066282 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2tg7\" (UniqueName: \"kubernetes.io/projected/f555bcf8-c516-44eb-aaf3-446734ea39c2-kube-api-access-q2tg7\") pod \"f555bcf8-c516-44eb-aaf3-446734ea39c2\" (UID: \"f555bcf8-c516-44eb-aaf3-446734ea39c2\") " Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.066317 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-etc-swift\") pod \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\" (UID: 
\"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.066362 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-dispersionconf\") pod \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.066440 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-scripts\") pod \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.066475 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-combined-ca-bundle\") pod \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.066516 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-ring-data-devices\") pod \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.067681 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "4a6f0bfb-5db5-440c-a93f-0d6fe159401d" (UID: "4a6f0bfb-5db5-440c-a93f-0d6fe159401d"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.068059 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f555bcf8-c516-44eb-aaf3-446734ea39c2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f555bcf8-c516-44eb-aaf3-446734ea39c2" (UID: "f555bcf8-c516-44eb-aaf3-446734ea39c2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.077595 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "4a6f0bfb-5db5-440c-a93f-0d6fe159401d" (UID: "4a6f0bfb-5db5-440c-a93f-0d6fe159401d"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.077761 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-kube-api-access-gmzv9" (OuterVolumeSpecName: "kube-api-access-gmzv9") pod "4a6f0bfb-5db5-440c-a93f-0d6fe159401d" (UID: "4a6f0bfb-5db5-440c-a93f-0d6fe159401d"). InnerVolumeSpecName "kube-api-access-gmzv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.082126 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f555bcf8-c516-44eb-aaf3-446734ea39c2-kube-api-access-q2tg7" (OuterVolumeSpecName: "kube-api-access-q2tg7") pod "f555bcf8-c516-44eb-aaf3-446734ea39c2" (UID: "f555bcf8-c516-44eb-aaf3-446734ea39c2"). InnerVolumeSpecName "kube-api-access-q2tg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.088031 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "4a6f0bfb-5db5-440c-a93f-0d6fe159401d" (UID: "4a6f0bfb-5db5-440c-a93f-0d6fe159401d"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.099829 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-scripts" (OuterVolumeSpecName: "scripts") pod "4a6f0bfb-5db5-440c-a93f-0d6fe159401d" (UID: "4a6f0bfb-5db5-440c-a93f-0d6fe159401d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:19:56 crc kubenswrapper[4898]: E0313 14:19:56.120056 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-swiftconf podName:4a6f0bfb-5db5-440c-a93f-0d6fe159401d nodeName:}" failed. No retries permitted until 2026-03-13 14:19:56.620026378 +0000 UTC m=+1431.621614617 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "swiftconf" (UniqueName: "kubernetes.io/secret/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-swiftconf") pod "4a6f0bfb-5db5-440c-a93f-0d6fe159401d" (UID: "4a6f0bfb-5db5-440c-a93f-0d6fe159401d") : error deleting /var/lib/kubelet/pods/4a6f0bfb-5db5-440c-a93f-0d6fe159401d/volume-subpaths: remove /var/lib/kubelet/pods/4a6f0bfb-5db5-440c-a93f-0d6fe159401d/volume-subpaths: no such file or directory Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.122247 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a6f0bfb-5db5-440c-a93f-0d6fe159401d" (UID: "4a6f0bfb-5db5-440c-a93f-0d6fe159401d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.170878 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.170921 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.170934 4898 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.170945 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmzv9\" (UniqueName: \"kubernetes.io/projected/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-kube-api-access-gmzv9\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:56 crc 
kubenswrapper[4898]: I0313 14:19:56.170956 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f555bcf8-c516-44eb-aaf3-446734ea39c2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.170965 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2tg7\" (UniqueName: \"kubernetes.io/projected/f555bcf8-c516-44eb-aaf3-446734ea39c2-kube-api-access-q2tg7\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.170973 4898 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.170981 4898 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.680004 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-swiftconf\") pod \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\" (UID: \"4a6f0bfb-5db5-440c-a93f-0d6fe159401d\") " Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.685165 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "4a6f0bfb-5db5-440c-a93f-0d6fe159401d" (UID: "4a6f0bfb-5db5-440c-a93f-0d6fe159401d"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.784996 4898 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4a6f0bfb-5db5-440c-a93f-0d6fe159401d-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.792097 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-kwhth" Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.792100 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-kwhth" event={"ID":"f555bcf8-c516-44eb-aaf3-446734ea39c2","Type":"ContainerDied","Data":"44806f086dd8f53a481f443ac6d3d5090de65c2b1eec8674ee6ce1b1966c8da0"} Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.792234 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44806f086dd8f53a481f443ac6d3d5090de65c2b1eec8674ee6ce1b1966c8da0" Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.793821 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-ztbp9" event={"ID":"4a6f0bfb-5db5-440c-a93f-0d6fe159401d","Type":"ContainerDied","Data":"7e54bc8fd0c558fc45ec5afd29c72815086cd68050f8ecbef97610bde93e64ce"} Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.793858 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e54bc8fd0c558fc45ec5afd29c72815086cd68050f8ecbef97610bde93e64ce" Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.793883 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-ztbp9" Mar 13 14:19:56 crc kubenswrapper[4898]: I0313 14:19:56.893497 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="d56bd826-4f42-409d-ae41-9bfc70d1e038" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.133:5671: connect: connection refused" Mar 13 14:19:57 crc kubenswrapper[4898]: I0313 14:19:57.133308 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.134:5671: connect: connection refused" Mar 13 14:19:57 crc kubenswrapper[4898]: I0313 14:19:57.213147 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-2" podUID="ee084354-4d32-4d3c-96a4-1e4e7eef5d85" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.135:5671: connect: connection refused" Mar 13 14:19:57 crc kubenswrapper[4898]: I0313 14:19:57.274038 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-1" podUID="818e3f41-30c4-4a49-b490-0d868fc2b2b8" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.136:5671: connect: connection refused" Mar 13 14:19:58 crc kubenswrapper[4898]: I0313 14:19:58.603350 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-kwhth"] Mar 13 14:19:58 crc kubenswrapper[4898]: I0313 14:19:58.615328 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-kwhth"] Mar 13 14:19:59 crc kubenswrapper[4898]: I0313 14:19:59.755505 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f555bcf8-c516-44eb-aaf3-446734ea39c2" path="/var/lib/kubelet/pods/f555bcf8-c516-44eb-aaf3-446734ea39c2/volumes" Mar 13 14:20:00 crc kubenswrapper[4898]: I0313 14:20:00.139415 4898 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556860-xbwgj"] Mar 13 14:20:00 crc kubenswrapper[4898]: E0313 14:20:00.140448 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f555bcf8-c516-44eb-aaf3-446734ea39c2" containerName="mariadb-account-create-update" Mar 13 14:20:00 crc kubenswrapper[4898]: I0313 14:20:00.140467 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f555bcf8-c516-44eb-aaf3-446734ea39c2" containerName="mariadb-account-create-update" Mar 13 14:20:00 crc kubenswrapper[4898]: E0313 14:20:00.140501 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a6f0bfb-5db5-440c-a93f-0d6fe159401d" containerName="swift-ring-rebalance" Mar 13 14:20:00 crc kubenswrapper[4898]: I0313 14:20:00.140508 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a6f0bfb-5db5-440c-a93f-0d6fe159401d" containerName="swift-ring-rebalance" Mar 13 14:20:00 crc kubenswrapper[4898]: I0313 14:20:00.140719 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a6f0bfb-5db5-440c-a93f-0d6fe159401d" containerName="swift-ring-rebalance" Mar 13 14:20:00 crc kubenswrapper[4898]: I0313 14:20:00.140750 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f555bcf8-c516-44eb-aaf3-446734ea39c2" containerName="mariadb-account-create-update" Mar 13 14:20:00 crc kubenswrapper[4898]: I0313 14:20:00.141472 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556860-xbwgj" Mar 13 14:20:00 crc kubenswrapper[4898]: I0313 14:20:00.145381 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:20:00 crc kubenswrapper[4898]: I0313 14:20:00.145519 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:20:00 crc kubenswrapper[4898]: I0313 14:20:00.146531 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:20:00 crc kubenswrapper[4898]: I0313 14:20:00.155690 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556860-xbwgj"] Mar 13 14:20:00 crc kubenswrapper[4898]: I0313 14:20:00.208056 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-j79bj" podUID="a506ef1a-354a-49c8-b63d-4db4b9ecdcfe" containerName="ovn-controller" probeResult="failure" output=< Mar 13 14:20:00 crc kubenswrapper[4898]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 13 14:20:00 crc kubenswrapper[4898]: > Mar 13 14:20:00 crc kubenswrapper[4898]: I0313 14:20:00.302038 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh72f\" (UniqueName: \"kubernetes.io/projected/02521dff-1dee-4839-ab35-a4bfa82bc405-kube-api-access-wh72f\") pod \"auto-csr-approver-29556860-xbwgj\" (UID: \"02521dff-1dee-4839-ab35-a4bfa82bc405\") " pod="openshift-infra/auto-csr-approver-29556860-xbwgj" Mar 13 14:20:00 crc kubenswrapper[4898]: I0313 14:20:00.404856 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh72f\" (UniqueName: \"kubernetes.io/projected/02521dff-1dee-4839-ab35-a4bfa82bc405-kube-api-access-wh72f\") pod \"auto-csr-approver-29556860-xbwgj\" (UID: 
\"02521dff-1dee-4839-ab35-a4bfa82bc405\") " pod="openshift-infra/auto-csr-approver-29556860-xbwgj" Mar 13 14:20:00 crc kubenswrapper[4898]: I0313 14:20:00.423288 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh72f\" (UniqueName: \"kubernetes.io/projected/02521dff-1dee-4839-ab35-a4bfa82bc405-kube-api-access-wh72f\") pod \"auto-csr-approver-29556860-xbwgj\" (UID: \"02521dff-1dee-4839-ab35-a4bfa82bc405\") " pod="openshift-infra/auto-csr-approver-29556860-xbwgj" Mar 13 14:20:00 crc kubenswrapper[4898]: I0313 14:20:00.470096 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556860-xbwgj" Mar 13 14:20:02 crc kubenswrapper[4898]: I0313 14:20:02.063287 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-vswd4"] Mar 13 14:20:02 crc kubenswrapper[4898]: I0313 14:20:02.065575 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-vswd4" Mar 13 14:20:02 crc kubenswrapper[4898]: I0313 14:20:02.068265 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 13 14:20:02 crc kubenswrapper[4898]: I0313 14:20:02.074360 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-vswd4"] Mar 13 14:20:02 crc kubenswrapper[4898]: I0313 14:20:02.144919 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq9fz\" (UniqueName: \"kubernetes.io/projected/26c5ac36-8689-4bf3-8755-a84e22377e2a-kube-api-access-lq9fz\") pod \"root-account-create-update-vswd4\" (UID: \"26c5ac36-8689-4bf3-8755-a84e22377e2a\") " pod="openstack/root-account-create-update-vswd4" Mar 13 14:20:02 crc kubenswrapper[4898]: I0313 14:20:02.145246 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26c5ac36-8689-4bf3-8755-a84e22377e2a-operator-scripts\") pod \"root-account-create-update-vswd4\" (UID: \"26c5ac36-8689-4bf3-8755-a84e22377e2a\") " pod="openstack/root-account-create-update-vswd4" Mar 13 14:20:02 crc kubenswrapper[4898]: I0313 14:20:02.247589 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq9fz\" (UniqueName: \"kubernetes.io/projected/26c5ac36-8689-4bf3-8755-a84e22377e2a-kube-api-access-lq9fz\") pod \"root-account-create-update-vswd4\" (UID: \"26c5ac36-8689-4bf3-8755-a84e22377e2a\") " pod="openstack/root-account-create-update-vswd4" Mar 13 14:20:02 crc kubenswrapper[4898]: I0313 14:20:02.247862 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26c5ac36-8689-4bf3-8755-a84e22377e2a-operator-scripts\") pod \"root-account-create-update-vswd4\" (UID: \"26c5ac36-8689-4bf3-8755-a84e22377e2a\") " pod="openstack/root-account-create-update-vswd4" Mar 13 14:20:02 crc kubenswrapper[4898]: I0313 14:20:02.248810 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26c5ac36-8689-4bf3-8755-a84e22377e2a-operator-scripts\") pod \"root-account-create-update-vswd4\" (UID: \"26c5ac36-8689-4bf3-8755-a84e22377e2a\") " pod="openstack/root-account-create-update-vswd4" Mar 13 14:20:02 crc kubenswrapper[4898]: I0313 14:20:02.267945 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq9fz\" (UniqueName: \"kubernetes.io/projected/26c5ac36-8689-4bf3-8755-a84e22377e2a-kube-api-access-lq9fz\") pod \"root-account-create-update-vswd4\" (UID: \"26c5ac36-8689-4bf3-8755-a84e22377e2a\") " pod="openstack/root-account-create-update-vswd4" Mar 13 14:20:02 crc kubenswrapper[4898]: I0313 14:20:02.394175 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-vswd4" Mar 13 14:20:03 crc kubenswrapper[4898]: I0313 14:20:03.355344 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 13 14:20:03 crc kubenswrapper[4898]: I0313 14:20:03.391876 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-j79bj-config-2f2rl"] Mar 13 14:20:03 crc kubenswrapper[4898]: I0313 14:20:03.559528 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556860-xbwgj"] Mar 13 14:20:03 crc kubenswrapper[4898]: I0313 14:20:03.569180 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-vswd4"] Mar 13 14:20:03 crc kubenswrapper[4898]: I0313 14:20:03.888243 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"8c27f029-bffd-4f8f-bb24-c1c9c245d38c","Type":"ContainerStarted","Data":"fe553b08f29dc87c01c836389227794f6bc900596f5a85dd1ed792d64aa19876"} Mar 13 14:20:03 crc kubenswrapper[4898]: I0313 14:20:03.892213 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-j79bj-config-2f2rl" event={"ID":"c5ddf723-16ad-425f-bae9-86adc8fe2a3d","Type":"ContainerStarted","Data":"41174a57eeced7f987ec1ea67f79e818399b7ca23c9e404468462af2e7f7d393"} Mar 13 14:20:03 crc kubenswrapper[4898]: I0313 14:20:03.892253 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-j79bj-config-2f2rl" event={"ID":"c5ddf723-16ad-425f-bae9-86adc8fe2a3d","Type":"ContainerStarted","Data":"2969707a1c4df4f3788f85982e6db252de3869c756e5d786fc65dcca1a81e648"} Mar 13 14:20:03 crc kubenswrapper[4898]: I0313 14:20:03.895021 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vswd4" 
event={"ID":"26c5ac36-8689-4bf3-8755-a84e22377e2a","Type":"ContainerStarted","Data":"c91cc1f40aaa9775749654fff4fe79567271db256673b0ded8ef7bcbbed0be52"} Mar 13 14:20:03 crc kubenswrapper[4898]: I0313 14:20:03.895070 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vswd4" event={"ID":"26c5ac36-8689-4bf3-8755-a84e22377e2a","Type":"ContainerStarted","Data":"5d888e90db0c648dc7f94b37e9185e80b994aa04738c4f7716d99c355585bcd1"} Mar 13 14:20:03 crc kubenswrapper[4898]: I0313 14:20:03.898416 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-x75zk" event={"ID":"74d4aeca-15ec-4f63-87e0-20daa6f3e70f","Type":"ContainerStarted","Data":"81635cb2b5569daf242649d4672547e48f7dd06f334c45c3f33ff7ef44a8ec5f"} Mar 13 14:20:03 crc kubenswrapper[4898]: I0313 14:20:03.900383 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556860-xbwgj" event={"ID":"02521dff-1dee-4839-ab35-a4bfa82bc405","Type":"ContainerStarted","Data":"704384ed7c0f9c137486ec08c213a83e3fe7f2792127e2bd89444beb085393db"} Mar 13 14:20:03 crc kubenswrapper[4898]: I0313 14:20:03.903407 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1e6f6f0d-db24-4fdb-a872-ce2c527a791b","Type":"ContainerStarted","Data":"97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d"} Mar 13 14:20:03 crc kubenswrapper[4898]: I0313 14:20:03.928097 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-j79bj-config-2f2rl" podStartSLOduration=8.928068519 podStartE2EDuration="8.928068519s" podCreationTimestamp="2026-03-13 14:19:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:20:03.911760996 +0000 UTC m=+1438.913349235" watchObservedRunningTime="2026-03-13 14:20:03.928068519 +0000 UTC m=+1438.929656778" Mar 13 
14:20:03 crc kubenswrapper[4898]: I0313 14:20:03.941007 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-x75zk" podStartSLOduration=3.541535182 podStartE2EDuration="18.940989505s" podCreationTimestamp="2026-03-13 14:19:45 +0000 UTC" firstStartedPulling="2026-03-13 14:19:47.537808636 +0000 UTC m=+1422.539396875" lastFinishedPulling="2026-03-13 14:20:02.937262949 +0000 UTC m=+1437.938851198" observedRunningTime="2026-03-13 14:20:03.930334778 +0000 UTC m=+1438.931923017" watchObservedRunningTime="2026-03-13 14:20:03.940989505 +0000 UTC m=+1438.942577734" Mar 13 14:20:03 crc kubenswrapper[4898]: I0313 14:20:03.972153 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-vswd4" podStartSLOduration=1.972128803 podStartE2EDuration="1.972128803s" podCreationTimestamp="2026-03-13 14:20:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:20:03.957652078 +0000 UTC m=+1438.959240317" watchObservedRunningTime="2026-03-13 14:20:03.972128803 +0000 UTC m=+1438.973717042" Mar 13 14:20:04 crc kubenswrapper[4898]: I0313 14:20:04.917922 4898 generic.go:334] "Generic (PLEG): container finished" podID="26c5ac36-8689-4bf3-8755-a84e22377e2a" containerID="c91cc1f40aaa9775749654fff4fe79567271db256673b0ded8ef7bcbbed0be52" exitCode=0 Mar 13 14:20:04 crc kubenswrapper[4898]: I0313 14:20:04.918504 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vswd4" event={"ID":"26c5ac36-8689-4bf3-8755-a84e22377e2a","Type":"ContainerDied","Data":"c91cc1f40aaa9775749654fff4fe79567271db256673b0ded8ef7bcbbed0be52"} Mar 13 14:20:04 crc kubenswrapper[4898]: I0313 14:20:04.926667 4898 generic.go:334] "Generic (PLEG): container finished" podID="c5ddf723-16ad-425f-bae9-86adc8fe2a3d" 
containerID="41174a57eeced7f987ec1ea67f79e818399b7ca23c9e404468462af2e7f7d393" exitCode=0 Mar 13 14:20:04 crc kubenswrapper[4898]: I0313 14:20:04.926730 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-j79bj-config-2f2rl" event={"ID":"c5ddf723-16ad-425f-bae9-86adc8fe2a3d","Type":"ContainerDied","Data":"41174a57eeced7f987ec1ea67f79e818399b7ca23c9e404468462af2e7f7d393"} Mar 13 14:20:05 crc kubenswrapper[4898]: I0313 14:20:05.349357 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-j79bj" Mar 13 14:20:05 crc kubenswrapper[4898]: I0313 14:20:05.664480 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-etc-swift\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:20:05 crc kubenswrapper[4898]: I0313 14:20:05.675046 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/794bd82b-e289-4b31-b0cf-f1285452e783-etc-swift\") pod \"swift-storage-0\" (UID: \"794bd82b-e289-4b31-b0cf-f1285452e783\") " pod="openstack/swift-storage-0" Mar 13 14:20:05 crc kubenswrapper[4898]: I0313 14:20:05.912820 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 13 14:20:05 crc kubenswrapper[4898]: I0313 14:20:05.940301 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"8c27f029-bffd-4f8f-bb24-c1c9c245d38c","Type":"ContainerStarted","Data":"169bac7a2a87f00863ce79b3524b65b62b7c3fd09f57a0d5d216a587ead2fc00"} Mar 13 14:20:05 crc kubenswrapper[4898]: I0313 14:20:05.943363 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556860-xbwgj" event={"ID":"02521dff-1dee-4839-ab35-a4bfa82bc405","Type":"ContainerStarted","Data":"36c5ec42fae468ab48602c979ddc403d899fdca66a51026d30aea73028cc2339"} Mar 13 14:20:05 crc kubenswrapper[4898]: I0313 14:20:05.983703 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=11.395220488 podStartE2EDuration="12.983678442s" podCreationTimestamp="2026-03-13 14:19:53 +0000 UTC" firstStartedPulling="2026-03-13 14:20:03.365004653 +0000 UTC m=+1438.366592892" lastFinishedPulling="2026-03-13 14:20:04.953462607 +0000 UTC m=+1439.955050846" observedRunningTime="2026-03-13 14:20:05.961367472 +0000 UTC m=+1440.962955731" watchObservedRunningTime="2026-03-13 14:20:05.983678442 +0000 UTC m=+1440.985266701" Mar 13 14:20:05 crc kubenswrapper[4898]: I0313 14:20:05.993382 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556860-xbwgj" podStartSLOduration=4.46352042 podStartE2EDuration="5.993365863s" podCreationTimestamp="2026-03-13 14:20:00 +0000 UTC" firstStartedPulling="2026-03-13 14:20:03.611018239 +0000 UTC m=+1438.612606478" lastFinishedPulling="2026-03-13 14:20:05.140863682 +0000 UTC m=+1440.142451921" observedRunningTime="2026-03-13 14:20:05.983598549 +0000 UTC m=+1440.985186788" watchObservedRunningTime="2026-03-13 14:20:05.993365863 +0000 UTC m=+1440.994954102" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 
14:20:06.529201 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-j79bj-config-2f2rl" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.535622 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-vswd4" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.687865 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26c5ac36-8689-4bf3-8755-a84e22377e2a-operator-scripts\") pod \"26c5ac36-8689-4bf3-8755-a84e22377e2a\" (UID: \"26c5ac36-8689-4bf3-8755-a84e22377e2a\") " Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.687965 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-var-run\") pod \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.688140 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-additional-scripts\") pod \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.688175 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq9fz\" (UniqueName: \"kubernetes.io/projected/26c5ac36-8689-4bf3-8755-a84e22377e2a-kube-api-access-lq9fz\") pod \"26c5ac36-8689-4bf3-8755-a84e22377e2a\" (UID: \"26c5ac36-8689-4bf3-8755-a84e22377e2a\") " Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.688234 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-scripts\") pod \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.688267 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-var-log-ovn\") pod \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.688329 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-var-run-ovn\") pod \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.688434 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62mk8\" (UniqueName: \"kubernetes.io/projected/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-kube-api-access-62mk8\") pod \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\" (UID: \"c5ddf723-16ad-425f-bae9-86adc8fe2a3d\") " Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.689060 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-var-run" (OuterVolumeSpecName: "var-run") pod "c5ddf723-16ad-425f-bae9-86adc8fe2a3d" (UID: "c5ddf723-16ad-425f-bae9-86adc8fe2a3d"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.689123 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "c5ddf723-16ad-425f-bae9-86adc8fe2a3d" (UID: "c5ddf723-16ad-425f-bae9-86adc8fe2a3d"). 
InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.689190 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "c5ddf723-16ad-425f-bae9-86adc8fe2a3d" (UID: "c5ddf723-16ad-425f-bae9-86adc8fe2a3d"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.689624 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26c5ac36-8689-4bf3-8755-a84e22377e2a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "26c5ac36-8689-4bf3-8755-a84e22377e2a" (UID: "26c5ac36-8689-4bf3-8755-a84e22377e2a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.689844 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "c5ddf723-16ad-425f-bae9-86adc8fe2a3d" (UID: "c5ddf723-16ad-425f-bae9-86adc8fe2a3d"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.690078 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-scripts" (OuterVolumeSpecName: "scripts") pod "c5ddf723-16ad-425f-bae9-86adc8fe2a3d" (UID: "c5ddf723-16ad-425f-bae9-86adc8fe2a3d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.693486 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.696272 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26c5ac36-8689-4bf3-8755-a84e22377e2a-kube-api-access-lq9fz" (OuterVolumeSpecName: "kube-api-access-lq9fz") pod "26c5ac36-8689-4bf3-8755-a84e22377e2a" (UID: "26c5ac36-8689-4bf3-8755-a84e22377e2a"). InnerVolumeSpecName "kube-api-access-lq9fz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.698170 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-kube-api-access-62mk8" (OuterVolumeSpecName: "kube-api-access-62mk8") pod "c5ddf723-16ad-425f-bae9-86adc8fe2a3d" (UID: "c5ddf723-16ad-425f-bae9-86adc8fe2a3d"). InnerVolumeSpecName "kube-api-access-62mk8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.790392 4898 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.790431 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lq9fz\" (UniqueName: \"kubernetes.io/projected/26c5ac36-8689-4bf3-8755-a84e22377e2a-kube-api-access-lq9fz\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.790444 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.790453 4898 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.790461 4898 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.790476 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62mk8\" (UniqueName: \"kubernetes.io/projected/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-kube-api-access-62mk8\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.790485 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/26c5ac36-8689-4bf3-8755-a84e22377e2a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 
14:20:06.790494 4898 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c5ddf723-16ad-425f-bae9-86adc8fe2a3d-var-run\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.894374 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.986015 4898 generic.go:334] "Generic (PLEG): container finished" podID="02521dff-1dee-4839-ab35-a4bfa82bc405" containerID="36c5ec42fae468ab48602c979ddc403d899fdca66a51026d30aea73028cc2339" exitCode=0 Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.986112 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556860-xbwgj" event={"ID":"02521dff-1dee-4839-ab35-a4bfa82bc405","Type":"ContainerDied","Data":"36c5ec42fae468ab48602c979ddc403d899fdca66a51026d30aea73028cc2339"} Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.989287 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1e6f6f0d-db24-4fdb-a872-ce2c527a791b","Type":"ContainerStarted","Data":"7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c"} Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.993371 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-j79bj-config-2f2rl" event={"ID":"c5ddf723-16ad-425f-bae9-86adc8fe2a3d","Type":"ContainerDied","Data":"2969707a1c4df4f3788f85982e6db252de3869c756e5d786fc65dcca1a81e648"} Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.993413 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2969707a1c4df4f3788f85982e6db252de3869c756e5d786fc65dcca1a81e648" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.993493 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-j79bj-config-2f2rl" Mar 13 14:20:06 crc kubenswrapper[4898]: I0313 14:20:06.995851 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"794bd82b-e289-4b31-b0cf-f1285452e783","Type":"ContainerStarted","Data":"61fd48329f75beb01938657f89ddfb5c6fc8974bf8ca503f4cd35506daaa8d03"} Mar 13 14:20:07 crc kubenswrapper[4898]: I0313 14:20:07.003984 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-vswd4" Mar 13 14:20:07 crc kubenswrapper[4898]: I0313 14:20:07.005580 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vswd4" event={"ID":"26c5ac36-8689-4bf3-8755-a84e22377e2a","Type":"ContainerDied","Data":"5d888e90db0c648dc7f94b37e9185e80b994aa04738c4f7716d99c355585bcd1"} Mar 13 14:20:07 crc kubenswrapper[4898]: I0313 14:20:07.005665 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d888e90db0c648dc7f94b37e9185e80b994aa04738c4f7716d99c355585bcd1" Mar 13 14:20:07 crc kubenswrapper[4898]: I0313 14:20:07.129848 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.134:5671: connect: connection refused" Mar 13 14:20:07 crc kubenswrapper[4898]: I0313 14:20:07.214138 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-2" Mar 13 14:20:07 crc kubenswrapper[4898]: I0313 14:20:07.272311 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-1" podUID="818e3f41-30c4-4a49-b490-0d868fc2b2b8" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.136:5671: connect: connection refused" Mar 13 14:20:07 crc kubenswrapper[4898]: I0313 14:20:07.628072 4898 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/ovn-controller-j79bj-config-2f2rl"]
Mar 13 14:20:07 crc kubenswrapper[4898]: I0313 14:20:07.648096 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-j79bj-config-2f2rl"]
Mar 13 14:20:07 crc kubenswrapper[4898]: I0313 14:20:07.753945 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5ddf723-16ad-425f-bae9-86adc8fe2a3d" path="/var/lib/kubelet/pods/c5ddf723-16ad-425f-bae9-86adc8fe2a3d/volumes"
Mar 13 14:20:08 crc kubenswrapper[4898]: I0313 14:20:08.412993 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556860-xbwgj"
Mar 13 14:20:08 crc kubenswrapper[4898]: I0313 14:20:08.531006 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wh72f\" (UniqueName: \"kubernetes.io/projected/02521dff-1dee-4839-ab35-a4bfa82bc405-kube-api-access-wh72f\") pod \"02521dff-1dee-4839-ab35-a4bfa82bc405\" (UID: \"02521dff-1dee-4839-ab35-a4bfa82bc405\") "
Mar 13 14:20:08 crc kubenswrapper[4898]: I0313 14:20:08.537999 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02521dff-1dee-4839-ab35-a4bfa82bc405-kube-api-access-wh72f" (OuterVolumeSpecName: "kube-api-access-wh72f") pod "02521dff-1dee-4839-ab35-a4bfa82bc405" (UID: "02521dff-1dee-4839-ab35-a4bfa82bc405"). InnerVolumeSpecName "kube-api-access-wh72f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:20:08 crc kubenswrapper[4898]: I0313 14:20:08.636182 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wh72f\" (UniqueName: \"kubernetes.io/projected/02521dff-1dee-4839-ab35-a4bfa82bc405-kube-api-access-wh72f\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:08 crc kubenswrapper[4898]: I0313 14:20:08.665945 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-vswd4"]
Mar 13 14:20:08 crc kubenswrapper[4898]: I0313 14:20:08.674440 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-vswd4"]
Mar 13 14:20:08 crc kubenswrapper[4898]: I0313 14:20:08.872233 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556854-6mxhz"]
Mar 13 14:20:08 crc kubenswrapper[4898]: I0313 14:20:08.883005 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556854-6mxhz"]
Mar 13 14:20:09 crc kubenswrapper[4898]: I0313 14:20:09.035466 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"794bd82b-e289-4b31-b0cf-f1285452e783","Type":"ContainerStarted","Data":"85fbdd6a017089c207ebb2fda009421eabef1062d20993ddf2ad0ac0b4237787"}
Mar 13 14:20:09 crc kubenswrapper[4898]: I0313 14:20:09.035507 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"794bd82b-e289-4b31-b0cf-f1285452e783","Type":"ContainerStarted","Data":"07cc49f90a9c9b623e87b89a0b1f6c9600328ee32955d8414c03933f8ef12ee2"}
Mar 13 14:20:09 crc kubenswrapper[4898]: I0313 14:20:09.035519 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"794bd82b-e289-4b31-b0cf-f1285452e783","Type":"ContainerStarted","Data":"74a02d2191ed8fbd85f8c87a59e58cac67e744563a2564af19540e247a0af73c"}
Mar 13 14:20:09 crc kubenswrapper[4898]: I0313 14:20:09.035528 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"794bd82b-e289-4b31-b0cf-f1285452e783","Type":"ContainerStarted","Data":"aa180d09f36c6285291090ea50ab34d0371d807ce5e52914f90991a1fb1de6a1"}
Mar 13 14:20:09 crc kubenswrapper[4898]: I0313 14:20:09.037745 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556860-xbwgj" event={"ID":"02521dff-1dee-4839-ab35-a4bfa82bc405","Type":"ContainerDied","Data":"704384ed7c0f9c137486ec08c213a83e3fe7f2792127e2bd89444beb085393db"}
Mar 13 14:20:09 crc kubenswrapper[4898]: I0313 14:20:09.037780 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="704384ed7c0f9c137486ec08c213a83e3fe7f2792127e2bd89444beb085393db"
Mar 13 14:20:09 crc kubenswrapper[4898]: I0313 14:20:09.037832 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556860-xbwgj"
Mar 13 14:20:09 crc kubenswrapper[4898]: I0313 14:20:09.751843 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26c5ac36-8689-4bf3-8755-a84e22377e2a" path="/var/lib/kubelet/pods/26c5ac36-8689-4bf3-8755-a84e22377e2a/volumes"
Mar 13 14:20:09 crc kubenswrapper[4898]: I0313 14:20:09.752766 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35372caa-772c-434c-8fb2-3b82926c1521" path="/var/lib/kubelet/pods/35372caa-772c-434c-8fb2-3b82926c1521/volumes"
Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.076273 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1e6f6f0d-db24-4fdb-a872-ce2c527a791b","Type":"ContainerStarted","Data":"409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278"}
Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.084684 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"794bd82b-e289-4b31-b0cf-f1285452e783","Type":"ContainerStarted","Data":"1d1f86d9f0bfde7af6c1c4523635ef06639e513d20e8518878792757a9dd2b0a"}
Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.084768 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"794bd82b-e289-4b31-b0cf-f1285452e783","Type":"ContainerStarted","Data":"bf3060e295b94781eff853dbb64a2279fcac419c2e558e1c4859725f8c17f921"}
Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.084785 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"794bd82b-e289-4b31-b0cf-f1285452e783","Type":"ContainerStarted","Data":"7243cd0755637497d2fd05db7578e90dec5fef84329e018ced2d69d7fc21df87"}
Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.084795 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"794bd82b-e289-4b31-b0cf-f1285452e783","Type":"ContainerStarted","Data":"ddb42c8c65fca96b5642211c30c422d299ebd8e9b93dc178ff16cefd9f8c664c"}
Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.093300 4898 generic.go:334] "Generic (PLEG): container finished" podID="74d4aeca-15ec-4f63-87e0-20daa6f3e70f" containerID="81635cb2b5569daf242649d4672547e48f7dd06f334c45c3f33ff7ef44a8ec5f" exitCode=0
Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.093345 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-x75zk" event={"ID":"74d4aeca-15ec-4f63-87e0-20daa6f3e70f","Type":"ContainerDied","Data":"81635cb2b5569daf242649d4672547e48f7dd06f334c45c3f33ff7ef44a8ec5f"}
Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.104933 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-85xgb"]
Mar 13 14:20:12 crc kubenswrapper[4898]: E0313 14:20:12.105434 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02521dff-1dee-4839-ab35-a4bfa82bc405" containerName="oc"
Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.105461 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="02521dff-1dee-4839-ab35-a4bfa82bc405" containerName="oc"
Mar 13 14:20:12 crc kubenswrapper[4898]: E0313 14:20:12.105481 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5ddf723-16ad-425f-bae9-86adc8fe2a3d" containerName="ovn-config"
Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.105491 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5ddf723-16ad-425f-bae9-86adc8fe2a3d" containerName="ovn-config"
Mar 13 14:20:12 crc kubenswrapper[4898]: E0313 14:20:12.105512 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26c5ac36-8689-4bf3-8755-a84e22377e2a" containerName="mariadb-account-create-update"
Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.105519 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="26c5ac36-8689-4bf3-8755-a84e22377e2a" containerName="mariadb-account-create-update"
Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.105762 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="26c5ac36-8689-4bf3-8755-a84e22377e2a" containerName="mariadb-account-create-update"
Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.105801 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5ddf723-16ad-425f-bae9-86adc8fe2a3d" containerName="ovn-config"
Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.105816 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="02521dff-1dee-4839-ab35-a4bfa82bc405" containerName="oc"
Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.106740 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-85xgb"
Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.108470 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.126835 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=13.186634207 podStartE2EDuration="44.126817874s" podCreationTimestamp="2026-03-13 14:19:28 +0000 UTC" firstStartedPulling="2026-03-13 14:19:40.08731106 +0000 UTC m=+1415.088899299" lastFinishedPulling="2026-03-13 14:20:11.027494727 +0000 UTC m=+1446.029082966" observedRunningTime="2026-03-13 14:20:12.110946912 +0000 UTC m=+1447.112535161" watchObservedRunningTime="2026-03-13 14:20:12.126817874 +0000 UTC m=+1447.128406113"
Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.127987 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-85xgb"]
Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.262294 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb2a6e17-835f-43f9-9b2b-eb5f39df5450-operator-scripts\") pod \"root-account-create-update-85xgb\" (UID: \"bb2a6e17-835f-43f9-9b2b-eb5f39df5450\") " pod="openstack/root-account-create-update-85xgb"
Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.262412 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkbcr\" (UniqueName: \"kubernetes.io/projected/bb2a6e17-835f-43f9-9b2b-eb5f39df5450-kube-api-access-xkbcr\") pod \"root-account-create-update-85xgb\" (UID: \"bb2a6e17-835f-43f9-9b2b-eb5f39df5450\") " pod="openstack/root-account-create-update-85xgb"
Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.365223 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb2a6e17-835f-43f9-9b2b-eb5f39df5450-operator-scripts\") pod \"root-account-create-update-85xgb\" (UID: \"bb2a6e17-835f-43f9-9b2b-eb5f39df5450\") " pod="openstack/root-account-create-update-85xgb"
Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.365418 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkbcr\" (UniqueName: \"kubernetes.io/projected/bb2a6e17-835f-43f9-9b2b-eb5f39df5450-kube-api-access-xkbcr\") pod \"root-account-create-update-85xgb\" (UID: \"bb2a6e17-835f-43f9-9b2b-eb5f39df5450\") " pod="openstack/root-account-create-update-85xgb"
Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.366665 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb2a6e17-835f-43f9-9b2b-eb5f39df5450-operator-scripts\") pod \"root-account-create-update-85xgb\" (UID: \"bb2a6e17-835f-43f9-9b2b-eb5f39df5450\") " pod="openstack/root-account-create-update-85xgb"
Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.392153 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkbcr\" (UniqueName: \"kubernetes.io/projected/bb2a6e17-835f-43f9-9b2b-eb5f39df5450-kube-api-access-xkbcr\") pod \"root-account-create-update-85xgb\" (UID: \"bb2a6e17-835f-43f9-9b2b-eb5f39df5450\") " pod="openstack/root-account-create-update-85xgb"
Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.432927 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-85xgb"
Mar 13 14:20:12 crc kubenswrapper[4898]: I0313 14:20:12.957142 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-85xgb"]
Mar 13 14:20:13 crc kubenswrapper[4898]: I0313 14:20:13.529536 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-x75zk"
Mar 13 14:20:13 crc kubenswrapper[4898]: I0313 14:20:13.583200 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0"
Mar 13 14:20:13 crc kubenswrapper[4898]: I0313 14:20:13.583249 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0"
Mar 13 14:20:13 crc kubenswrapper[4898]: I0313 14:20:13.588153 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0"
Mar 13 14:20:13 crc kubenswrapper[4898]: I0313 14:20:13.597834 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-combined-ca-bundle\") pod \"74d4aeca-15ec-4f63-87e0-20daa6f3e70f\" (UID: \"74d4aeca-15ec-4f63-87e0-20daa6f3e70f\") "
Mar 13 14:20:13 crc kubenswrapper[4898]: I0313 14:20:13.598031 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rvbp\" (UniqueName: \"kubernetes.io/projected/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-kube-api-access-8rvbp\") pod \"74d4aeca-15ec-4f63-87e0-20daa6f3e70f\" (UID: \"74d4aeca-15ec-4f63-87e0-20daa6f3e70f\") "
Mar 13 14:20:13 crc kubenswrapper[4898]: I0313 14:20:13.598092 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-db-sync-config-data\") pod \"74d4aeca-15ec-4f63-87e0-20daa6f3e70f\" (UID: \"74d4aeca-15ec-4f63-87e0-20daa6f3e70f\") "
Mar 13 14:20:13 crc kubenswrapper[4898]: I0313 14:20:13.598351 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-config-data\") pod \"74d4aeca-15ec-4f63-87e0-20daa6f3e70f\" (UID: \"74d4aeca-15ec-4f63-87e0-20daa6f3e70f\") "
Mar 13 14:20:13 crc kubenswrapper[4898]: I0313 14:20:13.606102 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-kube-api-access-8rvbp" (OuterVolumeSpecName: "kube-api-access-8rvbp") pod "74d4aeca-15ec-4f63-87e0-20daa6f3e70f" (UID: "74d4aeca-15ec-4f63-87e0-20daa6f3e70f"). InnerVolumeSpecName "kube-api-access-8rvbp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:20:13 crc kubenswrapper[4898]: I0313 14:20:13.607266 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "74d4aeca-15ec-4f63-87e0-20daa6f3e70f" (UID: "74d4aeca-15ec-4f63-87e0-20daa6f3e70f"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:20:13 crc kubenswrapper[4898]: I0313 14:20:13.642961 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74d4aeca-15ec-4f63-87e0-20daa6f3e70f" (UID: "74d4aeca-15ec-4f63-87e0-20daa6f3e70f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:20:13 crc kubenswrapper[4898]: I0313 14:20:13.668228 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-config-data" (OuterVolumeSpecName: "config-data") pod "74d4aeca-15ec-4f63-87e0-20daa6f3e70f" (UID: "74d4aeca-15ec-4f63-87e0-20daa6f3e70f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:20:13 crc kubenswrapper[4898]: I0313 14:20:13.703062 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:13 crc kubenswrapper[4898]: I0313 14:20:13.703093 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:13 crc kubenswrapper[4898]: I0313 14:20:13.703104 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rvbp\" (UniqueName: \"kubernetes.io/projected/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-kube-api-access-8rvbp\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:13 crc kubenswrapper[4898]: I0313 14:20:13.703114 4898 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/74d4aeca-15ec-4f63-87e0-20daa6f3e70f-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.134084 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-x75zk" event={"ID":"74d4aeca-15ec-4f63-87e0-20daa6f3e70f","Type":"ContainerDied","Data":"cfd74a4d5e0c74c0810797d2427e3c0958d75483bdf99da9d6e8e1c6bac07623"}
Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.134392 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfd74a4d5e0c74c0810797d2427e3c0958d75483bdf99da9d6e8e1c6bac07623"
Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.134392 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-x75zk"
Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.136285 4898 generic.go:334] "Generic (PLEG): container finished" podID="bb2a6e17-835f-43f9-9b2b-eb5f39df5450" containerID="2dec706ec3d47e7f4d03ac7b64859e218da33cd45852b8891c58c2ae0bd97657" exitCode=0
Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.136696 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-85xgb" event={"ID":"bb2a6e17-835f-43f9-9b2b-eb5f39df5450","Type":"ContainerDied","Data":"2dec706ec3d47e7f4d03ac7b64859e218da33cd45852b8891c58c2ae0bd97657"}
Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.136719 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-85xgb" event={"ID":"bb2a6e17-835f-43f9-9b2b-eb5f39df5450","Type":"ContainerStarted","Data":"181aa714130faecb2d84d29217d772e112b7d72b0fc2b841c35e29cd892d1122"}
Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.144266 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"794bd82b-e289-4b31-b0cf-f1285452e783","Type":"ContainerStarted","Data":"aafdfdd1a601b89501743e15f3469c5ea2c342f38447f1690153d7e24bba8022"}
Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.144337 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"794bd82b-e289-4b31-b0cf-f1285452e783","Type":"ContainerStarted","Data":"890e73f52dd9968314b64b76f20e6b3db9c84a625238bdd3b9a5c74ecf1248ce"}
Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.144363 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"794bd82b-e289-4b31-b0cf-f1285452e783","Type":"ContainerStarted","Data":"9a63ddefb3d41c2bfcbd0ca7b3b7e0b3cff83463706daf2b7ef2a74388e1a4ae"}
Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.144378 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"794bd82b-e289-4b31-b0cf-f1285452e783","Type":"ContainerStarted","Data":"28ab4729f099fc69a2484181e480e3100ae17275d794caca863b770eb3c5762e"}
Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.145992 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0"
Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.575491 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-wnv9x"]
Mar 13 14:20:14 crc kubenswrapper[4898]: E0313 14:20:14.577125 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d4aeca-15ec-4f63-87e0-20daa6f3e70f" containerName="glance-db-sync"
Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.577151 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="74d4aeca-15ec-4f63-87e0-20daa6f3e70f" containerName="glance-db-sync"
Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.577412 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="74d4aeca-15ec-4f63-87e0-20daa6f3e70f" containerName="glance-db-sync"
Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.578817 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x"
Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.589135 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-wnv9x"]
Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.674776 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgnbz\" (UniqueName: \"kubernetes.io/projected/49fea8fc-372c-4cf8-a710-7fff58db294d-kube-api-access-zgnbz\") pod \"dnsmasq-dns-5b946c75cc-wnv9x\" (UID: \"49fea8fc-372c-4cf8-a710-7fff58db294d\") " pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x"
Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.674848 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-wnv9x\" (UID: \"49fea8fc-372c-4cf8-a710-7fff58db294d\") " pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x"
Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.675232 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-config\") pod \"dnsmasq-dns-5b946c75cc-wnv9x\" (UID: \"49fea8fc-372c-4cf8-a710-7fff58db294d\") " pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x"
Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.675525 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-wnv9x\" (UID: \"49fea8fc-372c-4cf8-a710-7fff58db294d\") " pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x"
Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.675558 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-wnv9x\" (UID: \"49fea8fc-372c-4cf8-a710-7fff58db294d\") " pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x"
Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.779503 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-wnv9x\" (UID: \"49fea8fc-372c-4cf8-a710-7fff58db294d\") " pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x"
Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.779590 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-ovsdbserver-sb\") pod \"dnsmasq-dns-5b946c75cc-wnv9x\" (UID: \"49fea8fc-372c-4cf8-a710-7fff58db294d\") " pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x"
Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.779691 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-config\") pod \"dnsmasq-dns-5b946c75cc-wnv9x\" (UID: \"49fea8fc-372c-4cf8-a710-7fff58db294d\") " pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x"
Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.780592 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-config\") pod \"dnsmasq-dns-5b946c75cc-wnv9x\" (UID: \"49fea8fc-372c-4cf8-a710-7fff58db294d\") " pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x"
Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.780790 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-wnv9x\" (UID: \"49fea8fc-372c-4cf8-a710-7fff58db294d\") " pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x"
Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.780848 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-wnv9x\" (UID: \"49fea8fc-372c-4cf8-a710-7fff58db294d\") " pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x"
Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.781116 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgnbz\" (UniqueName: \"kubernetes.io/projected/49fea8fc-372c-4cf8-a710-7fff58db294d-kube-api-access-zgnbz\") pod \"dnsmasq-dns-5b946c75cc-wnv9x\" (UID: \"49fea8fc-372c-4cf8-a710-7fff58db294d\") " pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x"
Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.782022 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-ovsdbserver-nb\") pod \"dnsmasq-dns-5b946c75cc-wnv9x\" (UID: \"49fea8fc-372c-4cf8-a710-7fff58db294d\") " pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x"
Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.782712 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-dns-svc\") pod \"dnsmasq-dns-5b946c75cc-wnv9x\" (UID: \"49fea8fc-372c-4cf8-a710-7fff58db294d\") " pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x"
Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.801913 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgnbz\" (UniqueName: \"kubernetes.io/projected/49fea8fc-372c-4cf8-a710-7fff58db294d-kube-api-access-zgnbz\") pod \"dnsmasq-dns-5b946c75cc-wnv9x\" (UID: \"49fea8fc-372c-4cf8-a710-7fff58db294d\") " pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x"
Mar 13 14:20:14 crc kubenswrapper[4898]: I0313 14:20:14.904565 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x"
Mar 13 14:20:15 crc kubenswrapper[4898]: I0313 14:20:15.185032 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"794bd82b-e289-4b31-b0cf-f1285452e783","Type":"ContainerStarted","Data":"28c14e620136a22405bc4f45510146b94e4e9107f878c32e3d9678b082910e5e"}
Mar 13 14:20:15 crc kubenswrapper[4898]: I0313 14:20:15.501622 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-wnv9x"]
Mar 13 14:20:15 crc kubenswrapper[4898]: I0313 14:20:15.761285 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-85xgb"
Mar 13 14:20:15 crc kubenswrapper[4898]: I0313 14:20:15.915290 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb2a6e17-835f-43f9-9b2b-eb5f39df5450-operator-scripts\") pod \"bb2a6e17-835f-43f9-9b2b-eb5f39df5450\" (UID: \"bb2a6e17-835f-43f9-9b2b-eb5f39df5450\") "
Mar 13 14:20:15 crc kubenswrapper[4898]: I0313 14:20:15.915585 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkbcr\" (UniqueName: \"kubernetes.io/projected/bb2a6e17-835f-43f9-9b2b-eb5f39df5450-kube-api-access-xkbcr\") pod \"bb2a6e17-835f-43f9-9b2b-eb5f39df5450\" (UID: \"bb2a6e17-835f-43f9-9b2b-eb5f39df5450\") "
Mar 13 14:20:15 crc kubenswrapper[4898]: I0313 14:20:15.915919 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb2a6e17-835f-43f9-9b2b-eb5f39df5450-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bb2a6e17-835f-43f9-9b2b-eb5f39df5450" (UID: "bb2a6e17-835f-43f9-9b2b-eb5f39df5450"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:20:15 crc kubenswrapper[4898]: I0313 14:20:15.916264 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb2a6e17-835f-43f9-9b2b-eb5f39df5450-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:15 crc kubenswrapper[4898]: I0313 14:20:15.920352 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb2a6e17-835f-43f9-9b2b-eb5f39df5450-kube-api-access-xkbcr" (OuterVolumeSpecName: "kube-api-access-xkbcr") pod "bb2a6e17-835f-43f9-9b2b-eb5f39df5450" (UID: "bb2a6e17-835f-43f9-9b2b-eb5f39df5450"). InnerVolumeSpecName "kube-api-access-xkbcr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:20:16 crc kubenswrapper[4898]: I0313 14:20:16.018613 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkbcr\" (UniqueName: \"kubernetes.io/projected/bb2a6e17-835f-43f9-9b2b-eb5f39df5450-kube-api-access-xkbcr\") on node \"crc\" DevicePath \"\""
Mar 13 14:20:16 crc kubenswrapper[4898]: I0313 14:20:16.211129 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-85xgb" event={"ID":"bb2a6e17-835f-43f9-9b2b-eb5f39df5450","Type":"ContainerDied","Data":"181aa714130faecb2d84d29217d772e112b7d72b0fc2b841c35e29cd892d1122"}
Mar 13 14:20:16 crc kubenswrapper[4898]: I0313 14:20:16.211161 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-85xgb"
Mar 13 14:20:16 crc kubenswrapper[4898]: I0313 14:20:16.211168 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="181aa714130faecb2d84d29217d772e112b7d72b0fc2b841c35e29cd892d1122"
Mar 13 14:20:16 crc kubenswrapper[4898]: I0313 14:20:16.212090 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x" event={"ID":"49fea8fc-372c-4cf8-a710-7fff58db294d","Type":"ContainerStarted","Data":"bad9604b0b4948880032fa39bf17f3a7090e87a9e7998fc75541062cf197564a"}
Mar 13 14:20:16 crc kubenswrapper[4898]: I0313 14:20:16.973766 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 13 14:20:17 crc kubenswrapper[4898]: I0313 14:20:17.132865 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Mar 13 14:20:17 crc kubenswrapper[4898]: I0313 14:20:17.232459 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="1e6f6f0d-db24-4fdb-a872-ce2c527a791b" containerName="prometheus" containerID="cri-o://97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d" gracePeriod=600
Mar 13 14:20:17 crc kubenswrapper[4898]: I0313 14:20:17.232521 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"794bd82b-e289-4b31-b0cf-f1285452e783","Type":"ContainerStarted","Data":"68514ae839a261706159c6e729ae696ff8e3ed08553c87bdc766bf73a5e48096"}
Mar 13 14:20:17 crc kubenswrapper[4898]: I0313 14:20:17.232532 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="1e6f6f0d-db24-4fdb-a872-ce2c527a791b" containerName="config-reloader" containerID="cri-o://7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c" gracePeriod=600
Mar 13 14:20:17 crc kubenswrapper[4898]: I0313 14:20:17.232619 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="1e6f6f0d-db24-4fdb-a872-ce2c527a791b" containerName="thanos-sidecar" containerID="cri-o://409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278" gracePeriod=600
Mar 13 14:20:17 crc kubenswrapper[4898]: I0313 14:20:17.276693 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-1"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.241731 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.244307 4898 generic.go:334] "Generic (PLEG): container finished" podID="1e6f6f0d-db24-4fdb-a872-ce2c527a791b" containerID="409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278" exitCode=0
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.244353 4898 generic.go:334] "Generic (PLEG): container finished" podID="1e6f6f0d-db24-4fdb-a872-ce2c527a791b" containerID="7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c" exitCode=0
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.244364 4898 generic.go:334] "Generic (PLEG): container finished" podID="1e6f6f0d-db24-4fdb-a872-ce2c527a791b" containerID="97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d" exitCode=0
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.244424 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1e6f6f0d-db24-4fdb-a872-ce2c527a791b","Type":"ContainerDied","Data":"409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278"}
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.244458 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1e6f6f0d-db24-4fdb-a872-ce2c527a791b","Type":"ContainerDied","Data":"7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c"}
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.244470 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1e6f6f0d-db24-4fdb-a872-ce2c527a791b","Type":"ContainerDied","Data":"97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d"}
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.244481 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"1e6f6f0d-db24-4fdb-a872-ce2c527a791b","Type":"ContainerDied","Data":"0b0a8b36ebbdd732871a36ff41874dd5aa3cc4a1c6906bb4c6b8440e7e4c245d"}
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.244500 4898 scope.go:117] "RemoveContainer" containerID="409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278"
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.250150 4898 generic.go:334] "Generic (PLEG): container finished" podID="49fea8fc-372c-4cf8-a710-7fff58db294d" containerID="f6416d1e5f0d5a80d9a232b6c88402bfa85e5da16423a5a28fb3cfb07228cafb" exitCode=0
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.250220 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x" event={"ID":"49fea8fc-372c-4cf8-a710-7fff58db294d","Type":"ContainerDied","Data":"f6416d1e5f0d5a80d9a232b6c88402bfa85e5da16423a5a28fb3cfb07228cafb"}
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.276144 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-prometheus-metric-storage-rulefiles-0\") pod \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") "
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.276194 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-web-config\") pod \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") "
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.276215 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-prometheus-metric-storage-rulefiles-2\") pod \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") "
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.276274 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jxl2\" (UniqueName: \"kubernetes.io/projected/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-kube-api-access-5jxl2\") pod \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") "
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.276343 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-config\") pod \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") "
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.276397 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-thanos-prometheus-http-client-file\") pod \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") "
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.276418 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-prometheus-metric-storage-rulefiles-1\") pod \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") "
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.276466 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-tls-assets\") pod \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") "
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.277563 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c\") pod \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") "
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.277644 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-config-out\") pod \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\" (UID: \"1e6f6f0d-db24-4fdb-a872-ce2c527a791b\") "
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.279698 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "1e6f6f0d-db24-4fdb-a872-ce2c527a791b" (UID: "1e6f6f0d-db24-4fdb-a872-ce2c527a791b"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.280399 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"794bd82b-e289-4b31-b0cf-f1285452e783","Type":"ContainerStarted","Data":"37bfb67d8b8561314d97aa77ffcf7c283a19fa233a2ac0c014fa13378475f9db"} Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.287085 4898 scope.go:117] "RemoveContainer" containerID="7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.287391 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "1e6f6f0d-db24-4fdb-a872-ce2c527a791b" (UID: "1e6f6f0d-db24-4fdb-a872-ce2c527a791b"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.287977 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "1e6f6f0d-db24-4fdb-a872-ce2c527a791b" (UID: "1e6f6f0d-db24-4fdb-a872-ce2c527a791b"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.288387 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-config-out" (OuterVolumeSpecName: "config-out") pod "1e6f6f0d-db24-4fdb-a872-ce2c527a791b" (UID: "1e6f6f0d-db24-4fdb-a872-ce2c527a791b"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.288548 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-config" (OuterVolumeSpecName: "config") pod "1e6f6f0d-db24-4fdb-a872-ce2c527a791b" (UID: "1e6f6f0d-db24-4fdb-a872-ce2c527a791b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.290807 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-kube-api-access-5jxl2" (OuterVolumeSpecName: "kube-api-access-5jxl2") pod "1e6f6f0d-db24-4fdb-a872-ce2c527a791b" (UID: "1e6f6f0d-db24-4fdb-a872-ce2c527a791b"). InnerVolumeSpecName "kube-api-access-5jxl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.291545 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "1e6f6f0d-db24-4fdb-a872-ce2c527a791b" (UID: "1e6f6f0d-db24-4fdb-a872-ce2c527a791b"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.291749 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "1e6f6f0d-db24-4fdb-a872-ce2c527a791b" (UID: "1e6f6f0d-db24-4fdb-a872-ce2c527a791b"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.314442 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-web-config" (OuterVolumeSpecName: "web-config") pod "1e6f6f0d-db24-4fdb-a872-ce2c527a791b" (UID: "1e6f6f0d-db24-4fdb-a872-ce2c527a791b"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.323850 4898 scope.go:117] "RemoveContainer" containerID="97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.350096 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "1e6f6f0d-db24-4fdb-a872-ce2c527a791b" (UID: "1e6f6f0d-db24-4fdb-a872-ce2c527a791b"). InnerVolumeSpecName "pvc-537e992e-0c7e-4e28-8105-b535a72a793c". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.379656 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=39.892313837 podStartE2EDuration="46.379635595s" podCreationTimestamp="2026-03-13 14:19:32 +0000 UTC" firstStartedPulling="2026-03-13 14:20:06.706367342 +0000 UTC m=+1441.707955581" lastFinishedPulling="2026-03-13 14:20:13.1936891 +0000 UTC m=+1448.195277339" observedRunningTime="2026-03-13 14:20:18.377625413 +0000 UTC m=+1453.379213652" watchObservedRunningTime="2026-03-13 14:20:18.379635595 +0000 UTC m=+1453.381223844" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.381227 4898 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.386298 4898 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-web-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.386320 4898 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.386338 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jxl2\" (UniqueName: \"kubernetes.io/projected/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-kube-api-access-5jxl2\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.386352 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.386364 4898 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.386376 4898 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.386388 4898 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-tls-assets\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.386417 4898 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-537e992e-0c7e-4e28-8105-b535a72a793c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c\") on node \"crc\" " Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.386433 4898 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1e6f6f0d-db24-4fdb-a872-ce2c527a791b-config-out\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.419153 4898 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.419328 4898 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-537e992e-0c7e-4e28-8105-b535a72a793c" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c") on node "crc" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.488317 4898 reconciler_common.go:293] "Volume detached for volume \"pvc-537e992e-0c7e-4e28-8105-b535a72a793c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.504097 4898 scope.go:117] "RemoveContainer" containerID="285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.542268 4898 scope.go:117] "RemoveContainer" containerID="409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278" Mar 13 14:20:18 crc kubenswrapper[4898]: E0313 14:20:18.542757 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278\": container with ID starting with 409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278 not found: ID does not exist" containerID="409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.542795 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278"} err="failed to get container status \"409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278\": rpc error: code = NotFound desc = could not find container \"409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278\": container with ID starting with 409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278 not 
found: ID does not exist" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.542821 4898 scope.go:117] "RemoveContainer" containerID="7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c" Mar 13 14:20:18 crc kubenswrapper[4898]: E0313 14:20:18.543280 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c\": container with ID starting with 7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c not found: ID does not exist" containerID="7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.543309 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c"} err="failed to get container status \"7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c\": rpc error: code = NotFound desc = could not find container \"7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c\": container with ID starting with 7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c not found: ID does not exist" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.543329 4898 scope.go:117] "RemoveContainer" containerID="97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d" Mar 13 14:20:18 crc kubenswrapper[4898]: E0313 14:20:18.543629 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d\": container with ID starting with 97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d not found: ID does not exist" containerID="97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.543726 4898 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d"} err="failed to get container status \"97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d\": rpc error: code = NotFound desc = could not find container \"97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d\": container with ID starting with 97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d not found: ID does not exist" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.543800 4898 scope.go:117] "RemoveContainer" containerID="285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b" Mar 13 14:20:18 crc kubenswrapper[4898]: E0313 14:20:18.544169 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b\": container with ID starting with 285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b not found: ID does not exist" containerID="285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.544204 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b"} err="failed to get container status \"285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b\": rpc error: code = NotFound desc = could not find container \"285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b\": container with ID starting with 285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b not found: ID does not exist" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.544223 4898 scope.go:117] "RemoveContainer" containerID="409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 
14:20:18.544512 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278"} err="failed to get container status \"409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278\": rpc error: code = NotFound desc = could not find container \"409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278\": container with ID starting with 409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278 not found: ID does not exist" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.544600 4898 scope.go:117] "RemoveContainer" containerID="7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.544947 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c"} err="failed to get container status \"7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c\": rpc error: code = NotFound desc = could not find container \"7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c\": container with ID starting with 7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c not found: ID does not exist" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.545001 4898 scope.go:117] "RemoveContainer" containerID="97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.545392 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d"} err="failed to get container status \"97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d\": rpc error: code = NotFound desc = could not find container \"97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d\": container with ID starting with 
97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d not found: ID does not exist" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.545412 4898 scope.go:117] "RemoveContainer" containerID="285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.545694 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b"} err="failed to get container status \"285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b\": rpc error: code = NotFound desc = could not find container \"285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b\": container with ID starting with 285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b not found: ID does not exist" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.545725 4898 scope.go:117] "RemoveContainer" containerID="409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.546043 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278"} err="failed to get container status \"409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278\": rpc error: code = NotFound desc = could not find container \"409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278\": container with ID starting with 409bf1215c88a0899cc23bebec9f8cdea126be501e95b10215c77324e13fe278 not found: ID does not exist" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.546062 4898 scope.go:117] "RemoveContainer" containerID="7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.546324 4898 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c"} err="failed to get container status \"7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c\": rpc error: code = NotFound desc = could not find container \"7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c\": container with ID starting with 7e2e94efde300a128961ef88908d969db64215b6995d2b3cb2e15a3f40e8a65c not found: ID does not exist" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.546406 4898 scope.go:117] "RemoveContainer" containerID="97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.546816 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d"} err="failed to get container status \"97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d\": rpc error: code = NotFound desc = could not find container \"97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d\": container with ID starting with 97d23bf2eed915823c76967ee540ea114f9905dd429a2e87daddfc2a1255581d not found: ID does not exist" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.546845 4898 scope.go:117] "RemoveContainer" containerID="285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.547125 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b"} err="failed to get container status \"285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b\": rpc error: code = NotFound desc = could not find container \"285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b\": container with ID starting with 285742e6a5783e0185985fab65a301e652357df458cfc070967a14f0e7b5987b not found: ID does not 
exist" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.719494 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-wnv9x"] Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.766924 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-cz6vf"] Mar 13 14:20:18 crc kubenswrapper[4898]: E0313 14:20:18.767319 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e6f6f0d-db24-4fdb-a872-ce2c527a791b" containerName="init-config-reloader" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.767337 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e6f6f0d-db24-4fdb-a872-ce2c527a791b" containerName="init-config-reloader" Mar 13 14:20:18 crc kubenswrapper[4898]: E0313 14:20:18.767366 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e6f6f0d-db24-4fdb-a872-ce2c527a791b" containerName="thanos-sidecar" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.767374 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e6f6f0d-db24-4fdb-a872-ce2c527a791b" containerName="thanos-sidecar" Mar 13 14:20:18 crc kubenswrapper[4898]: E0313 14:20:18.767387 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb2a6e17-835f-43f9-9b2b-eb5f39df5450" containerName="mariadb-account-create-update" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.767394 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb2a6e17-835f-43f9-9b2b-eb5f39df5450" containerName="mariadb-account-create-update" Mar 13 14:20:18 crc kubenswrapper[4898]: E0313 14:20:18.767404 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e6f6f0d-db24-4fdb-a872-ce2c527a791b" containerName="config-reloader" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.767410 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e6f6f0d-db24-4fdb-a872-ce2c527a791b" containerName="config-reloader" Mar 13 14:20:18 crc kubenswrapper[4898]: 
E0313 14:20:18.767426 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e6f6f0d-db24-4fdb-a872-ce2c527a791b" containerName="prometheus" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.767433 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e6f6f0d-db24-4fdb-a872-ce2c527a791b" containerName="prometheus" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.767601 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e6f6f0d-db24-4fdb-a872-ce2c527a791b" containerName="prometheus" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.767629 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e6f6f0d-db24-4fdb-a872-ce2c527a791b" containerName="config-reloader" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.767639 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e6f6f0d-db24-4fdb-a872-ce2c527a791b" containerName="thanos-sidecar" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.767649 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb2a6e17-835f-43f9-9b2b-eb5f39df5450" containerName="mariadb-account-create-update" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.768661 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.770683 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.788677 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-cz6vf"] Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.794973 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-config\") pod \"dnsmasq-dns-74f6bcbc87-cz6vf\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.795078 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dpzg\" (UniqueName: \"kubernetes.io/projected/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-kube-api-access-9dpzg\") pod \"dnsmasq-dns-74f6bcbc87-cz6vf\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.795116 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-cz6vf\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.795147 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-cz6vf\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " 
pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.795239 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-cz6vf\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.795297 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-cz6vf\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.830049 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-85xgb"] Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.839671 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-85xgb"] Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.896994 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dpzg\" (UniqueName: \"kubernetes.io/projected/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-kube-api-access-9dpzg\") pod \"dnsmasq-dns-74f6bcbc87-cz6vf\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.897075 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-cz6vf\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" Mar 13 
14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.897115 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-cz6vf\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.897174 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-cz6vf\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.897248 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-cz6vf\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.897276 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-config\") pod \"dnsmasq-dns-74f6bcbc87-cz6vf\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.898187 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-cz6vf\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.898239 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-cz6vf\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.898242 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-cz6vf\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.898367 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-config\") pod \"dnsmasq-dns-74f6bcbc87-cz6vf\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.898368 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-cz6vf\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" Mar 13 14:20:18 crc kubenswrapper[4898]: I0313 14:20:18.921841 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dpzg\" (UniqueName: \"kubernetes.io/projected/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-kube-api-access-9dpzg\") pod \"dnsmasq-dns-74f6bcbc87-cz6vf\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.084360 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.306939 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.311568 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x" event={"ID":"49fea8fc-372c-4cf8-a710-7fff58db294d","Type":"ContainerStarted","Data":"8147ab7966f5020767fcc27772ab175b42f8528278c1d7d00f8082b71c3d4288"} Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.311607 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x" podUID="49fea8fc-372c-4cf8-a710-7fff58db294d" containerName="dnsmasq-dns" containerID="cri-o://8147ab7966f5020767fcc27772ab175b42f8528278c1d7d00f8082b71c3d4288" gracePeriod=10 Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.311640 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.351344 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x" podStartSLOduration=5.351324979 podStartE2EDuration="5.351324979s" podCreationTimestamp="2026-03-13 14:20:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:20:19.341290389 +0000 UTC m=+1454.342878638" watchObservedRunningTime="2026-03-13 14:20:19.351324979 +0000 UTC m=+1454.352913208" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.403752 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.438624 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/prometheus-metric-storage-0"] Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.474848 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.477993 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.486356 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.490399 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.491631 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-g7dw2" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.492172 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.492330 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.492420 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.492548 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.493007 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.495265 4898 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.497925 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.608582 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-cz6vf"] Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.619597 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv4p5\" (UniqueName: \"kubernetes.io/projected/d555bd54-f4d5-4b06-9517-32b4fe687f4b-kube-api-access-hv4p5\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.619651 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d555bd54-f4d5-4b06-9517-32b4fe687f4b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.619723 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d555bd54-f4d5-4b06-9517-32b4fe687f4b-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.619766 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d555bd54-f4d5-4b06-9517-32b4fe687f4b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: 
\"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.619793 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d555bd54-f4d5-4b06-9517-32b4fe687f4b-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.619819 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d555bd54-f4d5-4b06-9517-32b4fe687f4b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.619876 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d555bd54-f4d5-4b06-9517-32b4fe687f4b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.619929 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-537e992e-0c7e-4e28-8105-b535a72a793c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.619962 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d555bd54-f4d5-4b06-9517-32b4fe687f4b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.620176 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d555bd54-f4d5-4b06-9517-32b4fe687f4b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.620214 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d555bd54-f4d5-4b06-9517-32b4fe687f4b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.620237 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d555bd54-f4d5-4b06-9517-32b4fe687f4b-config\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.620251 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d555bd54-f4d5-4b06-9517-32b4fe687f4b-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: 
\"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.730775 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d555bd54-f4d5-4b06-9517-32b4fe687f4b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.730822 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-537e992e-0c7e-4e28-8105-b535a72a793c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.730862 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d555bd54-f4d5-4b06-9517-32b4fe687f4b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.730882 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d555bd54-f4d5-4b06-9517-32b4fe687f4b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.730939 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/d555bd54-f4d5-4b06-9517-32b4fe687f4b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.730962 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d555bd54-f4d5-4b06-9517-32b4fe687f4b-config\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.730978 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d555bd54-f4d5-4b06-9517-32b4fe687f4b-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.731012 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv4p5\" (UniqueName: \"kubernetes.io/projected/d555bd54-f4d5-4b06-9517-32b4fe687f4b-kube-api-access-hv4p5\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.731031 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d555bd54-f4d5-4b06-9517-32b4fe687f4b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.731075 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d555bd54-f4d5-4b06-9517-32b4fe687f4b-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.731108 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d555bd54-f4d5-4b06-9517-32b4fe687f4b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.731129 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d555bd54-f4d5-4b06-9517-32b4fe687f4b-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.731151 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d555bd54-f4d5-4b06-9517-32b4fe687f4b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.749149 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d555bd54-f4d5-4b06-9517-32b4fe687f4b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " 
pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.758288 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d555bd54-f4d5-4b06-9517-32b4fe687f4b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.759417 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d555bd54-f4d5-4b06-9517-32b4fe687f4b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.759882 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d555bd54-f4d5-4b06-9517-32b4fe687f4b-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.760790 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d555bd54-f4d5-4b06-9517-32b4fe687f4b-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.762250 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d555bd54-f4d5-4b06-9517-32b4fe687f4b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " 
pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.817924 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d555bd54-f4d5-4b06-9517-32b4fe687f4b-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.818929 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d555bd54-f4d5-4b06-9517-32b4fe687f4b-config\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.819311 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d555bd54-f4d5-4b06-9517-32b4fe687f4b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.831592 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d555bd54-f4d5-4b06-9517-32b4fe687f4b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.832634 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/d555bd54-f4d5-4b06-9517-32b4fe687f4b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.834377 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv4p5\" (UniqueName: \"kubernetes.io/projected/d555bd54-f4d5-4b06-9517-32b4fe687f4b-kube-api-access-hv4p5\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.975324 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e6f6f0d-db24-4fdb-a872-ce2c527a791b" path="/var/lib/kubelet/pods/1e6f6f0d-db24-4fdb-a872-ce2c527a791b/volumes" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.976490 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb2a6e17-835f-43f9-9b2b-eb5f39df5450" path="/var/lib/kubelet/pods/bb2a6e17-835f-43f9-9b2b-eb5f39df5450/volumes" Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.998699 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 13 14:20:19 crc kubenswrapper[4898]: I0313 14:20:19.998964 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-537e992e-0c7e-4e28-8105-b535a72a793c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7123a2111c2c1fcd673a2fa4cbaef2c14fcdb159a9a269edbe99c5cdea18ee2d/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.027010 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-88gdv"] Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.028459 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-88gdv" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.071159 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.128711 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-88gdv"] Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.137513 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-config\") pod \"49fea8fc-372c-4cf8-a710-7fff58db294d\" (UID: \"49fea8fc-372c-4cf8-a710-7fff58db294d\") " Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.137624 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-ovsdbserver-sb\") pod \"49fea8fc-372c-4cf8-a710-7fff58db294d\" (UID: \"49fea8fc-372c-4cf8-a710-7fff58db294d\") " Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.137842 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-dns-svc\") pod \"49fea8fc-372c-4cf8-a710-7fff58db294d\" (UID: \"49fea8fc-372c-4cf8-a710-7fff58db294d\") " Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.137885 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-ovsdbserver-nb\") pod \"49fea8fc-372c-4cf8-a710-7fff58db294d\" (UID: \"49fea8fc-372c-4cf8-a710-7fff58db294d\") " Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.138005 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgnbz\" (UniqueName: \"kubernetes.io/projected/49fea8fc-372c-4cf8-a710-7fff58db294d-kube-api-access-zgnbz\") pod \"49fea8fc-372c-4cf8-a710-7fff58db294d\" (UID: \"49fea8fc-372c-4cf8-a710-7fff58db294d\") " Mar 13 14:20:20 
crc kubenswrapper[4898]: I0313 14:20:20.138246 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bzvr\" (UniqueName: \"kubernetes.io/projected/f8a8516c-5aee-4eae-a59b-498f97c1b92b-kube-api-access-2bzvr\") pod \"cinder-db-create-88gdv\" (UID: \"f8a8516c-5aee-4eae-a59b-498f97c1b92b\") " pod="openstack/cinder-db-create-88gdv" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.138404 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8a8516c-5aee-4eae-a59b-498f97c1b92b-operator-scripts\") pod \"cinder-db-create-88gdv\" (UID: \"f8a8516c-5aee-4eae-a59b-498f97c1b92b\") " pod="openstack/cinder-db-create-88gdv" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.217276 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-4e00-account-create-update-92bgz"] Mar 13 14:20:20 crc kubenswrapper[4898]: E0313 14:20:20.217695 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49fea8fc-372c-4cf8-a710-7fff58db294d" containerName="init" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.217707 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="49fea8fc-372c-4cf8-a710-7fff58db294d" containerName="init" Mar 13 14:20:20 crc kubenswrapper[4898]: E0313 14:20:20.217729 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49fea8fc-372c-4cf8-a710-7fff58db294d" containerName="dnsmasq-dns" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.217736 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="49fea8fc-372c-4cf8-a710-7fff58db294d" containerName="dnsmasq-dns" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.217938 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="49fea8fc-372c-4cf8-a710-7fff58db294d" containerName="dnsmasq-dns" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.218613 4898 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4e00-account-create-update-92bgz" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.225552 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.231302 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49fea8fc-372c-4cf8-a710-7fff58db294d-kube-api-access-zgnbz" (OuterVolumeSpecName: "kube-api-access-zgnbz") pod "49fea8fc-372c-4cf8-a710-7fff58db294d" (UID: "49fea8fc-372c-4cf8-a710-7fff58db294d"). InnerVolumeSpecName "kube-api-access-zgnbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.240161 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fe3416e-f08a-43c9-8e12-a89c1e849208-operator-scripts\") pod \"cinder-4e00-account-create-update-92bgz\" (UID: \"4fe3416e-f08a-43c9-8e12-a89c1e849208\") " pod="openstack/cinder-4e00-account-create-update-92bgz" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.240256 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8a8516c-5aee-4eae-a59b-498f97c1b92b-operator-scripts\") pod \"cinder-db-create-88gdv\" (UID: \"f8a8516c-5aee-4eae-a59b-498f97c1b92b\") " pod="openstack/cinder-db-create-88gdv" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.240314 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5xdf\" (UniqueName: \"kubernetes.io/projected/4fe3416e-f08a-43c9-8e12-a89c1e849208-kube-api-access-w5xdf\") pod \"cinder-4e00-account-create-update-92bgz\" (UID: \"4fe3416e-f08a-43c9-8e12-a89c1e849208\") " 
pod="openstack/cinder-4e00-account-create-update-92bgz" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.240345 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bzvr\" (UniqueName: \"kubernetes.io/projected/f8a8516c-5aee-4eae-a59b-498f97c1b92b-kube-api-access-2bzvr\") pod \"cinder-db-create-88gdv\" (UID: \"f8a8516c-5aee-4eae-a59b-498f97c1b92b\") " pod="openstack/cinder-db-create-88gdv" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.240479 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgnbz\" (UniqueName: \"kubernetes.io/projected/49fea8fc-372c-4cf8-a710-7fff58db294d-kube-api-access-zgnbz\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.241393 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8a8516c-5aee-4eae-a59b-498f97c1b92b-operator-scripts\") pod \"cinder-db-create-88gdv\" (UID: \"f8a8516c-5aee-4eae-a59b-498f97c1b92b\") " pod="openstack/cinder-db-create-88gdv" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.281435 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bzvr\" (UniqueName: \"kubernetes.io/projected/f8a8516c-5aee-4eae-a59b-498f97c1b92b-kube-api-access-2bzvr\") pod \"cinder-db-create-88gdv\" (UID: \"f8a8516c-5aee-4eae-a59b-498f97c1b92b\") " pod="openstack/cinder-db-create-88gdv" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.308539 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-4e00-account-create-update-92bgz"] Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.351771 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fe3416e-f08a-43c9-8e12-a89c1e849208-operator-scripts\") pod \"cinder-4e00-account-create-update-92bgz\" (UID: 
\"4fe3416e-f08a-43c9-8e12-a89c1e849208\") " pod="openstack/cinder-4e00-account-create-update-92bgz" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.352044 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5xdf\" (UniqueName: \"kubernetes.io/projected/4fe3416e-f08a-43c9-8e12-a89c1e849208-kube-api-access-w5xdf\") pod \"cinder-4e00-account-create-update-92bgz\" (UID: \"4fe3416e-f08a-43c9-8e12-a89c1e849208\") " pod="openstack/cinder-4e00-account-create-update-92bgz" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.376621 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fe3416e-f08a-43c9-8e12-a89c1e849208-operator-scripts\") pod \"cinder-4e00-account-create-update-92bgz\" (UID: \"4fe3416e-f08a-43c9-8e12-a89c1e849208\") " pod="openstack/cinder-4e00-account-create-update-92bgz" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.382514 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-88gdv" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.404357 4898 generic.go:334] "Generic (PLEG): container finished" podID="49fea8fc-372c-4cf8-a710-7fff58db294d" containerID="8147ab7966f5020767fcc27772ab175b42f8528278c1d7d00f8082b71c3d4288" exitCode=0 Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.404428 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x" event={"ID":"49fea8fc-372c-4cf8-a710-7fff58db294d","Type":"ContainerDied","Data":"8147ab7966f5020767fcc27772ab175b42f8528278c1d7d00f8082b71c3d4288"} Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.404455 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x" event={"ID":"49fea8fc-372c-4cf8-a710-7fff58db294d","Type":"ContainerDied","Data":"bad9604b0b4948880032fa39bf17f3a7090e87a9e7998fc75541062cf197564a"} Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.404471 4898 scope.go:117] "RemoveContainer" containerID="8147ab7966f5020767fcc27772ab175b42f8528278c1d7d00f8082b71c3d4288" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.404602 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b946c75cc-wnv9x" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.431953 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-dxpl9"] Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.434598 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-dxpl9" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.442093 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.442305 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.442459 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-tdc5n" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.445979 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.454788 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd476\" (UniqueName: \"kubernetes.io/projected/8fdab36c-41db-4a9c-9cbe-47e1761c6df5-kube-api-access-bd476\") pod \"keystone-db-sync-dxpl9\" (UID: \"8fdab36c-41db-4a9c-9cbe-47e1761c6df5\") " pod="openstack/keystone-db-sync-dxpl9" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.454853 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fdab36c-41db-4a9c-9cbe-47e1761c6df5-combined-ca-bundle\") pod \"keystone-db-sync-dxpl9\" (UID: \"8fdab36c-41db-4a9c-9cbe-47e1761c6df5\") " pod="openstack/keystone-db-sync-dxpl9" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.454874 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fdab36c-41db-4a9c-9cbe-47e1761c6df5-config-data\") pod \"keystone-db-sync-dxpl9\" (UID: \"8fdab36c-41db-4a9c-9cbe-47e1761c6df5\") " pod="openstack/keystone-db-sync-dxpl9" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.461013 4898 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" event={"ID":"9f1520e0-d7d9-4992-9ca5-1b2e98313d33","Type":"ContainerStarted","Data":"e9c150eab9ccbad529dce767553bfd6f06b4b8420400e4a2a3ec291dc3f4b819"} Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.490781 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-dxpl9"] Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.499478 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5xdf\" (UniqueName: \"kubernetes.io/projected/4fe3416e-f08a-43c9-8e12-a89c1e849208-kube-api-access-w5xdf\") pod \"cinder-4e00-account-create-update-92bgz\" (UID: \"4fe3416e-f08a-43c9-8e12-a89c1e849208\") " pod="openstack/cinder-4e00-account-create-update-92bgz" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.541137 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-275nk"] Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.542469 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-275nk" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.552425 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-275nk"] Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.563572 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-4e00-account-create-update-92bgz" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.569282 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fdab36c-41db-4a9c-9cbe-47e1761c6df5-combined-ca-bundle\") pod \"keystone-db-sync-dxpl9\" (UID: \"8fdab36c-41db-4a9c-9cbe-47e1761c6df5\") " pod="openstack/keystone-db-sync-dxpl9" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.569325 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fdab36c-41db-4a9c-9cbe-47e1761c6df5-config-data\") pod \"keystone-db-sync-dxpl9\" (UID: \"8fdab36c-41db-4a9c-9cbe-47e1761c6df5\") " pod="openstack/keystone-db-sync-dxpl9" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.569600 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd476\" (UniqueName: \"kubernetes.io/projected/8fdab36c-41db-4a9c-9cbe-47e1761c6df5-kube-api-access-bd476\") pod \"keystone-db-sync-dxpl9\" (UID: \"8fdab36c-41db-4a9c-9cbe-47e1761c6df5\") " pod="openstack/keystone-db-sync-dxpl9" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.595955 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-9d98-account-create-update-t77sf"] Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.598730 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-9d98-account-create-update-t77sf" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.602399 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.608267 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fdab36c-41db-4a9c-9cbe-47e1761c6df5-config-data\") pod \"keystone-db-sync-dxpl9\" (UID: \"8fdab36c-41db-4a9c-9cbe-47e1761c6df5\") " pod="openstack/keystone-db-sync-dxpl9" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.612265 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-9d98-account-create-update-t77sf"] Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.620419 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fdab36c-41db-4a9c-9cbe-47e1761c6df5-combined-ca-bundle\") pod \"keystone-db-sync-dxpl9\" (UID: \"8fdab36c-41db-4a9c-9cbe-47e1761c6df5\") " pod="openstack/keystone-db-sync-dxpl9" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.625643 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd476\" (UniqueName: \"kubernetes.io/projected/8fdab36c-41db-4a9c-9cbe-47e1761c6df5-kube-api-access-bd476\") pod \"keystone-db-sync-dxpl9\" (UID: \"8fdab36c-41db-4a9c-9cbe-47e1761c6df5\") " pod="openstack/keystone-db-sync-dxpl9" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.626890 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-95vbj"] Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.628104 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-95vbj" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.640307 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-95vbj"] Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.668091 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-1082-account-create-update-2jjkd"] Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.669940 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1082-account-create-update-2jjkd" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.672270 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.676989 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2wn7\" (UniqueName: \"kubernetes.io/projected/b04d3edd-a550-465a-9ef2-2cbea4126ceb-kube-api-access-p2wn7\") pod \"heat-db-create-275nk\" (UID: \"b04d3edd-a550-465a-9ef2-2cbea4126ceb\") " pod="openstack/heat-db-create-275nk" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.677170 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1aa06f21-2d35-4d03-86b9-01d9354826da-operator-scripts\") pod \"heat-9d98-account-create-update-t77sf\" (UID: \"1aa06f21-2d35-4d03-86b9-01d9354826da\") " pod="openstack/heat-9d98-account-create-update-t77sf" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.677228 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sxnt\" (UniqueName: \"kubernetes.io/projected/1aa06f21-2d35-4d03-86b9-01d9354826da-kube-api-access-9sxnt\") pod \"heat-9d98-account-create-update-t77sf\" (UID: \"1aa06f21-2d35-4d03-86b9-01d9354826da\") " 
pod="openstack/heat-9d98-account-create-update-t77sf" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.677266 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crvxm\" (UniqueName: \"kubernetes.io/projected/b83b860f-ed6c-46b2-862a-fbda9af7dc89-kube-api-access-crvxm\") pod \"neutron-db-create-95vbj\" (UID: \"b83b860f-ed6c-46b2-862a-fbda9af7dc89\") " pod="openstack/neutron-db-create-95vbj" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.677327 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b83b860f-ed6c-46b2-862a-fbda9af7dc89-operator-scripts\") pod \"neutron-db-create-95vbj\" (UID: \"b83b860f-ed6c-46b2-862a-fbda9af7dc89\") " pod="openstack/neutron-db-create-95vbj" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.678216 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b04d3edd-a550-465a-9ef2-2cbea4126ceb-operator-scripts\") pod \"heat-db-create-275nk\" (UID: \"b04d3edd-a550-465a-9ef2-2cbea4126ceb\") " pod="openstack/heat-db-create-275nk" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.682008 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1082-account-create-update-2jjkd"] Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.722697 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-23f7-account-create-update-z479t"] Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.724611 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-23f7-account-create-update-z479t" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.730553 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.761020 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "49fea8fc-372c-4cf8-a710-7fff58db294d" (UID: "49fea8fc-372c-4cf8-a710-7fff58db294d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.775345 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-23f7-account-create-update-z479t"] Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.780313 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg2zd\" (UniqueName: \"kubernetes.io/projected/f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d-kube-api-access-xg2zd\") pod \"barbican-23f7-account-create-update-z479t\" (UID: \"f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d\") " pod="openstack/barbican-23f7-account-create-update-z479t" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.780409 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2wn7\" (UniqueName: \"kubernetes.io/projected/b04d3edd-a550-465a-9ef2-2cbea4126ceb-kube-api-access-p2wn7\") pod \"heat-db-create-275nk\" (UID: \"b04d3edd-a550-465a-9ef2-2cbea4126ceb\") " pod="openstack/heat-db-create-275nk" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.780522 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d-operator-scripts\") pod 
\"barbican-23f7-account-create-update-z479t\" (UID: \"f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d\") " pod="openstack/barbican-23f7-account-create-update-z479t" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.780560 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1aa06f21-2d35-4d03-86b9-01d9354826da-operator-scripts\") pod \"heat-9d98-account-create-update-t77sf\" (UID: \"1aa06f21-2d35-4d03-86b9-01d9354826da\") " pod="openstack/heat-9d98-account-create-update-t77sf" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.780602 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sxnt\" (UniqueName: \"kubernetes.io/projected/1aa06f21-2d35-4d03-86b9-01d9354826da-kube-api-access-9sxnt\") pod \"heat-9d98-account-create-update-t77sf\" (UID: \"1aa06f21-2d35-4d03-86b9-01d9354826da\") " pod="openstack/heat-9d98-account-create-update-t77sf" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.780817 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crvxm\" (UniqueName: \"kubernetes.io/projected/b83b860f-ed6c-46b2-862a-fbda9af7dc89-kube-api-access-crvxm\") pod \"neutron-db-create-95vbj\" (UID: \"b83b860f-ed6c-46b2-862a-fbda9af7dc89\") " pod="openstack/neutron-db-create-95vbj" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.780867 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lc97\" (UniqueName: \"kubernetes.io/projected/71459d1c-2acb-4e15-a30d-09dd0f7f7951-kube-api-access-4lc97\") pod \"neutron-1082-account-create-update-2jjkd\" (UID: \"71459d1c-2acb-4e15-a30d-09dd0f7f7951\") " pod="openstack/neutron-1082-account-create-update-2jjkd" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.781125 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/b83b860f-ed6c-46b2-862a-fbda9af7dc89-operator-scripts\") pod \"neutron-db-create-95vbj\" (UID: \"b83b860f-ed6c-46b2-862a-fbda9af7dc89\") " pod="openstack/neutron-db-create-95vbj" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.781233 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71459d1c-2acb-4e15-a30d-09dd0f7f7951-operator-scripts\") pod \"neutron-1082-account-create-update-2jjkd\" (UID: \"71459d1c-2acb-4e15-a30d-09dd0f7f7951\") " pod="openstack/neutron-1082-account-create-update-2jjkd" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.781400 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b04d3edd-a550-465a-9ef2-2cbea4126ceb-operator-scripts\") pod \"heat-db-create-275nk\" (UID: \"b04d3edd-a550-465a-9ef2-2cbea4126ceb\") " pod="openstack/heat-db-create-275nk" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.781672 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.783466 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b83b860f-ed6c-46b2-862a-fbda9af7dc89-operator-scripts\") pod \"neutron-db-create-95vbj\" (UID: \"b83b860f-ed6c-46b2-862a-fbda9af7dc89\") " pod="openstack/neutron-db-create-95vbj" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.784376 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1aa06f21-2d35-4d03-86b9-01d9354826da-operator-scripts\") pod \"heat-9d98-account-create-update-t77sf\" (UID: 
\"1aa06f21-2d35-4d03-86b9-01d9354826da\") " pod="openstack/heat-9d98-account-create-update-t77sf" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.796199 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b04d3edd-a550-465a-9ef2-2cbea4126ceb-operator-scripts\") pod \"heat-db-create-275nk\" (UID: \"b04d3edd-a550-465a-9ef2-2cbea4126ceb\") " pod="openstack/heat-db-create-275nk" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.811335 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-452kj"] Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.813084 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-452kj" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.819795 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crvxm\" (UniqueName: \"kubernetes.io/projected/b83b860f-ed6c-46b2-862a-fbda9af7dc89-kube-api-access-crvxm\") pod \"neutron-db-create-95vbj\" (UID: \"b83b860f-ed6c-46b2-862a-fbda9af7dc89\") " pod="openstack/neutron-db-create-95vbj" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.819989 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sxnt\" (UniqueName: \"kubernetes.io/projected/1aa06f21-2d35-4d03-86b9-01d9354826da-kube-api-access-9sxnt\") pod \"heat-9d98-account-create-update-t77sf\" (UID: \"1aa06f21-2d35-4d03-86b9-01d9354826da\") " pod="openstack/heat-9d98-account-create-update-t77sf" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.821557 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2wn7\" (UniqueName: \"kubernetes.io/projected/b04d3edd-a550-465a-9ef2-2cbea4126ceb-kube-api-access-p2wn7\") pod \"heat-db-create-275nk\" (UID: \"b04d3edd-a550-465a-9ef2-2cbea4126ceb\") " pod="openstack/heat-db-create-275nk" Mar 13 14:20:20 
crc kubenswrapper[4898]: I0313 14:20:20.833026 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-452kj"] Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.846831 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-537e992e-0c7e-4e28-8105-b535a72a793c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-537e992e-0c7e-4e28-8105-b535a72a793c\") pod \"prometheus-metric-storage-0\" (UID: \"d555bd54-f4d5-4b06-9517-32b4fe687f4b\") " pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.861753 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-config" (OuterVolumeSpecName: "config") pod "49fea8fc-372c-4cf8-a710-7fff58db294d" (UID: "49fea8fc-372c-4cf8-a710-7fff58db294d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.870211 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "49fea8fc-372c-4cf8-a710-7fff58db294d" (UID: "49fea8fc-372c-4cf8-a710-7fff58db294d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.886227 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d-operator-scripts\") pod \"barbican-23f7-account-create-update-z479t\" (UID: \"f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d\") " pod="openstack/barbican-23f7-account-create-update-z479t" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.886290 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkkr5\" (UniqueName: \"kubernetes.io/projected/bea88065-1eff-42e2-809a-443c15bda0ac-kube-api-access-xkkr5\") pod \"barbican-db-create-452kj\" (UID: \"bea88065-1eff-42e2-809a-443c15bda0ac\") " pod="openstack/barbican-db-create-452kj" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.886364 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lc97\" (UniqueName: \"kubernetes.io/projected/71459d1c-2acb-4e15-a30d-09dd0f7f7951-kube-api-access-4lc97\") pod \"neutron-1082-account-create-update-2jjkd\" (UID: \"71459d1c-2acb-4e15-a30d-09dd0f7f7951\") " pod="openstack/neutron-1082-account-create-update-2jjkd" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.886423 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71459d1c-2acb-4e15-a30d-09dd0f7f7951-operator-scripts\") pod \"neutron-1082-account-create-update-2jjkd\" (UID: \"71459d1c-2acb-4e15-a30d-09dd0f7f7951\") " pod="openstack/neutron-1082-account-create-update-2jjkd" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.886456 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/bea88065-1eff-42e2-809a-443c15bda0ac-operator-scripts\") pod \"barbican-db-create-452kj\" (UID: \"bea88065-1eff-42e2-809a-443c15bda0ac\") " pod="openstack/barbican-db-create-452kj" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.886503 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg2zd\" (UniqueName: \"kubernetes.io/projected/f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d-kube-api-access-xg2zd\") pod \"barbican-23f7-account-create-update-z479t\" (UID: \"f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d\") " pod="openstack/barbican-23f7-account-create-update-z479t" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.886635 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.886654 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.887088 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d-operator-scripts\") pod \"barbican-23f7-account-create-update-z479t\" (UID: \"f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d\") " pod="openstack/barbican-23f7-account-create-update-z479t" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.887713 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71459d1c-2acb-4e15-a30d-09dd0f7f7951-operator-scripts\") pod \"neutron-1082-account-create-update-2jjkd\" (UID: \"71459d1c-2acb-4e15-a30d-09dd0f7f7951\") " pod="openstack/neutron-1082-account-create-update-2jjkd" Mar 13 14:20:20 crc 
kubenswrapper[4898]: I0313 14:20:20.898652 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "49fea8fc-372c-4cf8-a710-7fff58db294d" (UID: "49fea8fc-372c-4cf8-a710-7fff58db294d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.940775 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lc97\" (UniqueName: \"kubernetes.io/projected/71459d1c-2acb-4e15-a30d-09dd0f7f7951-kube-api-access-4lc97\") pod \"neutron-1082-account-create-update-2jjkd\" (UID: \"71459d1c-2acb-4e15-a30d-09dd0f7f7951\") " pod="openstack/neutron-1082-account-create-update-2jjkd" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.957821 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg2zd\" (UniqueName: \"kubernetes.io/projected/f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d-kube-api-access-xg2zd\") pod \"barbican-23f7-account-create-update-z479t\" (UID: \"f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d\") " pod="openstack/barbican-23f7-account-create-update-z479t" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.988143 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bea88065-1eff-42e2-809a-443c15bda0ac-operator-scripts\") pod \"barbican-db-create-452kj\" (UID: \"bea88065-1eff-42e2-809a-443c15bda0ac\") " pod="openstack/barbican-db-create-452kj" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.988706 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkkr5\" (UniqueName: \"kubernetes.io/projected/bea88065-1eff-42e2-809a-443c15bda0ac-kube-api-access-xkkr5\") pod \"barbican-db-create-452kj\" (UID: \"bea88065-1eff-42e2-809a-443c15bda0ac\") 
" pod="openstack/barbican-db-create-452kj" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.988934 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49fea8fc-372c-4cf8-a710-7fff58db294d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:20 crc kubenswrapper[4898]: I0313 14:20:20.989033 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bea88065-1eff-42e2-809a-443c15bda0ac-operator-scripts\") pod \"barbican-db-create-452kj\" (UID: \"bea88065-1eff-42e2-809a-443c15bda0ac\") " pod="openstack/barbican-db-create-452kj" Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.001852 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.008349 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkkr5\" (UniqueName: \"kubernetes.io/projected/bea88065-1eff-42e2-809a-443c15bda0ac-kube-api-access-xkkr5\") pod \"barbican-db-create-452kj\" (UID: \"bea88065-1eff-42e2-809a-443c15bda0ac\") " pod="openstack/barbican-db-create-452kj" Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.024155 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-4e00-account-create-update-92bgz"] Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.122448 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-dxpl9" Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.123145 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-wnv9x"] Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.132284 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b946c75cc-wnv9x"] Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.143259 4898 scope.go:117] "RemoveContainer" containerID="f6416d1e5f0d5a80d9a232b6c88402bfa85e5da16423a5a28fb3cfb07228cafb" Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.151997 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-275nk" Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.181690 4898 scope.go:117] "RemoveContainer" containerID="8147ab7966f5020767fcc27772ab175b42f8528278c1d7d00f8082b71c3d4288" Mar 13 14:20:21 crc kubenswrapper[4898]: E0313 14:20:21.185701 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8147ab7966f5020767fcc27772ab175b42f8528278c1d7d00f8082b71c3d4288\": container with ID starting with 8147ab7966f5020767fcc27772ab175b42f8528278c1d7d00f8082b71c3d4288 not found: ID does not exist" containerID="8147ab7966f5020767fcc27772ab175b42f8528278c1d7d00f8082b71c3d4288" Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.185739 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8147ab7966f5020767fcc27772ab175b42f8528278c1d7d00f8082b71c3d4288"} err="failed to get container status \"8147ab7966f5020767fcc27772ab175b42f8528278c1d7d00f8082b71c3d4288\": rpc error: code = NotFound desc = could not find container \"8147ab7966f5020767fcc27772ab175b42f8528278c1d7d00f8082b71c3d4288\": container with ID starting with 8147ab7966f5020767fcc27772ab175b42f8528278c1d7d00f8082b71c3d4288 not found: ID does not exist" Mar 13 14:20:21 
crc kubenswrapper[4898]: I0313 14:20:21.185761 4898 scope.go:117] "RemoveContainer" containerID="f6416d1e5f0d5a80d9a232b6c88402bfa85e5da16423a5a28fb3cfb07228cafb" Mar 13 14:20:21 crc kubenswrapper[4898]: E0313 14:20:21.186949 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6416d1e5f0d5a80d9a232b6c88402bfa85e5da16423a5a28fb3cfb07228cafb\": container with ID starting with f6416d1e5f0d5a80d9a232b6c88402bfa85e5da16423a5a28fb3cfb07228cafb not found: ID does not exist" containerID="f6416d1e5f0d5a80d9a232b6c88402bfa85e5da16423a5a28fb3cfb07228cafb" Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.187008 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6416d1e5f0d5a80d9a232b6c88402bfa85e5da16423a5a28fb3cfb07228cafb"} err="failed to get container status \"f6416d1e5f0d5a80d9a232b6c88402bfa85e5da16423a5a28fb3cfb07228cafb\": rpc error: code = NotFound desc = could not find container \"f6416d1e5f0d5a80d9a232b6c88402bfa85e5da16423a5a28fb3cfb07228cafb\": container with ID starting with f6416d1e5f0d5a80d9a232b6c88402bfa85e5da16423a5a28fb3cfb07228cafb not found: ID does not exist" Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.196977 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-9d98-account-create-update-t77sf" Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.199144 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-95vbj" Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.219345 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1082-account-create-update-2jjkd" Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.264265 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-23f7-account-create-update-z479t" Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.282175 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-452kj" Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.309842 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-88gdv"] Mar 13 14:20:21 crc kubenswrapper[4898]: W0313 14:20:21.325047 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8a8516c_5aee_4eae_a59b_498f97c1b92b.slice/crio-0f13e48b7821712cb1bba68d5ed2ca19eb5fbddf9b7a29ac27b157d895fe2e3c WatchSource:0}: Error finding container 0f13e48b7821712cb1bba68d5ed2ca19eb5fbddf9b7a29ac27b157d895fe2e3c: Status 404 returned error can't find the container with id 0f13e48b7821712cb1bba68d5ed2ca19eb5fbddf9b7a29ac27b157d895fe2e3c Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.520498 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-88gdv" event={"ID":"f8a8516c-5aee-4eae-a59b-498f97c1b92b","Type":"ContainerStarted","Data":"0f13e48b7821712cb1bba68d5ed2ca19eb5fbddf9b7a29ac27b157d895fe2e3c"} Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.530938 4898 generic.go:334] "Generic (PLEG): container finished" podID="9f1520e0-d7d9-4992-9ca5-1b2e98313d33" containerID="d7d9b7775bd2555a7c4636350359292fb8c65661ac123c18de4ec1ec3c6ad5d5" exitCode=0 Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.531030 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" event={"ID":"9f1520e0-d7d9-4992-9ca5-1b2e98313d33","Type":"ContainerDied","Data":"d7d9b7775bd2555a7c4636350359292fb8c65661ac123c18de4ec1ec3c6ad5d5"} Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.542932 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-4e00-account-create-update-92bgz" event={"ID":"4fe3416e-f08a-43c9-8e12-a89c1e849208","Type":"ContainerStarted","Data":"c12b1a7614a831f1604c6feaf4bf42c52496fffc128594a26f7d1a8c71f58636"} Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.542970 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4e00-account-create-update-92bgz" event={"ID":"4fe3416e-f08a-43c9-8e12-a89c1e849208","Type":"ContainerStarted","Data":"bb908e3bf15681fcffb32fcf09fd96e847d58d3dc20f8731e8a57f7767257538"} Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.624266 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-4e00-account-create-update-92bgz" podStartSLOduration=2.624246574 podStartE2EDuration="2.624246574s" podCreationTimestamp="2026-03-13 14:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:20:21.602344465 +0000 UTC m=+1456.603932724" watchObservedRunningTime="2026-03-13 14:20:21.624246574 +0000 UTC m=+1456.625834813" Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.704466 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.769055 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49fea8fc-372c-4cf8-a710-7fff58db294d" path="/var/lib/kubelet/pods/49fea8fc-372c-4cf8-a710-7fff58db294d/volumes" Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.811012 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-dxpl9"] Mar 13 14:20:21 crc kubenswrapper[4898]: I0313 14:20:21.929486 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-275nk"] Mar 13 14:20:22 crc kubenswrapper[4898]: I0313 14:20:22.118308 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/heat-9d98-account-create-update-t77sf"] Mar 13 14:20:22 crc kubenswrapper[4898]: E0313 14:20:22.346583 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fe3416e_f08a_43c9_8e12_a89c1e849208.slice/crio-c12b1a7614a831f1604c6feaf4bf42c52496fffc128594a26f7d1a8c71f58636.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fe3416e_f08a_43c9_8e12_a89c1e849208.slice/crio-conmon-c12b1a7614a831f1604c6feaf4bf42c52496fffc128594a26f7d1a8c71f58636.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8a8516c_5aee_4eae_a59b_498f97c1b92b.slice/crio-conmon-4e72b0c7dc05dd72f43622aacb47475aacbf01f3e30ece85d27e66c47011712e.scope\": RecentStats: unable to find data in memory cache]" Mar 13 14:20:22 crc kubenswrapper[4898]: W0313 14:20:22.380655 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbea88065_1eff_42e2_809a_443c15bda0ac.slice/crio-c2d35d9d1f6a33fbdc7f1c2681ee05eeaa35e95f2c8bc22309c279e2bc914c17 WatchSource:0}: Error finding container c2d35d9d1f6a33fbdc7f1c2681ee05eeaa35e95f2c8bc22309c279e2bc914c17: Status 404 returned error can't find the container with id c2d35d9d1f6a33fbdc7f1c2681ee05eeaa35e95f2c8bc22309c279e2bc914c17 Mar 13 14:20:22 crc kubenswrapper[4898]: I0313 14:20:22.394977 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-452kj"] Mar 13 14:20:22 crc kubenswrapper[4898]: I0313 14:20:22.426085 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-95vbj"] Mar 13 14:20:22 crc kubenswrapper[4898]: I0313 14:20:22.548724 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-23f7-account-create-update-z479t"] Mar 13 14:20:22 crc kubenswrapper[4898]: I0313 14:20:22.567611 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1082-account-create-update-2jjkd"] Mar 13 14:20:22 crc kubenswrapper[4898]: I0313 14:20:22.570760 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-95vbj" event={"ID":"b83b860f-ed6c-46b2-862a-fbda9af7dc89","Type":"ContainerStarted","Data":"a7fdcb25a0c057c7c997cd177fd4771ba973fb7faff5931de0e11cf08d037dc6"} Mar 13 14:20:22 crc kubenswrapper[4898]: I0313 14:20:22.579562 4898 generic.go:334] "Generic (PLEG): container finished" podID="b04d3edd-a550-465a-9ef2-2cbea4126ceb" containerID="860f247abf99986a680fa9cbd71b3ddb7e0e1a4bc671f3a1ca2277312ff69005" exitCode=0 Mar 13 14:20:22 crc kubenswrapper[4898]: I0313 14:20:22.579643 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-275nk" event={"ID":"b04d3edd-a550-465a-9ef2-2cbea4126ceb","Type":"ContainerDied","Data":"860f247abf99986a680fa9cbd71b3ddb7e0e1a4bc671f3a1ca2277312ff69005"} Mar 13 14:20:22 crc kubenswrapper[4898]: I0313 14:20:22.579669 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-275nk" event={"ID":"b04d3edd-a550-465a-9ef2-2cbea4126ceb","Type":"ContainerStarted","Data":"95bbd9bdb5e436c19160a59956edc16f6c2aa59b93b3dae88e1b3df084a217fb"} Mar 13 14:20:22 crc kubenswrapper[4898]: I0313 14:20:22.587727 4898 generic.go:334] "Generic (PLEG): container finished" podID="f8a8516c-5aee-4eae-a59b-498f97c1b92b" containerID="4e72b0c7dc05dd72f43622aacb47475aacbf01f3e30ece85d27e66c47011712e" exitCode=0 Mar 13 14:20:22 crc kubenswrapper[4898]: I0313 14:20:22.587828 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-88gdv" event={"ID":"f8a8516c-5aee-4eae-a59b-498f97c1b92b","Type":"ContainerDied","Data":"4e72b0c7dc05dd72f43622aacb47475aacbf01f3e30ece85d27e66c47011712e"} Mar 13 14:20:22 crc 
kubenswrapper[4898]: I0313 14:20:22.604633 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-9d98-account-create-update-t77sf" event={"ID":"1aa06f21-2d35-4d03-86b9-01d9354826da","Type":"ContainerStarted","Data":"5f41cf46e25c4bfb4c3cb12738ce8eaa95c275888249cc79416494629ec3b64b"} Mar 13 14:20:22 crc kubenswrapper[4898]: I0313 14:20:22.604702 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-9d98-account-create-update-t77sf" event={"ID":"1aa06f21-2d35-4d03-86b9-01d9354826da","Type":"ContainerStarted","Data":"32ad0f4bc955a1c213252ea528d35e7b749f8d4ec4d59c71217cfbe39f2386ae"} Mar 13 14:20:22 crc kubenswrapper[4898]: I0313 14:20:22.607716 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" event={"ID":"9f1520e0-d7d9-4992-9ca5-1b2e98313d33","Type":"ContainerStarted","Data":"ce178a8ab6289ecc30ada8030035df3e75b119b24ae01060c42a05e394597ae3"} Mar 13 14:20:22 crc kubenswrapper[4898]: I0313 14:20:22.607881 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" Mar 13 14:20:22 crc kubenswrapper[4898]: I0313 14:20:22.612491 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dxpl9" event={"ID":"8fdab36c-41db-4a9c-9cbe-47e1761c6df5","Type":"ContainerStarted","Data":"4a05de9aa49849a88ec18bd7a217b2e7749103f27e3a95daf9185dac7265effb"} Mar 13 14:20:22 crc kubenswrapper[4898]: I0313 14:20:22.619748 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d555bd54-f4d5-4b06-9517-32b4fe687f4b","Type":"ContainerStarted","Data":"394a9c444dfa8cc8c8fe8e8d08721ac7bdec485308f43f3ae940f284836aa510"} Mar 13 14:20:22 crc kubenswrapper[4898]: I0313 14:20:22.623132 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-452kj" 
event={"ID":"bea88065-1eff-42e2-809a-443c15bda0ac","Type":"ContainerStarted","Data":"c2d35d9d1f6a33fbdc7f1c2681ee05eeaa35e95f2c8bc22309c279e2bc914c17"} Mar 13 14:20:22 crc kubenswrapper[4898]: I0313 14:20:22.625997 4898 generic.go:334] "Generic (PLEG): container finished" podID="4fe3416e-f08a-43c9-8e12-a89c1e849208" containerID="c12b1a7614a831f1604c6feaf4bf42c52496fffc128594a26f7d1a8c71f58636" exitCode=0 Mar 13 14:20:22 crc kubenswrapper[4898]: I0313 14:20:22.626044 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4e00-account-create-update-92bgz" event={"ID":"4fe3416e-f08a-43c9-8e12-a89c1e849208","Type":"ContainerDied","Data":"c12b1a7614a831f1604c6feaf4bf42c52496fffc128594a26f7d1a8c71f58636"} Mar 13 14:20:22 crc kubenswrapper[4898]: I0313 14:20:22.654397 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" podStartSLOduration=4.654382405 podStartE2EDuration="4.654382405s" podCreationTimestamp="2026-03-13 14:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:20:22.647598169 +0000 UTC m=+1457.649186418" watchObservedRunningTime="2026-03-13 14:20:22.654382405 +0000 UTC m=+1457.655970644" Mar 13 14:20:22 crc kubenswrapper[4898]: I0313 14:20:22.685645 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-9d98-account-create-update-t77sf" podStartSLOduration=2.685625186 podStartE2EDuration="2.685625186s" podCreationTimestamp="2026-03-13 14:20:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:20:22.665951015 +0000 UTC m=+1457.667539264" watchObservedRunningTime="2026-03-13 14:20:22.685625186 +0000 UTC m=+1457.687213425" Mar 13 14:20:23 crc kubenswrapper[4898]: I0313 14:20:23.655332 4898 generic.go:334] "Generic (PLEG): container 
finished" podID="b83b860f-ed6c-46b2-862a-fbda9af7dc89" containerID="3c9e54e03326bbf4751dd95fa7e9b1825e9af82b1eefdf09759081304dd57de9" exitCode=0 Mar 13 14:20:23 crc kubenswrapper[4898]: I0313 14:20:23.655426 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-95vbj" event={"ID":"b83b860f-ed6c-46b2-862a-fbda9af7dc89","Type":"ContainerDied","Data":"3c9e54e03326bbf4751dd95fa7e9b1825e9af82b1eefdf09759081304dd57de9"} Mar 13 14:20:23 crc kubenswrapper[4898]: I0313 14:20:23.657687 4898 generic.go:334] "Generic (PLEG): container finished" podID="bea88065-1eff-42e2-809a-443c15bda0ac" containerID="9d4412724b9aeab2fb38e3b120d6b80b5959d7a8a33631247f92a023f5b56a70" exitCode=0 Mar 13 14:20:23 crc kubenswrapper[4898]: I0313 14:20:23.657735 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-452kj" event={"ID":"bea88065-1eff-42e2-809a-443c15bda0ac","Type":"ContainerDied","Data":"9d4412724b9aeab2fb38e3b120d6b80b5959d7a8a33631247f92a023f5b56a70"} Mar 13 14:20:23 crc kubenswrapper[4898]: I0313 14:20:23.662851 4898 generic.go:334] "Generic (PLEG): container finished" podID="71459d1c-2acb-4e15-a30d-09dd0f7f7951" containerID="7890927e1d3da3f2b5ae266b74631a44cf3eea829b7cc9b79f5ffe9476b7f6a0" exitCode=0 Mar 13 14:20:23 crc kubenswrapper[4898]: I0313 14:20:23.662945 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1082-account-create-update-2jjkd" event={"ID":"71459d1c-2acb-4e15-a30d-09dd0f7f7951","Type":"ContainerDied","Data":"7890927e1d3da3f2b5ae266b74631a44cf3eea829b7cc9b79f5ffe9476b7f6a0"} Mar 13 14:20:23 crc kubenswrapper[4898]: I0313 14:20:23.662971 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1082-account-create-update-2jjkd" event={"ID":"71459d1c-2acb-4e15-a30d-09dd0f7f7951","Type":"ContainerStarted","Data":"fb97c5d92d819436e097d301a10d6e1823950e95217ea57b4fa7ae57304a395a"} Mar 13 14:20:23 crc kubenswrapper[4898]: I0313 14:20:23.664357 4898 
generic.go:334] "Generic (PLEG): container finished" podID="1aa06f21-2d35-4d03-86b9-01d9354826da" containerID="5f41cf46e25c4bfb4c3cb12738ce8eaa95c275888249cc79416494629ec3b64b" exitCode=0 Mar 13 14:20:23 crc kubenswrapper[4898]: I0313 14:20:23.664398 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-9d98-account-create-update-t77sf" event={"ID":"1aa06f21-2d35-4d03-86b9-01d9354826da","Type":"ContainerDied","Data":"5f41cf46e25c4bfb4c3cb12738ce8eaa95c275888249cc79416494629ec3b64b"} Mar 13 14:20:23 crc kubenswrapper[4898]: I0313 14:20:23.665977 4898 generic.go:334] "Generic (PLEG): container finished" podID="f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d" containerID="89083a9a998b87f99f34c5645b2e669a886f2f7b20e68795829134c5acb24ac6" exitCode=0 Mar 13 14:20:23 crc kubenswrapper[4898]: I0313 14:20:23.666020 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-23f7-account-create-update-z479t" event={"ID":"f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d","Type":"ContainerDied","Data":"89083a9a998b87f99f34c5645b2e669a886f2f7b20e68795829134c5acb24ac6"} Mar 13 14:20:23 crc kubenswrapper[4898]: I0313 14:20:23.666035 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-23f7-account-create-update-z479t" event={"ID":"f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d","Type":"ContainerStarted","Data":"49bd09daaa7c514816a284920d5932365e68f9d24f10691ae4161b7348f99fc9"} Mar 13 14:20:23 crc kubenswrapper[4898]: I0313 14:20:23.883068 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-pxtss"] Mar 13 14:20:23 crc kubenswrapper[4898]: I0313 14:20:23.884942 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-pxtss" Mar 13 14:20:23 crc kubenswrapper[4898]: I0313 14:20:23.886830 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 13 14:20:23 crc kubenswrapper[4898]: I0313 14:20:23.891256 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-pxtss"] Mar 13 14:20:23 crc kubenswrapper[4898]: I0313 14:20:23.990652 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4tzj\" (UniqueName: \"kubernetes.io/projected/74eb351d-364c-4564-8f8b-67ac844a6abc-kube-api-access-f4tzj\") pod \"root-account-create-update-pxtss\" (UID: \"74eb351d-364c-4564-8f8b-67ac844a6abc\") " pod="openstack/root-account-create-update-pxtss" Mar 13 14:20:23 crc kubenswrapper[4898]: I0313 14:20:23.991176 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74eb351d-364c-4564-8f8b-67ac844a6abc-operator-scripts\") pod \"root-account-create-update-pxtss\" (UID: \"74eb351d-364c-4564-8f8b-67ac844a6abc\") " pod="openstack/root-account-create-update-pxtss" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.093371 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74eb351d-364c-4564-8f8b-67ac844a6abc-operator-scripts\") pod \"root-account-create-update-pxtss\" (UID: \"74eb351d-364c-4564-8f8b-67ac844a6abc\") " pod="openstack/root-account-create-update-pxtss" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.093465 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4tzj\" (UniqueName: \"kubernetes.io/projected/74eb351d-364c-4564-8f8b-67ac844a6abc-kube-api-access-f4tzj\") pod \"root-account-create-update-pxtss\" (UID: 
\"74eb351d-364c-4564-8f8b-67ac844a6abc\") " pod="openstack/root-account-create-update-pxtss" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.093996 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74eb351d-364c-4564-8f8b-67ac844a6abc-operator-scripts\") pod \"root-account-create-update-pxtss\" (UID: \"74eb351d-364c-4564-8f8b-67ac844a6abc\") " pod="openstack/root-account-create-update-pxtss" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.238428 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4tzj\" (UniqueName: \"kubernetes.io/projected/74eb351d-364c-4564-8f8b-67ac844a6abc-kube-api-access-f4tzj\") pod \"root-account-create-update-pxtss\" (UID: \"74eb351d-364c-4564-8f8b-67ac844a6abc\") " pod="openstack/root-account-create-update-pxtss" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.368010 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-88gdv" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.374681 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-4e00-account-create-update-92bgz" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.393039 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-275nk" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.500613 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5xdf\" (UniqueName: \"kubernetes.io/projected/4fe3416e-f08a-43c9-8e12-a89c1e849208-kube-api-access-w5xdf\") pod \"4fe3416e-f08a-43c9-8e12-a89c1e849208\" (UID: \"4fe3416e-f08a-43c9-8e12-a89c1e849208\") " Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.500702 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b04d3edd-a550-465a-9ef2-2cbea4126ceb-operator-scripts\") pod \"b04d3edd-a550-465a-9ef2-2cbea4126ceb\" (UID: \"b04d3edd-a550-465a-9ef2-2cbea4126ceb\") " Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.500777 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fe3416e-f08a-43c9-8e12-a89c1e849208-operator-scripts\") pod \"4fe3416e-f08a-43c9-8e12-a89c1e849208\" (UID: \"4fe3416e-f08a-43c9-8e12-a89c1e849208\") " Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.500816 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2wn7\" (UniqueName: \"kubernetes.io/projected/b04d3edd-a550-465a-9ef2-2cbea4126ceb-kube-api-access-p2wn7\") pod \"b04d3edd-a550-465a-9ef2-2cbea4126ceb\" (UID: \"b04d3edd-a550-465a-9ef2-2cbea4126ceb\") " Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.500867 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8a8516c-5aee-4eae-a59b-498f97c1b92b-operator-scripts\") pod \"f8a8516c-5aee-4eae-a59b-498f97c1b92b\" (UID: \"f8a8516c-5aee-4eae-a59b-498f97c1b92b\") " Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.500923 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-2bzvr\" (UniqueName: \"kubernetes.io/projected/f8a8516c-5aee-4eae-a59b-498f97c1b92b-kube-api-access-2bzvr\") pod \"f8a8516c-5aee-4eae-a59b-498f97c1b92b\" (UID: \"f8a8516c-5aee-4eae-a59b-498f97c1b92b\") " Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.501488 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fe3416e-f08a-43c9-8e12-a89c1e849208-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4fe3416e-f08a-43c9-8e12-a89c1e849208" (UID: "4fe3416e-f08a-43c9-8e12-a89c1e849208"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.501485 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b04d3edd-a550-465a-9ef2-2cbea4126ceb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b04d3edd-a550-465a-9ef2-2cbea4126ceb" (UID: "b04d3edd-a550-465a-9ef2-2cbea4126ceb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.501557 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8a8516c-5aee-4eae-a59b-498f97c1b92b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f8a8516c-5aee-4eae-a59b-498f97c1b92b" (UID: "f8a8516c-5aee-4eae-a59b-498f97c1b92b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.508288 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b04d3edd-a550-465a-9ef2-2cbea4126ceb-kube-api-access-p2wn7" (OuterVolumeSpecName: "kube-api-access-p2wn7") pod "b04d3edd-a550-465a-9ef2-2cbea4126ceb" (UID: "b04d3edd-a550-465a-9ef2-2cbea4126ceb"). InnerVolumeSpecName "kube-api-access-p2wn7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.508644 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-pxtss" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.508782 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8a8516c-5aee-4eae-a59b-498f97c1b92b-kube-api-access-2bzvr" (OuterVolumeSpecName: "kube-api-access-2bzvr") pod "f8a8516c-5aee-4eae-a59b-498f97c1b92b" (UID: "f8a8516c-5aee-4eae-a59b-498f97c1b92b"). InnerVolumeSpecName "kube-api-access-2bzvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.532053 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fe3416e-f08a-43c9-8e12-a89c1e849208-kube-api-access-w5xdf" (OuterVolumeSpecName: "kube-api-access-w5xdf") pod "4fe3416e-f08a-43c9-8e12-a89c1e849208" (UID: "4fe3416e-f08a-43c9-8e12-a89c1e849208"). InnerVolumeSpecName "kube-api-access-w5xdf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.603158 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b04d3edd-a550-465a-9ef2-2cbea4126ceb-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.603205 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4fe3416e-f08a-43c9-8e12-a89c1e849208-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.603219 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2wn7\" (UniqueName: \"kubernetes.io/projected/b04d3edd-a550-465a-9ef2-2cbea4126ceb-kube-api-access-p2wn7\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.603238 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8a8516c-5aee-4eae-a59b-498f97c1b92b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.603252 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bzvr\" (UniqueName: \"kubernetes.io/projected/f8a8516c-5aee-4eae-a59b-498f97c1b92b-kube-api-access-2bzvr\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.603265 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5xdf\" (UniqueName: \"kubernetes.io/projected/4fe3416e-f08a-43c9-8e12-a89c1e849208-kube-api-access-w5xdf\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.679584 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-4e00-account-create-update-92bgz" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.679535 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-4e00-account-create-update-92bgz" event={"ID":"4fe3416e-f08a-43c9-8e12-a89c1e849208","Type":"ContainerDied","Data":"bb908e3bf15681fcffb32fcf09fd96e847d58d3dc20f8731e8a57f7767257538"} Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.679715 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb908e3bf15681fcffb32fcf09fd96e847d58d3dc20f8731e8a57f7767257538" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.682329 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-88gdv" event={"ID":"f8a8516c-5aee-4eae-a59b-498f97c1b92b","Type":"ContainerDied","Data":"0f13e48b7821712cb1bba68d5ed2ca19eb5fbddf9b7a29ac27b157d895fe2e3c"} Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.682358 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f13e48b7821712cb1bba68d5ed2ca19eb5fbddf9b7a29ac27b157d895fe2e3c" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.682398 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-88gdv" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.694983 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d555bd54-f4d5-4b06-9517-32b4fe687f4b","Type":"ContainerStarted","Data":"3c90dc434ae18a1a3a436a05040e009b821f261e42f9834113cbf08fabe109e3"} Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.700435 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-275nk" event={"ID":"b04d3edd-a550-465a-9ef2-2cbea4126ceb","Type":"ContainerDied","Data":"95bbd9bdb5e436c19160a59956edc16f6c2aa59b93b3dae88e1b3df084a217fb"} Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.700481 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95bbd9bdb5e436c19160a59956edc16f6c2aa59b93b3dae88e1b3df084a217fb" Mar 13 14:20:24 crc kubenswrapper[4898]: I0313 14:20:24.700503 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-275nk" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.659449 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-9d98-account-create-update-t77sf" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.664356 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-23f7-account-create-update-z479t" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.680361 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-452kj" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.708085 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1082-account-create-update-2jjkd" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.721729 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-95vbj" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.751757 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sxnt\" (UniqueName: \"kubernetes.io/projected/1aa06f21-2d35-4d03-86b9-01d9354826da-kube-api-access-9sxnt\") pod \"1aa06f21-2d35-4d03-86b9-01d9354826da\" (UID: \"1aa06f21-2d35-4d03-86b9-01d9354826da\") " Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.751979 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkkr5\" (UniqueName: \"kubernetes.io/projected/bea88065-1eff-42e2-809a-443c15bda0ac-kube-api-access-xkkr5\") pod \"bea88065-1eff-42e2-809a-443c15bda0ac\" (UID: \"bea88065-1eff-42e2-809a-443c15bda0ac\") " Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.752096 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1aa06f21-2d35-4d03-86b9-01d9354826da-operator-scripts\") pod \"1aa06f21-2d35-4d03-86b9-01d9354826da\" (UID: \"1aa06f21-2d35-4d03-86b9-01d9354826da\") " Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.752147 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lc97\" (UniqueName: \"kubernetes.io/projected/71459d1c-2acb-4e15-a30d-09dd0f7f7951-kube-api-access-4lc97\") pod \"71459d1c-2acb-4e15-a30d-09dd0f7f7951\" (UID: \"71459d1c-2acb-4e15-a30d-09dd0f7f7951\") " Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.752197 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b83b860f-ed6c-46b2-862a-fbda9af7dc89-operator-scripts\") pod \"b83b860f-ed6c-46b2-862a-fbda9af7dc89\" (UID: \"b83b860f-ed6c-46b2-862a-fbda9af7dc89\") " Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.752265 4898 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-xg2zd\" (UniqueName: \"kubernetes.io/projected/f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d-kube-api-access-xg2zd\") pod \"f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d\" (UID: \"f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d\") " Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.752389 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d-operator-scripts\") pod \"f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d\" (UID: \"f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d\") " Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.752471 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crvxm\" (UniqueName: \"kubernetes.io/projected/b83b860f-ed6c-46b2-862a-fbda9af7dc89-kube-api-access-crvxm\") pod \"b83b860f-ed6c-46b2-862a-fbda9af7dc89\" (UID: \"b83b860f-ed6c-46b2-862a-fbda9af7dc89\") " Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.752574 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71459d1c-2acb-4e15-a30d-09dd0f7f7951-operator-scripts\") pod \"71459d1c-2acb-4e15-a30d-09dd0f7f7951\" (UID: \"71459d1c-2acb-4e15-a30d-09dd0f7f7951\") " Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.752662 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bea88065-1eff-42e2-809a-443c15bda0ac-operator-scripts\") pod \"bea88065-1eff-42e2-809a-443c15bda0ac\" (UID: \"bea88065-1eff-42e2-809a-443c15bda0ac\") " Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.754760 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1aa06f21-2d35-4d03-86b9-01d9354826da-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"1aa06f21-2d35-4d03-86b9-01d9354826da" (UID: "1aa06f21-2d35-4d03-86b9-01d9354826da"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.755193 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d" (UID: "f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.756550 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b83b860f-ed6c-46b2-862a-fbda9af7dc89-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b83b860f-ed6c-46b2-862a-fbda9af7dc89" (UID: "b83b860f-ed6c-46b2-862a-fbda9af7dc89"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.759489 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bea88065-1eff-42e2-809a-443c15bda0ac-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bea88065-1eff-42e2-809a-443c15bda0ac" (UID: "bea88065-1eff-42e2-809a-443c15bda0ac"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.762450 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bea88065-1eff-42e2-809a-443c15bda0ac-kube-api-access-xkkr5" (OuterVolumeSpecName: "kube-api-access-xkkr5") pod "bea88065-1eff-42e2-809a-443c15bda0ac" (UID: "bea88065-1eff-42e2-809a-443c15bda0ac"). InnerVolumeSpecName "kube-api-access-xkkr5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.762618 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d-kube-api-access-xg2zd" (OuterVolumeSpecName: "kube-api-access-xg2zd") pod "f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d" (UID: "f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d"). InnerVolumeSpecName "kube-api-access-xg2zd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.768627 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkkr5\" (UniqueName: \"kubernetes.io/projected/bea88065-1eff-42e2-809a-443c15bda0ac-kube-api-access-xkkr5\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.768664 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1aa06f21-2d35-4d03-86b9-01d9354826da-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.768685 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b83b860f-ed6c-46b2-862a-fbda9af7dc89-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.768697 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xg2zd\" (UniqueName: \"kubernetes.io/projected/f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d-kube-api-access-xg2zd\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.768710 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.768723 4898 reconciler_common.go:293] 
"Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bea88065-1eff-42e2-809a-443c15bda0ac-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.774103 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-452kj" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.774174 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-452kj" event={"ID":"bea88065-1eff-42e2-809a-443c15bda0ac","Type":"ContainerDied","Data":"c2d35d9d1f6a33fbdc7f1c2681ee05eeaa35e95f2c8bc22309c279e2bc914c17"} Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.774219 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2d35d9d1f6a33fbdc7f1c2681ee05eeaa35e95f2c8bc22309c279e2bc914c17" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.774574 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b83b860f-ed6c-46b2-862a-fbda9af7dc89-kube-api-access-crvxm" (OuterVolumeSpecName: "kube-api-access-crvxm") pod "b83b860f-ed6c-46b2-862a-fbda9af7dc89" (UID: "b83b860f-ed6c-46b2-862a-fbda9af7dc89"). InnerVolumeSpecName "kube-api-access-crvxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.778748 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1082-account-create-update-2jjkd" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.781179 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-9d98-account-create-update-t77sf" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.782174 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71459d1c-2acb-4e15-a30d-09dd0f7f7951-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "71459d1c-2acb-4e15-a30d-09dd0f7f7951" (UID: "71459d1c-2acb-4e15-a30d-09dd0f7f7951"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.783912 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1082-account-create-update-2jjkd" event={"ID":"71459d1c-2acb-4e15-a30d-09dd0f7f7951","Type":"ContainerDied","Data":"fb97c5d92d819436e097d301a10d6e1823950e95217ea57b4fa7ae57304a395a"} Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.783948 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb97c5d92d819436e097d301a10d6e1823950e95217ea57b4fa7ae57304a395a" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.783998 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-9d98-account-create-update-t77sf" event={"ID":"1aa06f21-2d35-4d03-86b9-01d9354826da","Type":"ContainerDied","Data":"32ad0f4bc955a1c213252ea528d35e7b749f8d4ec4d59c71217cfbe39f2386ae"} Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.784009 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32ad0f4bc955a1c213252ea528d35e7b749f8d4ec4d59c71217cfbe39f2386ae" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.785421 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-23f7-account-create-update-z479t" event={"ID":"f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d","Type":"ContainerDied","Data":"49bd09daaa7c514816a284920d5932365e68f9d24f10691ae4161b7348f99fc9"} Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.785460 
4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49bd09daaa7c514816a284920d5932365e68f9d24f10691ae4161b7348f99fc9" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.785532 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-23f7-account-create-update-z479t" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.791939 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-95vbj" event={"ID":"b83b860f-ed6c-46b2-862a-fbda9af7dc89","Type":"ContainerDied","Data":"a7fdcb25a0c057c7c997cd177fd4771ba973fb7faff5931de0e11cf08d037dc6"} Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.791989 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7fdcb25a0c057c7c997cd177fd4771ba973fb7faff5931de0e11cf08d037dc6" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.792082 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-95vbj" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.793045 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aa06f21-2d35-4d03-86b9-01d9354826da-kube-api-access-9sxnt" (OuterVolumeSpecName: "kube-api-access-9sxnt") pod "1aa06f21-2d35-4d03-86b9-01d9354826da" (UID: "1aa06f21-2d35-4d03-86b9-01d9354826da"). InnerVolumeSpecName "kube-api-access-9sxnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.795811 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71459d1c-2acb-4e15-a30d-09dd0f7f7951-kube-api-access-4lc97" (OuterVolumeSpecName: "kube-api-access-4lc97") pod "71459d1c-2acb-4e15-a30d-09dd0f7f7951" (UID: "71459d1c-2acb-4e15-a30d-09dd0f7f7951"). InnerVolumeSpecName "kube-api-access-4lc97". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.870250 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crvxm\" (UniqueName: \"kubernetes.io/projected/b83b860f-ed6c-46b2-862a-fbda9af7dc89-kube-api-access-crvxm\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.870278 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71459d1c-2acb-4e15-a30d-09dd0f7f7951-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.870288 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sxnt\" (UniqueName: \"kubernetes.io/projected/1aa06f21-2d35-4d03-86b9-01d9354826da-kube-api-access-9sxnt\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:28 crc kubenswrapper[4898]: I0313 14:20:28.870296 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lc97\" (UniqueName: \"kubernetes.io/projected/71459d1c-2acb-4e15-a30d-09dd0f7f7951-kube-api-access-4lc97\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:29 crc kubenswrapper[4898]: I0313 14:20:29.006321 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-pxtss"] Mar 13 14:20:29 crc kubenswrapper[4898]: I0313 14:20:29.086121 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" Mar 13 14:20:29 crc kubenswrapper[4898]: I0313 14:20:29.226161 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-2sp5q"] Mar 13 14:20:29 crc kubenswrapper[4898]: I0313 14:20:29.226739 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-2sp5q" podUID="a37db268-4fcb-45a7-a7bf-fae19a514257" containerName="dnsmasq-dns" 
containerID="cri-o://50431c87d4fda7b6d1207e4343981bd67d4f748124f89954c845f5c4fb0d25f7" gracePeriod=10 Mar 13 14:20:29 crc kubenswrapper[4898]: I0313 14:20:29.830879 4898 generic.go:334] "Generic (PLEG): container finished" podID="74eb351d-364c-4564-8f8b-67ac844a6abc" containerID="58affd3294e9aec78373844bf6912651079de0e76c0d060a1cf7a048a7bc787d" exitCode=0 Mar 13 14:20:29 crc kubenswrapper[4898]: I0313 14:20:29.831253 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-pxtss" event={"ID":"74eb351d-364c-4564-8f8b-67ac844a6abc","Type":"ContainerDied","Data":"58affd3294e9aec78373844bf6912651079de0e76c0d060a1cf7a048a7bc787d"} Mar 13 14:20:29 crc kubenswrapper[4898]: I0313 14:20:29.831307 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-pxtss" event={"ID":"74eb351d-364c-4564-8f8b-67ac844a6abc","Type":"ContainerStarted","Data":"8d0d979c82cb36af5b7db53315fdb00e1e53cc947a1bb29c77b9196d7df5e3d4"} Mar 13 14:20:29 crc kubenswrapper[4898]: I0313 14:20:29.834096 4898 generic.go:334] "Generic (PLEG): container finished" podID="a37db268-4fcb-45a7-a7bf-fae19a514257" containerID="50431c87d4fda7b6d1207e4343981bd67d4f748124f89954c845f5c4fb0d25f7" exitCode=0 Mar 13 14:20:29 crc kubenswrapper[4898]: I0313 14:20:29.834186 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-2sp5q" event={"ID":"a37db268-4fcb-45a7-a7bf-fae19a514257","Type":"ContainerDied","Data":"50431c87d4fda7b6d1207e4343981bd67d4f748124f89954c845f5c4fb0d25f7"} Mar 13 14:20:29 crc kubenswrapper[4898]: I0313 14:20:29.849088 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dxpl9" event={"ID":"8fdab36c-41db-4a9c-9cbe-47e1761c6df5","Type":"ContainerStarted","Data":"f8a3e423bebf88995b6d32dd30a81abd86b4e9eab9359f63d75400abe5906505"} Mar 13 14:20:29 crc kubenswrapper[4898]: I0313 14:20:29.936344 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-2sp5q" Mar 13 14:20:29 crc kubenswrapper[4898]: I0313 14:20:29.958034 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-dxpl9" podStartSLOduration=2.621679706 podStartE2EDuration="9.958009074s" podCreationTimestamp="2026-03-13 14:20:20 +0000 UTC" firstStartedPulling="2026-03-13 14:20:21.798091277 +0000 UTC m=+1456.799679516" lastFinishedPulling="2026-03-13 14:20:29.134420645 +0000 UTC m=+1464.136008884" observedRunningTime="2026-03-13 14:20:29.886317173 +0000 UTC m=+1464.887905412" watchObservedRunningTime="2026-03-13 14:20:29.958009074 +0000 UTC m=+1464.959597323" Mar 13 14:20:30 crc kubenswrapper[4898]: I0313 14:20:30.003346 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-dns-svc\") pod \"a37db268-4fcb-45a7-a7bf-fae19a514257\" (UID: \"a37db268-4fcb-45a7-a7bf-fae19a514257\") " Mar 13 14:20:30 crc kubenswrapper[4898]: I0313 14:20:30.003446 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bw842\" (UniqueName: \"kubernetes.io/projected/a37db268-4fcb-45a7-a7bf-fae19a514257-kube-api-access-bw842\") pod \"a37db268-4fcb-45a7-a7bf-fae19a514257\" (UID: \"a37db268-4fcb-45a7-a7bf-fae19a514257\") " Mar 13 14:20:30 crc kubenswrapper[4898]: I0313 14:20:30.003545 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-config\") pod \"a37db268-4fcb-45a7-a7bf-fae19a514257\" (UID: \"a37db268-4fcb-45a7-a7bf-fae19a514257\") " Mar 13 14:20:30 crc kubenswrapper[4898]: I0313 14:20:30.003665 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-ovsdbserver-sb\") pod 
\"a37db268-4fcb-45a7-a7bf-fae19a514257\" (UID: \"a37db268-4fcb-45a7-a7bf-fae19a514257\") " Mar 13 14:20:30 crc kubenswrapper[4898]: I0313 14:20:30.003701 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-ovsdbserver-nb\") pod \"a37db268-4fcb-45a7-a7bf-fae19a514257\" (UID: \"a37db268-4fcb-45a7-a7bf-fae19a514257\") " Mar 13 14:20:30 crc kubenswrapper[4898]: I0313 14:20:30.008509 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a37db268-4fcb-45a7-a7bf-fae19a514257-kube-api-access-bw842" (OuterVolumeSpecName: "kube-api-access-bw842") pod "a37db268-4fcb-45a7-a7bf-fae19a514257" (UID: "a37db268-4fcb-45a7-a7bf-fae19a514257"). InnerVolumeSpecName "kube-api-access-bw842". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:20:30 crc kubenswrapper[4898]: I0313 14:20:30.061629 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-config" (OuterVolumeSpecName: "config") pod "a37db268-4fcb-45a7-a7bf-fae19a514257" (UID: "a37db268-4fcb-45a7-a7bf-fae19a514257"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:30 crc kubenswrapper[4898]: I0313 14:20:30.072174 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a37db268-4fcb-45a7-a7bf-fae19a514257" (UID: "a37db268-4fcb-45a7-a7bf-fae19a514257"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:30 crc kubenswrapper[4898]: I0313 14:20:30.072365 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a37db268-4fcb-45a7-a7bf-fae19a514257" (UID: "a37db268-4fcb-45a7-a7bf-fae19a514257"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:30 crc kubenswrapper[4898]: I0313 14:20:30.081558 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a37db268-4fcb-45a7-a7bf-fae19a514257" (UID: "a37db268-4fcb-45a7-a7bf-fae19a514257"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:30 crc kubenswrapper[4898]: I0313 14:20:30.105554 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:30 crc kubenswrapper[4898]: I0313 14:20:30.105581 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:30 crc kubenswrapper[4898]: I0313 14:20:30.105590 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:30 crc kubenswrapper[4898]: I0313 14:20:30.105599 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bw842\" (UniqueName: \"kubernetes.io/projected/a37db268-4fcb-45a7-a7bf-fae19a514257-kube-api-access-bw842\") on node \"crc\" 
DevicePath \"\"" Mar 13 14:20:30 crc kubenswrapper[4898]: I0313 14:20:30.105610 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a37db268-4fcb-45a7-a7bf-fae19a514257-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:30 crc kubenswrapper[4898]: I0313 14:20:30.872736 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-2sp5q" event={"ID":"a37db268-4fcb-45a7-a7bf-fae19a514257","Type":"ContainerDied","Data":"da5d3ec372689f6af3cb9e875471efd89602096089c49f7ef2acb131bc222cb5"} Mar 13 14:20:30 crc kubenswrapper[4898]: I0313 14:20:30.873010 4898 scope.go:117] "RemoveContainer" containerID="50431c87d4fda7b6d1207e4343981bd67d4f748124f89954c845f5c4fb0d25f7" Mar 13 14:20:30 crc kubenswrapper[4898]: I0313 14:20:30.872768 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-2sp5q" Mar 13 14:20:30 crc kubenswrapper[4898]: I0313 14:20:30.917373 4898 scope.go:117] "RemoveContainer" containerID="de23e3ccb82eedd2170e1cd3b17cd796af8186141a4e3a6a0df25cf87c1ac689" Mar 13 14:20:30 crc kubenswrapper[4898]: I0313 14:20:30.925525 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-2sp5q"] Mar 13 14:20:30 crc kubenswrapper[4898]: I0313 14:20:30.937915 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-2sp5q"] Mar 13 14:20:31 crc kubenswrapper[4898]: I0313 14:20:31.391638 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-pxtss" Mar 13 14:20:31 crc kubenswrapper[4898]: I0313 14:20:31.429833 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74eb351d-364c-4564-8f8b-67ac844a6abc-operator-scripts\") pod \"74eb351d-364c-4564-8f8b-67ac844a6abc\" (UID: \"74eb351d-364c-4564-8f8b-67ac844a6abc\") " Mar 13 14:20:31 crc kubenswrapper[4898]: I0313 14:20:31.430077 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4tzj\" (UniqueName: \"kubernetes.io/projected/74eb351d-364c-4564-8f8b-67ac844a6abc-kube-api-access-f4tzj\") pod \"74eb351d-364c-4564-8f8b-67ac844a6abc\" (UID: \"74eb351d-364c-4564-8f8b-67ac844a6abc\") " Mar 13 14:20:31 crc kubenswrapper[4898]: I0313 14:20:31.430606 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74eb351d-364c-4564-8f8b-67ac844a6abc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "74eb351d-364c-4564-8f8b-67ac844a6abc" (UID: "74eb351d-364c-4564-8f8b-67ac844a6abc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:31 crc kubenswrapper[4898]: I0313 14:20:31.451356 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74eb351d-364c-4564-8f8b-67ac844a6abc-kube-api-access-f4tzj" (OuterVolumeSpecName: "kube-api-access-f4tzj") pod "74eb351d-364c-4564-8f8b-67ac844a6abc" (UID: "74eb351d-364c-4564-8f8b-67ac844a6abc"). InnerVolumeSpecName "kube-api-access-f4tzj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:20:31 crc kubenswrapper[4898]: I0313 14:20:31.531956 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4tzj\" (UniqueName: \"kubernetes.io/projected/74eb351d-364c-4564-8f8b-67ac844a6abc-kube-api-access-f4tzj\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:31 crc kubenswrapper[4898]: I0313 14:20:31.531990 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/74eb351d-364c-4564-8f8b-67ac844a6abc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:31 crc kubenswrapper[4898]: I0313 14:20:31.753994 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a37db268-4fcb-45a7-a7bf-fae19a514257" path="/var/lib/kubelet/pods/a37db268-4fcb-45a7-a7bf-fae19a514257/volumes" Mar 13 14:20:31 crc kubenswrapper[4898]: I0313 14:20:31.883360 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-pxtss" event={"ID":"74eb351d-364c-4564-8f8b-67ac844a6abc","Type":"ContainerDied","Data":"8d0d979c82cb36af5b7db53315fdb00e1e53cc947a1bb29c77b9196d7df5e3d4"} Mar 13 14:20:31 crc kubenswrapper[4898]: I0313 14:20:31.883726 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d0d979c82cb36af5b7db53315fdb00e1e53cc947a1bb29c77b9196d7df5e3d4" Mar 13 14:20:31 crc kubenswrapper[4898]: I0313 14:20:31.883399 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-pxtss" Mar 13 14:20:31 crc kubenswrapper[4898]: I0313 14:20:31.888191 4898 generic.go:334] "Generic (PLEG): container finished" podID="d555bd54-f4d5-4b06-9517-32b4fe687f4b" containerID="3c90dc434ae18a1a3a436a05040e009b821f261e42f9834113cbf08fabe109e3" exitCode=0 Mar 13 14:20:31 crc kubenswrapper[4898]: I0313 14:20:31.888229 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d555bd54-f4d5-4b06-9517-32b4fe687f4b","Type":"ContainerDied","Data":"3c90dc434ae18a1a3a436a05040e009b821f261e42f9834113cbf08fabe109e3"} Mar 13 14:20:32 crc kubenswrapper[4898]: I0313 14:20:32.908029 4898 generic.go:334] "Generic (PLEG): container finished" podID="8fdab36c-41db-4a9c-9cbe-47e1761c6df5" containerID="f8a3e423bebf88995b6d32dd30a81abd86b4e9eab9359f63d75400abe5906505" exitCode=0 Mar 13 14:20:32 crc kubenswrapper[4898]: I0313 14:20:32.908066 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dxpl9" event={"ID":"8fdab36c-41db-4a9c-9cbe-47e1761c6df5","Type":"ContainerDied","Data":"f8a3e423bebf88995b6d32dd30a81abd86b4e9eab9359f63d75400abe5906505"} Mar 13 14:20:32 crc kubenswrapper[4898]: I0313 14:20:32.911067 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d555bd54-f4d5-4b06-9517-32b4fe687f4b","Type":"ContainerStarted","Data":"1ca40490280f990739faac6ee40000e80b9bd47d5063a8c6cda8773de600017d"} Mar 13 14:20:33 crc kubenswrapper[4898]: I0313 14:20:33.159968 4898 scope.go:117] "RemoveContainer" containerID="ad86fe4efa1fa3496cbed8d6aa93dada393bba49f5fe2d8062f7e0508875ea38" Mar 13 14:20:34 crc kubenswrapper[4898]: I0313 14:20:34.341108 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-dxpl9" Mar 13 14:20:34 crc kubenswrapper[4898]: I0313 14:20:34.516361 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bd476\" (UniqueName: \"kubernetes.io/projected/8fdab36c-41db-4a9c-9cbe-47e1761c6df5-kube-api-access-bd476\") pod \"8fdab36c-41db-4a9c-9cbe-47e1761c6df5\" (UID: \"8fdab36c-41db-4a9c-9cbe-47e1761c6df5\") " Mar 13 14:20:34 crc kubenswrapper[4898]: I0313 14:20:34.516483 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fdab36c-41db-4a9c-9cbe-47e1761c6df5-combined-ca-bundle\") pod \"8fdab36c-41db-4a9c-9cbe-47e1761c6df5\" (UID: \"8fdab36c-41db-4a9c-9cbe-47e1761c6df5\") " Mar 13 14:20:34 crc kubenswrapper[4898]: I0313 14:20:34.516658 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fdab36c-41db-4a9c-9cbe-47e1761c6df5-config-data\") pod \"8fdab36c-41db-4a9c-9cbe-47e1761c6df5\" (UID: \"8fdab36c-41db-4a9c-9cbe-47e1761c6df5\") " Mar 13 14:20:34 crc kubenswrapper[4898]: I0313 14:20:34.525004 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fdab36c-41db-4a9c-9cbe-47e1761c6df5-kube-api-access-bd476" (OuterVolumeSpecName: "kube-api-access-bd476") pod "8fdab36c-41db-4a9c-9cbe-47e1761c6df5" (UID: "8fdab36c-41db-4a9c-9cbe-47e1761c6df5"). InnerVolumeSpecName "kube-api-access-bd476". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:20:34 crc kubenswrapper[4898]: I0313 14:20:34.547531 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fdab36c-41db-4a9c-9cbe-47e1761c6df5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8fdab36c-41db-4a9c-9cbe-47e1761c6df5" (UID: "8fdab36c-41db-4a9c-9cbe-47e1761c6df5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:20:34 crc kubenswrapper[4898]: I0313 14:20:34.582375 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fdab36c-41db-4a9c-9cbe-47e1761c6df5-config-data" (OuterVolumeSpecName: "config-data") pod "8fdab36c-41db-4a9c-9cbe-47e1761c6df5" (UID: "8fdab36c-41db-4a9c-9cbe-47e1761c6df5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:20:34 crc kubenswrapper[4898]: I0313 14:20:34.619632 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bd476\" (UniqueName: \"kubernetes.io/projected/8fdab36c-41db-4a9c-9cbe-47e1761c6df5-kube-api-access-bd476\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:34 crc kubenswrapper[4898]: I0313 14:20:34.619671 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fdab36c-41db-4a9c-9cbe-47e1761c6df5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:34 crc kubenswrapper[4898]: I0313 14:20:34.619686 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fdab36c-41db-4a9c-9cbe-47e1761c6df5-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:34 crc kubenswrapper[4898]: I0313 14:20:34.938666 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-dxpl9" event={"ID":"8fdab36c-41db-4a9c-9cbe-47e1761c6df5","Type":"ContainerDied","Data":"4a05de9aa49849a88ec18bd7a217b2e7749103f27e3a95daf9185dac7265effb"} Mar 13 14:20:34 crc kubenswrapper[4898]: I0313 14:20:34.938706 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a05de9aa49849a88ec18bd7a217b2e7749103f27e3a95daf9185dac7265effb" Mar 13 14:20:34 crc kubenswrapper[4898]: I0313 14:20:34.938717 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-dxpl9" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.269990 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-mc8t9"] Mar 13 14:20:35 crc kubenswrapper[4898]: E0313 14:20:35.270808 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b83b860f-ed6c-46b2-862a-fbda9af7dc89" containerName="mariadb-database-create" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.270833 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b83b860f-ed6c-46b2-862a-fbda9af7dc89" containerName="mariadb-database-create" Mar 13 14:20:35 crc kubenswrapper[4898]: E0313 14:20:35.270844 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fdab36c-41db-4a9c-9cbe-47e1761c6df5" containerName="keystone-db-sync" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.270851 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fdab36c-41db-4a9c-9cbe-47e1761c6df5" containerName="keystone-db-sync" Mar 13 14:20:35 crc kubenswrapper[4898]: E0313 14:20:35.270864 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8a8516c-5aee-4eae-a59b-498f97c1b92b" containerName="mariadb-database-create" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.270873 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8a8516c-5aee-4eae-a59b-498f97c1b92b" containerName="mariadb-database-create" Mar 13 14:20:35 crc kubenswrapper[4898]: E0313 14:20:35.270890 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a37db268-4fcb-45a7-a7bf-fae19a514257" containerName="init" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.270914 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a37db268-4fcb-45a7-a7bf-fae19a514257" containerName="init" Mar 13 14:20:35 crc kubenswrapper[4898]: E0313 14:20:35.270933 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bea88065-1eff-42e2-809a-443c15bda0ac" 
containerName="mariadb-database-create" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.270941 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="bea88065-1eff-42e2-809a-443c15bda0ac" containerName="mariadb-database-create" Mar 13 14:20:35 crc kubenswrapper[4898]: E0313 14:20:35.270952 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71459d1c-2acb-4e15-a30d-09dd0f7f7951" containerName="mariadb-account-create-update" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.270960 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="71459d1c-2acb-4e15-a30d-09dd0f7f7951" containerName="mariadb-account-create-update" Mar 13 14:20:35 crc kubenswrapper[4898]: E0313 14:20:35.270978 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d" containerName="mariadb-account-create-update" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.270986 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d" containerName="mariadb-account-create-update" Mar 13 14:20:35 crc kubenswrapper[4898]: E0313 14:20:35.270997 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74eb351d-364c-4564-8f8b-67ac844a6abc" containerName="mariadb-account-create-update" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.271004 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="74eb351d-364c-4564-8f8b-67ac844a6abc" containerName="mariadb-account-create-update" Mar 13 14:20:35 crc kubenswrapper[4898]: E0313 14:20:35.271026 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fe3416e-f08a-43c9-8e12-a89c1e849208" containerName="mariadb-account-create-update" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.271034 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fe3416e-f08a-43c9-8e12-a89c1e849208" containerName="mariadb-account-create-update" Mar 13 14:20:35 crc kubenswrapper[4898]: E0313 
14:20:35.271053 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b04d3edd-a550-465a-9ef2-2cbea4126ceb" containerName="mariadb-database-create" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.271062 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b04d3edd-a550-465a-9ef2-2cbea4126ceb" containerName="mariadb-database-create" Mar 13 14:20:35 crc kubenswrapper[4898]: E0313 14:20:35.271070 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a37db268-4fcb-45a7-a7bf-fae19a514257" containerName="dnsmasq-dns" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.271078 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a37db268-4fcb-45a7-a7bf-fae19a514257" containerName="dnsmasq-dns" Mar 13 14:20:35 crc kubenswrapper[4898]: E0313 14:20:35.271093 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aa06f21-2d35-4d03-86b9-01d9354826da" containerName="mariadb-account-create-update" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.271102 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa06f21-2d35-4d03-86b9-01d9354826da" containerName="mariadb-account-create-update" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.271336 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="a37db268-4fcb-45a7-a7bf-fae19a514257" containerName="dnsmasq-dns" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.271355 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aa06f21-2d35-4d03-86b9-01d9354826da" containerName="mariadb-account-create-update" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.271372 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d" containerName="mariadb-account-create-update" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.271383 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b04d3edd-a550-465a-9ef2-2cbea4126ceb" 
containerName="mariadb-database-create" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.271393 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="bea88065-1eff-42e2-809a-443c15bda0ac" containerName="mariadb-database-create" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.271410 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b83b860f-ed6c-46b2-862a-fbda9af7dc89" containerName="mariadb-database-create" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.271422 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="71459d1c-2acb-4e15-a30d-09dd0f7f7951" containerName="mariadb-account-create-update" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.271438 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fe3416e-f08a-43c9-8e12-a89c1e849208" containerName="mariadb-account-create-update" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.271451 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="74eb351d-364c-4564-8f8b-67ac844a6abc" containerName="mariadb-account-create-update" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.271462 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8a8516c-5aee-4eae-a59b-498f97c1b92b" containerName="mariadb-database-create" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.271473 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fdab36c-41db-4a9c-9cbe-47e1761c6df5" containerName="keystone-db-sync" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.272364 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.279449 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.279541 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.279653 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-tdc5n" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.279711 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.285201 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-mc8t9"] Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.286976 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.310644 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-jcdtp"] Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.322031 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.334395 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-jcdtp"] Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.398513 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-zgt75"] Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.406936 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-zgt75" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.411882 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-d5fsz" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.411920 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.418934 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-zgt75"] Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.442647 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-combined-ca-bundle\") pod \"keystone-bootstrap-mc8t9\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.442708 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh9wk\" (UniqueName: \"kubernetes.io/projected/ed76acfb-bb3b-47de-ae85-23de2e792e7b-kube-api-access-bh9wk\") pod \"keystone-bootstrap-mc8t9\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.442734 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-credential-keys\") pod \"keystone-bootstrap-mc8t9\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.442752 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-config-data\") pod \"keystone-bootstrap-mc8t9\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.442769 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-fernet-keys\") pod \"keystone-bootstrap-mc8t9\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.442813 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-dns-svc\") pod \"dnsmasq-dns-847c4cc679-jcdtp\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.442847 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-jcdtp\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.442873 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-jcdtp\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.442887 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-jcdtp\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.442923 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd4hk\" (UniqueName: \"kubernetes.io/projected/10787742-bffc-4545-95cc-8f0354246d7c-kube-api-access-sd4hk\") pod \"dnsmasq-dns-847c4cc679-jcdtp\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.442949 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-scripts\") pod \"keystone-bootstrap-mc8t9\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.442979 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-config\") pod \"dnsmasq-dns-847c4cc679-jcdtp\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.545238 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bh9wk\" (UniqueName: \"kubernetes.io/projected/ed76acfb-bb3b-47de-ae85-23de2e792e7b-kube-api-access-bh9wk\") pod \"keystone-bootstrap-mc8t9\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.545289 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-credential-keys\") pod \"keystone-bootstrap-mc8t9\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.545314 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-config-data\") pod \"keystone-bootstrap-mc8t9\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.545329 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-fernet-keys\") pod \"keystone-bootstrap-mc8t9\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.545357 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28mft\" (UniqueName: \"kubernetes.io/projected/84a7fd24-4320-4c0e-8ded-0d455252a549-kube-api-access-28mft\") pod \"heat-db-sync-zgt75\" (UID: \"84a7fd24-4320-4c0e-8ded-0d455252a549\") " pod="openstack/heat-db-sync-zgt75" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.545401 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-dns-svc\") pod \"dnsmasq-dns-847c4cc679-jcdtp\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.545437 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-ovsdbserver-nb\") pod 
\"dnsmasq-dns-847c4cc679-jcdtp\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.545462 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-jcdtp\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.545478 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-jcdtp\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.545504 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd4hk\" (UniqueName: \"kubernetes.io/projected/10787742-bffc-4545-95cc-8f0354246d7c-kube-api-access-sd4hk\") pod \"dnsmasq-dns-847c4cc679-jcdtp\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.545531 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-scripts\") pod \"keystone-bootstrap-mc8t9\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.545559 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-config\") pod \"dnsmasq-dns-847c4cc679-jcdtp\" (UID: 
\"10787742-bffc-4545-95cc-8f0354246d7c\") " pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.545630 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84a7fd24-4320-4c0e-8ded-0d455252a549-config-data\") pod \"heat-db-sync-zgt75\" (UID: \"84a7fd24-4320-4c0e-8ded-0d455252a549\") " pod="openstack/heat-db-sync-zgt75" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.545665 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-combined-ca-bundle\") pod \"keystone-bootstrap-mc8t9\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.545686 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84a7fd24-4320-4c0e-8ded-0d455252a549-combined-ca-bundle\") pod \"heat-db-sync-zgt75\" (UID: \"84a7fd24-4320-4c0e-8ded-0d455252a549\") " pod="openstack/heat-db-sync-zgt75" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.546622 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-jcdtp\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.547343 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-dns-svc\") pod \"dnsmasq-dns-847c4cc679-jcdtp\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 
13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.547780 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-jcdtp\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.548045 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-jcdtp\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.548331 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-config\") pod \"dnsmasq-dns-847c4cc679-jcdtp\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.553797 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-credential-keys\") pod \"keystone-bootstrap-mc8t9\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.560465 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-fernet-keys\") pod \"keystone-bootstrap-mc8t9\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.570542 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-combined-ca-bundle\") pod \"keystone-bootstrap-mc8t9\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.580239 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-scripts\") pod \"keystone-bootstrap-mc8t9\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.580753 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-config-data\") pod \"keystone-bootstrap-mc8t9\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.606170 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh9wk\" (UniqueName: \"kubernetes.io/projected/ed76acfb-bb3b-47de-ae85-23de2e792e7b-kube-api-access-bh9wk\") pod \"keystone-bootstrap-mc8t9\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.622030 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.623229 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd4hk\" (UniqueName: \"kubernetes.io/projected/10787742-bffc-4545-95cc-8f0354246d7c-kube-api-access-sd4hk\") pod \"dnsmasq-dns-847c4cc679-jcdtp\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.654396 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.655810 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84a7fd24-4320-4c0e-8ded-0d455252a549-config-data\") pod \"heat-db-sync-zgt75\" (UID: \"84a7fd24-4320-4c0e-8ded-0d455252a549\") " pod="openstack/heat-db-sync-zgt75" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.655857 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84a7fd24-4320-4c0e-8ded-0d455252a549-combined-ca-bundle\") pod \"heat-db-sync-zgt75\" (UID: \"84a7fd24-4320-4c0e-8ded-0d455252a549\") " pod="openstack/heat-db-sync-zgt75" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.655917 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28mft\" (UniqueName: \"kubernetes.io/projected/84a7fd24-4320-4c0e-8ded-0d455252a549-kube-api-access-28mft\") pod \"heat-db-sync-zgt75\" (UID: \"84a7fd24-4320-4c0e-8ded-0d455252a549\") " pod="openstack/heat-db-sync-zgt75" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.664347 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-ztp6c"] Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.666108 4898 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.666733 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84a7fd24-4320-4c0e-8ded-0d455252a549-combined-ca-bundle\") pod \"heat-db-sync-zgt75\" (UID: \"84a7fd24-4320-4c0e-8ded-0d455252a549\") " pod="openstack/heat-db-sync-zgt75" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.692467 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84a7fd24-4320-4c0e-8ded-0d455252a549-config-data\") pod \"heat-db-sync-zgt75\" (UID: \"84a7fd24-4320-4c0e-8ded-0d455252a549\") " pod="openstack/heat-db-sync-zgt75" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.701931 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-dcn2n" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.702349 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.702474 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.722993 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28mft\" (UniqueName: \"kubernetes.io/projected/84a7fd24-4320-4c0e-8ded-0d455252a549-kube-api-access-28mft\") pod \"heat-db-sync-zgt75\" (UID: \"84a7fd24-4320-4c0e-8ded-0d455252a549\") " pod="openstack/heat-db-sync-zgt75" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.730965 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-hm77q"] Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.732313 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-hm77q" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.742270 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.742451 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-db2fv" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.742611 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.804299 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-zgt75" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.835593 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-ztp6c"] Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.835795 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-hm77q"] Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.922887 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/193b05da-acb9-4512-a2ae-6c03450e6f05-etc-machine-id\") pod \"cinder-db-sync-ztp6c\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.923017 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-db-sync-config-data\") pod \"cinder-db-sync-ztp6c\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.923137 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-s2nsj\" (UniqueName: \"kubernetes.io/projected/664deedc-3946-4205-98ad-21759d35d952-kube-api-access-s2nsj\") pod \"neutron-db-sync-hm77q\" (UID: \"664deedc-3946-4205-98ad-21759d35d952\") " pod="openstack/neutron-db-sync-hm77q" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.923161 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82hs7\" (UniqueName: \"kubernetes.io/projected/193b05da-acb9-4512-a2ae-6c03450e6f05-kube-api-access-82hs7\") pod \"cinder-db-sync-ztp6c\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.923217 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/664deedc-3946-4205-98ad-21759d35d952-config\") pod \"neutron-db-sync-hm77q\" (UID: \"664deedc-3946-4205-98ad-21759d35d952\") " pod="openstack/neutron-db-sync-hm77q" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.923334 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-config-data\") pod \"cinder-db-sync-ztp6c\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.923399 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-combined-ca-bundle\") pod \"cinder-db-sync-ztp6c\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.923585 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-scripts\") pod \"cinder-db-sync-ztp6c\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:20:35 crc kubenswrapper[4898]: I0313 14:20:35.923726 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/664deedc-3946-4205-98ad-21759d35d952-combined-ca-bundle\") pod \"neutron-db-sync-hm77q\" (UID: \"664deedc-3946-4205-98ad-21759d35d952\") " pod="openstack/neutron-db-sync-hm77q" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.015221 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-xq6ss"] Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.023833 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-xq6ss" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.030758 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.030950 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-rkpbr" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.034432 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-db-sync-config-data\") pod \"cinder-db-sync-ztp6c\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.034549 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2nsj\" (UniqueName: \"kubernetes.io/projected/664deedc-3946-4205-98ad-21759d35d952-kube-api-access-s2nsj\") pod \"neutron-db-sync-hm77q\" (UID: 
\"664deedc-3946-4205-98ad-21759d35d952\") " pod="openstack/neutron-db-sync-hm77q" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.034571 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82hs7\" (UniqueName: \"kubernetes.io/projected/193b05da-acb9-4512-a2ae-6c03450e6f05-kube-api-access-82hs7\") pod \"cinder-db-sync-ztp6c\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.034623 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/664deedc-3946-4205-98ad-21759d35d952-config\") pod \"neutron-db-sync-hm77q\" (UID: \"664deedc-3946-4205-98ad-21759d35d952\") " pod="openstack/neutron-db-sync-hm77q" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.034736 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-config-data\") pod \"cinder-db-sync-ztp6c\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.034792 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-combined-ca-bundle\") pod \"cinder-db-sync-ztp6c\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.034818 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-scripts\") pod \"cinder-db-sync-ztp6c\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.034938 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/664deedc-3946-4205-98ad-21759d35d952-combined-ca-bundle\") pod \"neutron-db-sync-hm77q\" (UID: \"664deedc-3946-4205-98ad-21759d35d952\") " pod="openstack/neutron-db-sync-hm77q" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.035078 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/193b05da-acb9-4512-a2ae-6c03450e6f05-etc-machine-id\") pod \"cinder-db-sync-ztp6c\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.035226 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/193b05da-acb9-4512-a2ae-6c03450e6f05-etc-machine-id\") pod \"cinder-db-sync-ztp6c\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.071605 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-combined-ca-bundle\") pod \"cinder-db-sync-ztp6c\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.079637 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-db-sync-config-data\") pod \"cinder-db-sync-ztp6c\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.080230 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/664deedc-3946-4205-98ad-21759d35d952-config\") pod \"neutron-db-sync-hm77q\" (UID: \"664deedc-3946-4205-98ad-21759d35d952\") " pod="openstack/neutron-db-sync-hm77q" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.101388 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2nsj\" (UniqueName: \"kubernetes.io/projected/664deedc-3946-4205-98ad-21759d35d952-kube-api-access-s2nsj\") pod \"neutron-db-sync-hm77q\" (UID: \"664deedc-3946-4205-98ad-21759d35d952\") " pod="openstack/neutron-db-sync-hm77q" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.110284 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/664deedc-3946-4205-98ad-21759d35d952-combined-ca-bundle\") pod \"neutron-db-sync-hm77q\" (UID: \"664deedc-3946-4205-98ad-21759d35d952\") " pod="openstack/neutron-db-sync-hm77q" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.113337 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82hs7\" (UniqueName: \"kubernetes.io/projected/193b05da-acb9-4512-a2ae-6c03450e6f05-kube-api-access-82hs7\") pod \"cinder-db-sync-ztp6c\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.114761 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-config-data\") pod \"cinder-db-sync-ztp6c\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.116663 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-scripts\") pod \"cinder-db-sync-ztp6c\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " 
pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.127949 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d555bd54-f4d5-4b06-9517-32b4fe687f4b","Type":"ContainerStarted","Data":"578089192ca9537b8bb073d8f2c027027dc4036f5b750f9a009d81f2e4fe812e"} Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.137491 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-xq6ss"] Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.137585 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ac704482-c7a4-471c-b3c1-d1fdd7e0eb83-db-sync-config-data\") pod \"barbican-db-sync-xq6ss\" (UID: \"ac704482-c7a4-471c-b3c1-d1fdd7e0eb83\") " pod="openstack/barbican-db-sync-xq6ss" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.137647 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac704482-c7a4-471c-b3c1-d1fdd7e0eb83-combined-ca-bundle\") pod \"barbican-db-sync-xq6ss\" (UID: \"ac704482-c7a4-471c-b3c1-d1fdd7e0eb83\") " pod="openstack/barbican-db-sync-xq6ss" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.137712 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwfsv\" (UniqueName: \"kubernetes.io/projected/ac704482-c7a4-471c-b3c1-d1fdd7e0eb83-kube-api-access-cwfsv\") pod \"barbican-db-sync-xq6ss\" (UID: \"ac704482-c7a4-471c-b3c1-d1fdd7e0eb83\") " pod="openstack/barbican-db-sync-xq6ss" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.155771 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-dddqm"] Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.157250 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-dddqm" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.158800 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.160066 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-qr6vd" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.160289 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.170978 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-jcdtp"] Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.183188 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-dddqm"] Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.195836 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-fbs4f"] Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.197856 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.205239 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-fbs4f"] Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.239173 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51a3e0c5-0084-4216-a162-3614eafcc162-config-data\") pod \"placement-db-sync-dddqm\" (UID: \"51a3e0c5-0084-4216-a162-3614eafcc162\") " pod="openstack/placement-db-sync-dddqm" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.239215 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51a3e0c5-0084-4216-a162-3614eafcc162-logs\") pod \"placement-db-sync-dddqm\" (UID: \"51a3e0c5-0084-4216-a162-3614eafcc162\") " pod="openstack/placement-db-sync-dddqm" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.239241 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51a3e0c5-0084-4216-a162-3614eafcc162-combined-ca-bundle\") pod \"placement-db-sync-dddqm\" (UID: \"51a3e0c5-0084-4216-a162-3614eafcc162\") " pod="openstack/placement-db-sync-dddqm" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.239307 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwfsv\" (UniqueName: \"kubernetes.io/projected/ac704482-c7a4-471c-b3c1-d1fdd7e0eb83-kube-api-access-cwfsv\") pod \"barbican-db-sync-xq6ss\" (UID: \"ac704482-c7a4-471c-b3c1-d1fdd7e0eb83\") " pod="openstack/barbican-db-sync-xq6ss" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.239807 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st255\" (UniqueName: 
\"kubernetes.io/projected/51a3e0c5-0084-4216-a162-3614eafcc162-kube-api-access-st255\") pod \"placement-db-sync-dddqm\" (UID: \"51a3e0c5-0084-4216-a162-3614eafcc162\") " pod="openstack/placement-db-sync-dddqm" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.239889 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51a3e0c5-0084-4216-a162-3614eafcc162-scripts\") pod \"placement-db-sync-dddqm\" (UID: \"51a3e0c5-0084-4216-a162-3614eafcc162\") " pod="openstack/placement-db-sync-dddqm" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.240068 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ac704482-c7a4-471c-b3c1-d1fdd7e0eb83-db-sync-config-data\") pod \"barbican-db-sync-xq6ss\" (UID: \"ac704482-c7a4-471c-b3c1-d1fdd7e0eb83\") " pod="openstack/barbican-db-sync-xq6ss" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.240151 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac704482-c7a4-471c-b3c1-d1fdd7e0eb83-combined-ca-bundle\") pod \"barbican-db-sync-xq6ss\" (UID: \"ac704482-c7a4-471c-b3c1-d1fdd7e0eb83\") " pod="openstack/barbican-db-sync-xq6ss" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.257332 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ac704482-c7a4-471c-b3c1-d1fdd7e0eb83-db-sync-config-data\") pod \"barbican-db-sync-xq6ss\" (UID: \"ac704482-c7a4-471c-b3c1-d1fdd7e0eb83\") " pod="openstack/barbican-db-sync-xq6ss" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.258780 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac704482-c7a4-471c-b3c1-d1fdd7e0eb83-combined-ca-bundle\") pod 
\"barbican-db-sync-xq6ss\" (UID: \"ac704482-c7a4-471c-b3c1-d1fdd7e0eb83\") " pod="openstack/barbican-db-sync-xq6ss" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.269085 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwfsv\" (UniqueName: \"kubernetes.io/projected/ac704482-c7a4-471c-b3c1-d1fdd7e0eb83-kube-api-access-cwfsv\") pod \"barbican-db-sync-xq6ss\" (UID: \"ac704482-c7a4-471c-b3c1-d1fdd7e0eb83\") " pod="openstack/barbican-db-sync-xq6ss" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.272881 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.287945 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-hm77q" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.342922 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51a3e0c5-0084-4216-a162-3614eafcc162-logs\") pod \"placement-db-sync-dddqm\" (UID: \"51a3e0c5-0084-4216-a162-3614eafcc162\") " pod="openstack/placement-db-sync-dddqm" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.342967 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51a3e0c5-0084-4216-a162-3614eafcc162-config-data\") pod \"placement-db-sync-dddqm\" (UID: \"51a3e0c5-0084-4216-a162-3614eafcc162\") " pod="openstack/placement-db-sync-dddqm" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.342993 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51a3e0c5-0084-4216-a162-3614eafcc162-combined-ca-bundle\") pod \"placement-db-sync-dddqm\" (UID: \"51a3e0c5-0084-4216-a162-3614eafcc162\") " pod="openstack/placement-db-sync-dddqm" Mar 13 14:20:36 crc 
kubenswrapper[4898]: I0313 14:20:36.343018 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-fbs4f\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.343076 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-fbs4f\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.343123 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-st255\" (UniqueName: \"kubernetes.io/projected/51a3e0c5-0084-4216-a162-3614eafcc162-kube-api-access-st255\") pod \"placement-db-sync-dddqm\" (UID: \"51a3e0c5-0084-4216-a162-3614eafcc162\") " pod="openstack/placement-db-sync-dddqm" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.343146 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-fbs4f\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.343169 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqg7r\" (UniqueName: \"kubernetes.io/projected/04642207-fab0-47bf-9ac4-030bbe91b4f0-kube-api-access-cqg7r\") pod \"dnsmasq-dns-785d8bcb8c-fbs4f\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " 
pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.343194 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51a3e0c5-0084-4216-a162-3614eafcc162-scripts\") pod \"placement-db-sync-dddqm\" (UID: \"51a3e0c5-0084-4216-a162-3614eafcc162\") " pod="openstack/placement-db-sync-dddqm" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.343219 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-fbs4f\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.343262 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-config\") pod \"dnsmasq-dns-785d8bcb8c-fbs4f\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.343343 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51a3e0c5-0084-4216-a162-3614eafcc162-logs\") pod \"placement-db-sync-dddqm\" (UID: \"51a3e0c5-0084-4216-a162-3614eafcc162\") " pod="openstack/placement-db-sync-dddqm" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.346327 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51a3e0c5-0084-4216-a162-3614eafcc162-scripts\") pod \"placement-db-sync-dddqm\" (UID: \"51a3e0c5-0084-4216-a162-3614eafcc162\") " pod="openstack/placement-db-sync-dddqm" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.347923 4898 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51a3e0c5-0084-4216-a162-3614eafcc162-combined-ca-bundle\") pod \"placement-db-sync-dddqm\" (UID: \"51a3e0c5-0084-4216-a162-3614eafcc162\") " pod="openstack/placement-db-sync-dddqm" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.368709 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-xq6ss" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.369470 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-st255\" (UniqueName: \"kubernetes.io/projected/51a3e0c5-0084-4216-a162-3614eafcc162-kube-api-access-st255\") pod \"placement-db-sync-dddqm\" (UID: \"51a3e0c5-0084-4216-a162-3614eafcc162\") " pod="openstack/placement-db-sync-dddqm" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.369659 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51a3e0c5-0084-4216-a162-3614eafcc162-config-data\") pod \"placement-db-sync-dddqm\" (UID: \"51a3e0c5-0084-4216-a162-3614eafcc162\") " pod="openstack/placement-db-sync-dddqm" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.418957 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.420730 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.429962 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-t4f8x" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.430345 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.430479 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.430590 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.445150 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-fbs4f\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.445222 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-config\") pod \"dnsmasq-dns-785d8bcb8c-fbs4f\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.445312 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-fbs4f\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.445373 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-fbs4f\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.446556 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-fbs4f\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.446608 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqg7r\" (UniqueName: \"kubernetes.io/projected/04642207-fab0-47bf-9ac4-030bbe91b4f0-kube-api-access-cqg7r\") pod \"dnsmasq-dns-785d8bcb8c-fbs4f\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.447861 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-fbs4f\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.448382 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-config\") pod \"dnsmasq-dns-785d8bcb8c-fbs4f\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.448728 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 
13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.448965 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-fbs4f\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.449940 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-fbs4f\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.460742 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-fbs4f\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.503870 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-dddqm" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.512874 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.520008 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqg7r\" (UniqueName: \"kubernetes.io/projected/04642207-fab0-47bf-9ac4-030bbe91b4f0-kube-api-access-cqg7r\") pod \"dnsmasq-dns-785d8bcb8c-fbs4f\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.521983 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.525126 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.525249 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.527526 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.527794 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.550248 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.550304 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-scripts\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.550364 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpxqx\" (UniqueName: \"kubernetes.io/projected/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-kube-api-access-kpxqx\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " 
pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.550397 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-config-data\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.550429 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.550493 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-logs\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.550539 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.550566 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: 
\"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.629890 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-mc8t9"] Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.654505 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-logs\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.654731 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.654837 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.655442 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.655442 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.655543 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.655651 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.655661 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-logs\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.655914 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f9ea679-73ec-46c5-b3b9-25d63398eb35-logs\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.656015 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.656051 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.656113 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-scripts\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.656286 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.656376 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpxqx\" (UniqueName: \"kubernetes.io/projected/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-kube-api-access-kpxqx\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.656794 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7f9ea679-73ec-46c5-b3b9-25d63398eb35-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.656826 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg4fw\" (UniqueName: \"kubernetes.io/projected/7f9ea679-73ec-46c5-b3b9-25d63398eb35-kube-api-access-mg4fw\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.656869 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-config-data\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.656957 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.661447 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-scripts\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.661456 4898 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.662773 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.664595 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.664628 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1810628263decbcb8d9790a46f0a2a80fe37ecdd6e2a4c05137bd112c0de5f67/globalmount\"" pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.674054 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-config-data\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.687067 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpxqx\" (UniqueName: 
\"kubernetes.io/projected/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-kube-api-access-kpxqx\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.768680 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.768753 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.768788 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.768923 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f9ea679-73ec-46c5-b3b9-25d63398eb35-logs\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.768985 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.769074 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.769124 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7f9ea679-73ec-46c5-b3b9-25d63398eb35-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.769139 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg4fw\" (UniqueName: \"kubernetes.io/projected/7f9ea679-73ec-46c5-b3b9-25d63398eb35-kube-api-access-mg4fw\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.769853 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f9ea679-73ec-46c5-b3b9-25d63398eb35-logs\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.770824 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/7f9ea679-73ec-46c5-b3b9-25d63398eb35-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.779623 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.780741 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.782524 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.782563 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e32bfaa798492a506ddfd6dd81603c6b252f4a286c98ba8256226389647f45c3/globalmount\"" pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.800987 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg4fw\" (UniqueName: \"kubernetes.io/projected/7f9ea679-73ec-46c5-b3b9-25d63398eb35-kube-api-access-mg4fw\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.801257 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.801637 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\") pod \"glance-default-external-api-0\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.803727 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.807065 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.807249 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.832030 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.835471 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.837825 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.872019 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " pod="openstack/ceilometer-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.872111 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/247749ae-204b-4e9c-ad1c-f5d924b6f211-run-httpd\") pod \"ceilometer-0\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " 
pod="openstack/ceilometer-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.872206 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-scripts\") pod \"ceilometer-0\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " pod="openstack/ceilometer-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.872242 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffnxt\" (UniqueName: \"kubernetes.io/projected/247749ae-204b-4e9c-ad1c-f5d924b6f211-kube-api-access-ffnxt\") pod \"ceilometer-0\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " pod="openstack/ceilometer-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.872259 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " pod="openstack/ceilometer-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.872274 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/247749ae-204b-4e9c-ad1c-f5d924b6f211-log-httpd\") pod \"ceilometer-0\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " pod="openstack/ceilometer-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.872349 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-config-data\") pod \"ceilometer-0\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " pod="openstack/ceilometer-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.886627 4898 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\") pod \"glance-default-internal-api-0\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.904529 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.973832 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-config-data\") pod \"ceilometer-0\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " pod="openstack/ceilometer-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.974285 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " pod="openstack/ceilometer-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.974332 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/247749ae-204b-4e9c-ad1c-f5d924b6f211-run-httpd\") pod \"ceilometer-0\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " pod="openstack/ceilometer-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.974390 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-scripts\") pod \"ceilometer-0\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " pod="openstack/ceilometer-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.974408 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffnxt\" (UniqueName: \"kubernetes.io/projected/247749ae-204b-4e9c-ad1c-f5d924b6f211-kube-api-access-ffnxt\") pod \"ceilometer-0\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " pod="openstack/ceilometer-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.974460 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/247749ae-204b-4e9c-ad1c-f5d924b6f211-log-httpd\") pod \"ceilometer-0\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " pod="openstack/ceilometer-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.974501 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " pod="openstack/ceilometer-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.975768 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/247749ae-204b-4e9c-ad1c-f5d924b6f211-log-httpd\") pod \"ceilometer-0\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " pod="openstack/ceilometer-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.975997 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/247749ae-204b-4e9c-ad1c-f5d924b6f211-run-httpd\") pod \"ceilometer-0\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " pod="openstack/ceilometer-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.977734 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-config-data\") pod \"ceilometer-0\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " 
pod="openstack/ceilometer-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.980813 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " pod="openstack/ceilometer-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.986994 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " pod="openstack/ceilometer-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.992005 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffnxt\" (UniqueName: \"kubernetes.io/projected/247749ae-204b-4e9c-ad1c-f5d924b6f211-kube-api-access-ffnxt\") pod \"ceilometer-0\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " pod="openstack/ceilometer-0" Mar 13 14:20:36 crc kubenswrapper[4898]: I0313 14:20:36.995393 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-scripts\") pod \"ceilometer-0\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " pod="openstack/ceilometer-0" Mar 13 14:20:37 crc kubenswrapper[4898]: I0313 14:20:37.059056 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-jcdtp"] Mar 13 14:20:37 crc kubenswrapper[4898]: I0313 14:20:37.069525 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 14:20:37 crc kubenswrapper[4898]: I0313 14:20:37.098265 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-zgt75"] Mar 13 14:20:37 crc kubenswrapper[4898]: I0313 14:20:37.162253 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 14:20:37 crc kubenswrapper[4898]: I0313 14:20:37.207480 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" event={"ID":"10787742-bffc-4545-95cc-8f0354246d7c","Type":"ContainerStarted","Data":"f5c205239fce65effb688de2367793807b3add254d60c501123275b62220d66b"} Mar 13 14:20:37 crc kubenswrapper[4898]: I0313 14:20:37.229934 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-zgt75" event={"ID":"84a7fd24-4320-4c0e-8ded-0d455252a549","Type":"ContainerStarted","Data":"ff8a73d5234eb1ed4542baaf925a5bae9eff511012c73d58ac5c330a7c07d613"} Mar 13 14:20:37 crc kubenswrapper[4898]: I0313 14:20:37.237144 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mc8t9" event={"ID":"ed76acfb-bb3b-47de-ae85-23de2e792e7b","Type":"ContainerStarted","Data":"862d777be88dbcc89854ce3dd00093f5671b014a3c866a71f6f944e62cffdd2c"} Mar 13 14:20:37 crc kubenswrapper[4898]: I0313 14:20:37.240005 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:20:37 crc kubenswrapper[4898]: I0313 14:20:37.241580 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d555bd54-f4d5-4b06-9517-32b4fe687f4b","Type":"ContainerStarted","Data":"110d56f13170c1319b21d99b74fd32c1fea939cfc51a2e23b13c200d65b1e7fc"} Mar 13 14:20:37 crc kubenswrapper[4898]: I0313 14:20:37.263406 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-ztp6c"] Mar 13 14:20:37 crc kubenswrapper[4898]: I0313 14:20:37.289929 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=18.289908257 podStartE2EDuration="18.289908257s" podCreationTimestamp="2026-03-13 14:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:20:37.271003536 +0000 UTC m=+1472.272591805" watchObservedRunningTime="2026-03-13 14:20:37.289908257 +0000 UTC m=+1472.291496496" Mar 13 14:20:37 crc kubenswrapper[4898]: W0313 14:20:37.680042 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51a3e0c5_0084_4216_a162_3614eafcc162.slice/crio-4656230f194edb795af2740a5c0bb83bbde4d4a8b2fd3cee4accb6bdc572ea5d WatchSource:0}: Error finding container 4656230f194edb795af2740a5c0bb83bbde4d4a8b2fd3cee4accb6bdc572ea5d: Status 404 returned error can't find the container with id 4656230f194edb795af2740a5c0bb83bbde4d4a8b2fd3cee4accb6bdc572ea5d Mar 13 14:20:37 crc kubenswrapper[4898]: I0313 14:20:37.686720 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-dddqm"] Mar 13 14:20:37 crc kubenswrapper[4898]: I0313 14:20:37.734208 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-xq6ss"] Mar 13 14:20:37 crc kubenswrapper[4898]: I0313 
14:20:37.760875 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-hm77q"] Mar 13 14:20:37 crc kubenswrapper[4898]: I0313 14:20:37.896955 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-fbs4f"] Mar 13 14:20:37 crc kubenswrapper[4898]: I0313 14:20:37.897024 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 14:20:37 crc kubenswrapper[4898]: I0313 14:20:37.905518 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 14:20:38 crc kubenswrapper[4898]: I0313 14:20:38.166427 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 14:20:38 crc kubenswrapper[4898]: W0313 14:20:38.220281 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda92e3d93_9b4b_4e81_84dc_5cf7aa23f96c.slice/crio-52d408fc697e9533734dc32479d4f8978a1ff97d35d66aafb1e183a3827a836e WatchSource:0}: Error finding container 52d408fc697e9533734dc32479d4f8978a1ff97d35d66aafb1e183a3827a836e: Status 404 returned error can't find the container with id 52d408fc697e9533734dc32479d4f8978a1ff97d35d66aafb1e183a3827a836e Mar 13 14:20:38 crc kubenswrapper[4898]: I0313 14:20:38.308784 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 14:20:38 crc kubenswrapper[4898]: I0313 14:20:38.309744 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-ztp6c" event={"ID":"193b05da-acb9-4512-a2ae-6c03450e6f05","Type":"ContainerStarted","Data":"bfe3b6cf0e5928312929ea860aeb7b7f643553f3479a3beac4f364f3ff4502ae"} Mar 13 14:20:38 crc kubenswrapper[4898]: I0313 14:20:38.311451 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-dddqm" 
event={"ID":"51a3e0c5-0084-4216-a162-3614eafcc162","Type":"ContainerStarted","Data":"4656230f194edb795af2740a5c0bb83bbde4d4a8b2fd3cee4accb6bdc572ea5d"} Mar 13 14:20:38 crc kubenswrapper[4898]: I0313 14:20:38.345213 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" event={"ID":"04642207-fab0-47bf-9ac4-030bbe91b4f0","Type":"ContainerStarted","Data":"a566cd3c2104d4cf2d4de94c8ef5556830e27cbf9385397fa0c4b4a48c1b947c"} Mar 13 14:20:38 crc kubenswrapper[4898]: I0313 14:20:38.345256 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" event={"ID":"04642207-fab0-47bf-9ac4-030bbe91b4f0","Type":"ContainerStarted","Data":"1f8aa6f56c769252ca6f9fa29c34832b03b0bd31e4320e8506b18e27d01c86a7"} Mar 13 14:20:38 crc kubenswrapper[4898]: I0313 14:20:38.348658 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:20:38 crc kubenswrapper[4898]: I0313 14:20:38.398156 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hm77q" event={"ID":"664deedc-3946-4205-98ad-21759d35d952","Type":"ContainerStarted","Data":"6d9f47030df3927c7cd3d661b2414ebcd6b45bbd0a337a5460ac32062ae1c8cc"} Mar 13 14:20:38 crc kubenswrapper[4898]: I0313 14:20:38.472774 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:20:38 crc kubenswrapper[4898]: I0313 14:20:38.481162 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c","Type":"ContainerStarted","Data":"52d408fc697e9533734dc32479d4f8978a1ff97d35d66aafb1e183a3827a836e"} Mar 13 14:20:38 crc kubenswrapper[4898]: I0313 14:20:38.510881 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-hm77q" podStartSLOduration=3.5108612519999998 podStartE2EDuration="3.510861252s" podCreationTimestamp="2026-03-13 14:20:35 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:20:38.458297308 +0000 UTC m=+1473.459885547" watchObservedRunningTime="2026-03-13 14:20:38.510861252 +0000 UTC m=+1473.512449491" Mar 13 14:20:38 crc kubenswrapper[4898]: I0313 14:20:38.599091 4898 generic.go:334] "Generic (PLEG): container finished" podID="10787742-bffc-4545-95cc-8f0354246d7c" containerID="f3696d82d52d862da7bc6bea3b377d64088527f42e2ee331d8b5ccc92226da57" exitCode=0 Mar 13 14:20:38 crc kubenswrapper[4898]: I0313 14:20:38.599267 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" event={"ID":"10787742-bffc-4545-95cc-8f0354246d7c","Type":"ContainerDied","Data":"f3696d82d52d862da7bc6bea3b377d64088527f42e2ee331d8b5ccc92226da57"} Mar 13 14:20:38 crc kubenswrapper[4898]: I0313 14:20:38.613868 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xq6ss" event={"ID":"ac704482-c7a4-471c-b3c1-d1fdd7e0eb83","Type":"ContainerStarted","Data":"bafefd0ee86b1967f73e5ee3d1256b9f0f1d84430cc94cf6628cfc6e827e8aad"} Mar 13 14:20:38 crc kubenswrapper[4898]: I0313 14:20:38.641777 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mc8t9" event={"ID":"ed76acfb-bb3b-47de-ae85-23de2e792e7b","Type":"ContainerStarted","Data":"e1e9eae3fcad1899b5e4ca6e0e525d3dd9661ab59290e1ba7355e6176cbc69f0"} Mar 13 14:20:38 crc kubenswrapper[4898]: I0313 14:20:38.666443 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-mc8t9" podStartSLOduration=3.666420001 podStartE2EDuration="3.666420001s" podCreationTimestamp="2026-03-13 14:20:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:20:38.661359249 +0000 UTC m=+1473.662947488" watchObservedRunningTime="2026-03-13 14:20:38.666420001 
+0000 UTC m=+1473.668008240" Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.359987 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.493622 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sd4hk\" (UniqueName: \"kubernetes.io/projected/10787742-bffc-4545-95cc-8f0354246d7c-kube-api-access-sd4hk\") pod \"10787742-bffc-4545-95cc-8f0354246d7c\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.494203 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-ovsdbserver-nb\") pod \"10787742-bffc-4545-95cc-8f0354246d7c\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.494244 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-dns-svc\") pod \"10787742-bffc-4545-95cc-8f0354246d7c\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.494284 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-dns-swift-storage-0\") pod \"10787742-bffc-4545-95cc-8f0354246d7c\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.494423 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-config\") pod \"10787742-bffc-4545-95cc-8f0354246d7c\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " Mar 13 14:20:39 
crc kubenswrapper[4898]: I0313 14:20:39.494568 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-ovsdbserver-sb\") pod \"10787742-bffc-4545-95cc-8f0354246d7c\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.500447 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10787742-bffc-4545-95cc-8f0354246d7c-kube-api-access-sd4hk" (OuterVolumeSpecName: "kube-api-access-sd4hk") pod "10787742-bffc-4545-95cc-8f0354246d7c" (UID: "10787742-bffc-4545-95cc-8f0354246d7c"). InnerVolumeSpecName "kube-api-access-sd4hk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.538485 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "10787742-bffc-4545-95cc-8f0354246d7c" (UID: "10787742-bffc-4545-95cc-8f0354246d7c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.550616 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "10787742-bffc-4545-95cc-8f0354246d7c" (UID: "10787742-bffc-4545-95cc-8f0354246d7c"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.583678 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "10787742-bffc-4545-95cc-8f0354246d7c" (UID: "10787742-bffc-4545-95cc-8f0354246d7c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.598847 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-config" (OuterVolumeSpecName: "config") pod "10787742-bffc-4545-95cc-8f0354246d7c" (UID: "10787742-bffc-4545-95cc-8f0354246d7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.598942 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-config\") pod \"10787742-bffc-4545-95cc-8f0354246d7c\" (UID: \"10787742-bffc-4545-95cc-8f0354246d7c\") " Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.599876 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sd4hk\" (UniqueName: \"kubernetes.io/projected/10787742-bffc-4545-95cc-8f0354246d7c-kube-api-access-sd4hk\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.599889 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.599910 4898 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.599919 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:39 crc kubenswrapper[4898]: W0313 14:20:39.600123 4898 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/10787742-bffc-4545-95cc-8f0354246d7c/volumes/kubernetes.io~configmap/config Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.600135 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-config" (OuterVolumeSpecName: "config") pod "10787742-bffc-4545-95cc-8f0354246d7c" (UID: "10787742-bffc-4545-95cc-8f0354246d7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.605224 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "10787742-bffc-4545-95cc-8f0354246d7c" (UID: "10787742-bffc-4545-95cc-8f0354246d7c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.686195 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c","Type":"ContainerStarted","Data":"0df5bf75681ada8dd8a40c022de81b8eec9f36333dccaa2326ce28596ea32dd8"} Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.688223 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"247749ae-204b-4e9c-ad1c-f5d924b6f211","Type":"ContainerStarted","Data":"5fd3c6beb630c0a00d78ffbb0eaf96e2717f61918bd632f3618c99bd721d1714"} Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.690699 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" event={"ID":"10787742-bffc-4545-95cc-8f0354246d7c","Type":"ContainerDied","Data":"f5c205239fce65effb688de2367793807b3add254d60c501123275b62220d66b"} Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.690733 4898 scope.go:117] "RemoveContainer" containerID="f3696d82d52d862da7bc6bea3b377d64088527f42e2ee331d8b5ccc92226da57" Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.690847 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.702413 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.702448 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10787742-bffc-4545-95cc-8f0354246d7c-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.711425 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7f9ea679-73ec-46c5-b3b9-25d63398eb35","Type":"ContainerStarted","Data":"55281eef0a29c186bce67ca4ff84ee09f28c634f48016b0a99b76a64d0ade9af"} Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.729636 4898 generic.go:334] "Generic (PLEG): container finished" podID="04642207-fab0-47bf-9ac4-030bbe91b4f0" containerID="a566cd3c2104d4cf2d4de94c8ef5556830e27cbf9385397fa0c4b4a48c1b947c" exitCode=0 Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.729706 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" event={"ID":"04642207-fab0-47bf-9ac4-030bbe91b4f0","Type":"ContainerDied","Data":"a566cd3c2104d4cf2d4de94c8ef5556830e27cbf9385397fa0c4b4a48c1b947c"} Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.729763 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" event={"ID":"04642207-fab0-47bf-9ac4-030bbe91b4f0","Type":"ContainerStarted","Data":"b87af33e145b04333fd2674b6cd7ae8ffea23dd169547cec058f2a13473c72c5"} Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.730917 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 
14:20:39.768533 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" podStartSLOduration=4.7685112610000004 podStartE2EDuration="4.768511261s" podCreationTimestamp="2026-03-13 14:20:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:20:39.751150531 +0000 UTC m=+1474.752738770" watchObservedRunningTime="2026-03-13 14:20:39.768511261 +0000 UTC m=+1474.770099500" Mar 13 14:20:39 crc kubenswrapper[4898]: I0313 14:20:39.784162 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hm77q" event={"ID":"664deedc-3946-4205-98ad-21759d35d952","Type":"ContainerStarted","Data":"d1d45dfadc513f7c3e24c556ad6a45d1d7e050a0da971045c5c1181fc58bc3d2"} Mar 13 14:20:40 crc kubenswrapper[4898]: I0313 14:20:40.761171 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7f9ea679-73ec-46c5-b3b9-25d63398eb35","Type":"ContainerStarted","Data":"5d3511c79cb347abef153850ae05dbbda4e475dc7ccff6ee262eef01f5d5c250"} Mar 13 14:20:41 crc kubenswrapper[4898]: I0313 14:20:41.002450 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:41 crc kubenswrapper[4898]: I0313 14:20:41.788328 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7f9ea679-73ec-46c5-b3b9-25d63398eb35","Type":"ContainerStarted","Data":"9e586b7dba7e52cd9e5256f1170f84b5de3fe664830b8456825bc0fe70ad9fed"} Mar 13 14:20:41 crc kubenswrapper[4898]: I0313 14:20:41.788555 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7f9ea679-73ec-46c5-b3b9-25d63398eb35" containerName="glance-httpd" containerID="cri-o://9e586b7dba7e52cd9e5256f1170f84b5de3fe664830b8456825bc0fe70ad9fed" 
gracePeriod=30 Mar 13 14:20:41 crc kubenswrapper[4898]: I0313 14:20:41.788485 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7f9ea679-73ec-46c5-b3b9-25d63398eb35" containerName="glance-log" containerID="cri-o://5d3511c79cb347abef153850ae05dbbda4e475dc7ccff6ee262eef01f5d5c250" gracePeriod=30 Mar 13 14:20:41 crc kubenswrapper[4898]: I0313 14:20:41.792673 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c","Type":"ContainerStarted","Data":"dfc29ad43e0c709dcb1ef1785b269ec83a25e185502f0001b9412c1b48560e60"} Mar 13 14:20:41 crc kubenswrapper[4898]: I0313 14:20:41.792782 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c" containerName="glance-log" containerID="cri-o://0df5bf75681ada8dd8a40c022de81b8eec9f36333dccaa2326ce28596ea32dd8" gracePeriod=30 Mar 13 14:20:41 crc kubenswrapper[4898]: I0313 14:20:41.792800 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c" containerName="glance-httpd" containerID="cri-o://dfc29ad43e0c709dcb1ef1785b269ec83a25e185502f0001b9412c1b48560e60" gracePeriod=30 Mar 13 14:20:41 crc kubenswrapper[4898]: I0313 14:20:41.814151 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.814127044 podStartE2EDuration="6.814127044s" podCreationTimestamp="2026-03-13 14:20:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:20:41.81321551 +0000 UTC m=+1476.814803749" watchObservedRunningTime="2026-03-13 14:20:41.814127044 +0000 UTC m=+1476.815715283" Mar 13 14:20:41 crc 
kubenswrapper[4898]: I0313 14:20:41.858436 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.858417114 podStartE2EDuration="6.858417114s" podCreationTimestamp="2026-03-13 14:20:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:20:41.847737497 +0000 UTC m=+1476.849325756" watchObservedRunningTime="2026-03-13 14:20:41.858417114 +0000 UTC m=+1476.860005353" Mar 13 14:20:42 crc kubenswrapper[4898]: I0313 14:20:42.805800 4898 generic.go:334] "Generic (PLEG): container finished" podID="7f9ea679-73ec-46c5-b3b9-25d63398eb35" containerID="9e586b7dba7e52cd9e5256f1170f84b5de3fe664830b8456825bc0fe70ad9fed" exitCode=0 Mar 13 14:20:42 crc kubenswrapper[4898]: I0313 14:20:42.807404 4898 generic.go:334] "Generic (PLEG): container finished" podID="7f9ea679-73ec-46c5-b3b9-25d63398eb35" containerID="5d3511c79cb347abef153850ae05dbbda4e475dc7ccff6ee262eef01f5d5c250" exitCode=143 Mar 13 14:20:42 crc kubenswrapper[4898]: I0313 14:20:42.805933 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7f9ea679-73ec-46c5-b3b9-25d63398eb35","Type":"ContainerDied","Data":"9e586b7dba7e52cd9e5256f1170f84b5de3fe664830b8456825bc0fe70ad9fed"} Mar 13 14:20:42 crc kubenswrapper[4898]: I0313 14:20:42.807637 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7f9ea679-73ec-46c5-b3b9-25d63398eb35","Type":"ContainerDied","Data":"5d3511c79cb347abef153850ae05dbbda4e475dc7ccff6ee262eef01f5d5c250"} Mar 13 14:20:42 crc kubenswrapper[4898]: I0313 14:20:42.811112 4898 generic.go:334] "Generic (PLEG): container finished" podID="a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c" containerID="dfc29ad43e0c709dcb1ef1785b269ec83a25e185502f0001b9412c1b48560e60" exitCode=0 Mar 13 14:20:42 crc kubenswrapper[4898]: I0313 
14:20:42.811139 4898 generic.go:334] "Generic (PLEG): container finished" podID="a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c" containerID="0df5bf75681ada8dd8a40c022de81b8eec9f36333dccaa2326ce28596ea32dd8" exitCode=143 Mar 13 14:20:42 crc kubenswrapper[4898]: I0313 14:20:42.811167 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c","Type":"ContainerDied","Data":"dfc29ad43e0c709dcb1ef1785b269ec83a25e185502f0001b9412c1b48560e60"} Mar 13 14:20:42 crc kubenswrapper[4898]: I0313 14:20:42.811205 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c","Type":"ContainerDied","Data":"0df5bf75681ada8dd8a40c022de81b8eec9f36333dccaa2326ce28596ea32dd8"} Mar 13 14:20:42 crc kubenswrapper[4898]: I0313 14:20:42.813122 4898 generic.go:334] "Generic (PLEG): container finished" podID="ed76acfb-bb3b-47de-ae85-23de2e792e7b" containerID="e1e9eae3fcad1899b5e4ca6e0e525d3dd9661ab59290e1ba7355e6176cbc69f0" exitCode=0 Mar 13 14:20:42 crc kubenswrapper[4898]: I0313 14:20:42.813150 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mc8t9" event={"ID":"ed76acfb-bb3b-47de-ae85-23de2e792e7b","Type":"ContainerDied","Data":"e1e9eae3fcad1899b5e4ca6e0e525d3dd9661ab59290e1ba7355e6176cbc69f0"} Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.755195 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.781551 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.783099 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.855633 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bh9wk\" (UniqueName: \"kubernetes.io/projected/ed76acfb-bb3b-47de-ae85-23de2e792e7b-kube-api-access-bh9wk\") pod \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.855817 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-combined-ca-bundle\") pod \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.855881 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-scripts\") pod \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.855972 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-fernet-keys\") pod \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.856030 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-credential-keys\") pod \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.856098 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-config-data\") pod \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\" (UID: \"ed76acfb-bb3b-47de-ae85-23de2e792e7b\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.866413 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed76acfb-bb3b-47de-ae85-23de2e792e7b-kube-api-access-bh9wk" (OuterVolumeSpecName: "kube-api-access-bh9wk") pod "ed76acfb-bb3b-47de-ae85-23de2e792e7b" (UID: "ed76acfb-bb3b-47de-ae85-23de2e792e7b"). InnerVolumeSpecName "kube-api-access-bh9wk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.868187 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-scripts" (OuterVolumeSpecName: "scripts") pod "ed76acfb-bb3b-47de-ae85-23de2e792e7b" (UID: "ed76acfb-bb3b-47de-ae85-23de2e792e7b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.870706 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ed76acfb-bb3b-47de-ae85-23de2e792e7b" (UID: "ed76acfb-bb3b-47de-ae85-23de2e792e7b"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.873749 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ed76acfb-bb3b-47de-ae85-23de2e792e7b" (UID: "ed76acfb-bb3b-47de-ae85-23de2e792e7b"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.874324 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mc8t9" event={"ID":"ed76acfb-bb3b-47de-ae85-23de2e792e7b","Type":"ContainerDied","Data":"862d777be88dbcc89854ce3dd00093f5671b014a3c866a71f6f944e62cffdd2c"} Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.874362 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="862d777be88dbcc89854ce3dd00093f5671b014a3c866a71f6f944e62cffdd2c" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.874426 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mc8t9" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.885791 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7f9ea679-73ec-46c5-b3b9-25d63398eb35","Type":"ContainerDied","Data":"55281eef0a29c186bce67ca4ff84ee09f28c634f48016b0a99b76a64d0ade9af"} Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.885840 4898 scope.go:117] "RemoveContainer" containerID="9e586b7dba7e52cd9e5256f1170f84b5de3fe664830b8456825bc0fe70ad9fed" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.885957 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.893501 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c","Type":"ContainerDied","Data":"52d408fc697e9533734dc32479d4f8978a1ff97d35d66aafb1e183a3827a836e"} Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.893577 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.920413 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed76acfb-bb3b-47de-ae85-23de2e792e7b" (UID: "ed76acfb-bb3b-47de-ae85-23de2e792e7b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.920479 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-mc8t9"] Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.950704 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-mc8t9"] Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.953210 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-config-data" (OuterVolumeSpecName: "config-data") pod "ed76acfb-bb3b-47de-ae85-23de2e792e7b" (UID: "ed76acfb-bb3b-47de-ae85-23de2e792e7b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.958908 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpxqx\" (UniqueName: \"kubernetes.io/projected/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-kube-api-access-kpxqx\") pod \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.958958 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-internal-tls-certs\") pod \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.959011 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7f9ea679-73ec-46c5-b3b9-25d63398eb35-httpd-run\") pod \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.959047 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-logs\") pod \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.959090 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-public-tls-certs\") pod \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.959125 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-combined-ca-bundle\") pod \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.959154 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-httpd-run\") pod \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.959259 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-config-data\") pod \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.959280 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-combined-ca-bundle\") pod \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.959295 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-scripts\") pod \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.959407 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\") pod \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.959460 4898 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f9ea679-73ec-46c5-b3b9-25d63398eb35-logs\") pod \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.959516 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\") pod \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.959594 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-scripts\") pod \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.959633 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg4fw\" (UniqueName: \"kubernetes.io/projected/7f9ea679-73ec-46c5-b3b9-25d63398eb35-kube-api-access-mg4fw\") pod \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\" (UID: \"7f9ea679-73ec-46c5-b3b9-25d63398eb35\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.959650 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-config-data\") pod \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\" (UID: \"a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c\") " Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.960060 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 
14:20:44.960075 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.960087 4898 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.960096 4898 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.960105 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed76acfb-bb3b-47de-ae85-23de2e792e7b-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.960113 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bh9wk\" (UniqueName: \"kubernetes.io/projected/ed76acfb-bb3b-47de-ae85-23de2e792e7b-kube-api-access-bh9wk\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.960765 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f9ea679-73ec-46c5-b3b9-25d63398eb35-logs" (OuterVolumeSpecName: "logs") pod "7f9ea679-73ec-46c5-b3b9-25d63398eb35" (UID: "7f9ea679-73ec-46c5-b3b9-25d63398eb35"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.960817 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f9ea679-73ec-46c5-b3b9-25d63398eb35-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7f9ea679-73ec-46c5-b3b9-25d63398eb35" (UID: "7f9ea679-73ec-46c5-b3b9-25d63398eb35"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.960855 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-logs" (OuterVolumeSpecName: "logs") pod "a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c" (UID: "a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.961016 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c" (UID: "a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.965663 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-kube-api-access-kpxqx" (OuterVolumeSpecName: "kube-api-access-kpxqx") pod "a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c" (UID: "a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c"). InnerVolumeSpecName "kube-api-access-kpxqx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.967613 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-scripts" (OuterVolumeSpecName: "scripts") pod "a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c" (UID: "a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.968786 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-scripts" (OuterVolumeSpecName: "scripts") pod "7f9ea679-73ec-46c5-b3b9-25d63398eb35" (UID: "7f9ea679-73ec-46c5-b3b9-25d63398eb35"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.969490 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f9ea679-73ec-46c5-b3b9-25d63398eb35-kube-api-access-mg4fw" (OuterVolumeSpecName: "kube-api-access-mg4fw") pod "7f9ea679-73ec-46c5-b3b9-25d63398eb35" (UID: "7f9ea679-73ec-46c5-b3b9-25d63398eb35"). InnerVolumeSpecName "kube-api-access-mg4fw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:20:44 crc kubenswrapper[4898]: I0313 14:20:44.996479 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3" (OuterVolumeSpecName: "glance") pod "7f9ea679-73ec-46c5-b3b9-25d63398eb35" (UID: "7f9ea679-73ec-46c5-b3b9-25d63398eb35"). InnerVolumeSpecName "pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.014780 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c" (UID: "a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.017306 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-ljct7"] Mar 13 14:20:45 crc kubenswrapper[4898]: E0313 14:20:45.030338 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed76acfb-bb3b-47de-ae85-23de2e792e7b" containerName="keystone-bootstrap" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.030375 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed76acfb-bb3b-47de-ae85-23de2e792e7b" containerName="keystone-bootstrap" Mar 13 14:20:45 crc kubenswrapper[4898]: E0313 14:20:45.030390 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c" containerName="glance-httpd" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.030400 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c" containerName="glance-httpd" Mar 13 14:20:45 crc kubenswrapper[4898]: E0313 14:20:45.030414 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10787742-bffc-4545-95cc-8f0354246d7c" containerName="init" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.030422 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="10787742-bffc-4545-95cc-8f0354246d7c" containerName="init" Mar 13 14:20:45 crc kubenswrapper[4898]: E0313 14:20:45.030441 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f9ea679-73ec-46c5-b3b9-25d63398eb35" 
containerName="glance-log" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.030451 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f9ea679-73ec-46c5-b3b9-25d63398eb35" containerName="glance-log" Mar 13 14:20:45 crc kubenswrapper[4898]: E0313 14:20:45.030466 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c" containerName="glance-log" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.030473 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c" containerName="glance-log" Mar 13 14:20:45 crc kubenswrapper[4898]: E0313 14:20:45.030487 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f9ea679-73ec-46c5-b3b9-25d63398eb35" containerName="glance-httpd" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.030494 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f9ea679-73ec-46c5-b3b9-25d63398eb35" containerName="glance-httpd" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.030871 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed76acfb-bb3b-47de-ae85-23de2e792e7b" containerName="keystone-bootstrap" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.030892 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f9ea679-73ec-46c5-b3b9-25d63398eb35" containerName="glance-httpd" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.030925 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c" containerName="glance-log" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.030944 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="10787742-bffc-4545-95cc-8f0354246d7c" containerName="init" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.030953 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f9ea679-73ec-46c5-b3b9-25d63398eb35" containerName="glance-log" Mar 13 
14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.030977 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c" containerName="glance-httpd" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.031828 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ljct7"] Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.034884 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ljct7" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.040814 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7" (OuterVolumeSpecName: "glance") pod "a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c" (UID: "a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c"). InnerVolumeSpecName "pvc-17b3b094-1a55-406a-a787-e0abb588e5b7". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.063392 4898 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\") on node \"crc\" " Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.063451 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.063466 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg4fw\" (UniqueName: \"kubernetes.io/projected/7f9ea679-73ec-46c5-b3b9-25d63398eb35-kube-api-access-mg4fw\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.063481 4898 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-kpxqx\" (UniqueName: \"kubernetes.io/projected/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-kube-api-access-kpxqx\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.063492 4898 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7f9ea679-73ec-46c5-b3b9-25d63398eb35-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.063505 4898 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-logs\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.063535 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.063545 4898 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.063557 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.063577 4898 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\") on node \"crc\" " Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.063590 4898 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f9ea679-73ec-46c5-b3b9-25d63398eb35-logs\") on node \"crc\" 
DevicePath \"\"" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.079476 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7f9ea679-73ec-46c5-b3b9-25d63398eb35" (UID: "7f9ea679-73ec-46c5-b3b9-25d63398eb35"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.095138 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f9ea679-73ec-46c5-b3b9-25d63398eb35" (UID: "7f9ea679-73ec-46c5-b3b9-25d63398eb35"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.102223 4898 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.102355 4898 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-17b3b094-1a55-406a-a787-e0abb588e5b7" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7") on node "crc" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.115441 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c" (UID: "a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.120098 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-config-data" (OuterVolumeSpecName: "config-data") pod "7f9ea679-73ec-46c5-b3b9-25d63398eb35" (UID: "7f9ea679-73ec-46c5-b3b9-25d63398eb35"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.121947 4898 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.122154 4898 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3") on node "crc" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.140882 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-config-data" (OuterVolumeSpecName: "config-data") pod "a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c" (UID: "a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.164763 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-combined-ca-bundle\") pod \"keystone-bootstrap-ljct7\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") " pod="openstack/keystone-bootstrap-ljct7" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.164801 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-credential-keys\") pod \"keystone-bootstrap-ljct7\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") " pod="openstack/keystone-bootstrap-ljct7" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.164819 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-fernet-keys\") pod \"keystone-bootstrap-ljct7\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") " pod="openstack/keystone-bootstrap-ljct7" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.164995 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-config-data\") pod \"keystone-bootstrap-ljct7\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") " pod="openstack/keystone-bootstrap-ljct7" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.165034 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-scripts\") pod \"keystone-bootstrap-ljct7\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") " 
pod="openstack/keystone-bootstrap-ljct7" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.165060 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9s6v\" (UniqueName: \"kubernetes.io/projected/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-kube-api-access-j9s6v\") pod \"keystone-bootstrap-ljct7\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") " pod="openstack/keystone-bootstrap-ljct7" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.165143 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.165189 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.165229 4898 reconciler_common.go:293] "Volume detached for volume \"pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.165243 4898 reconciler_common.go:293] "Volume detached for volume \"pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.165258 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.165268 4898 reconciler_common.go:293] "Volume detached for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f9ea679-73ec-46c5-b3b9-25d63398eb35-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.165279 4898 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.252731 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.267255 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-config-data\") pod \"keystone-bootstrap-ljct7\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") " pod="openstack/keystone-bootstrap-ljct7" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.267362 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-scripts\") pod \"keystone-bootstrap-ljct7\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") " pod="openstack/keystone-bootstrap-ljct7" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.267431 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9s6v\" (UniqueName: \"kubernetes.io/projected/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-kube-api-access-j9s6v\") pod \"keystone-bootstrap-ljct7\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") " pod="openstack/keystone-bootstrap-ljct7" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.267533 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-combined-ca-bundle\") pod 
\"keystone-bootstrap-ljct7\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") " pod="openstack/keystone-bootstrap-ljct7" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.267582 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-credential-keys\") pod \"keystone-bootstrap-ljct7\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") " pod="openstack/keystone-bootstrap-ljct7" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.267608 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-fernet-keys\") pod \"keystone-bootstrap-ljct7\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") " pod="openstack/keystone-bootstrap-ljct7" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.272884 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-config-data\") pod \"keystone-bootstrap-ljct7\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") " pod="openstack/keystone-bootstrap-ljct7" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.274530 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-fernet-keys\") pod \"keystone-bootstrap-ljct7\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") " pod="openstack/keystone-bootstrap-ljct7" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.279513 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-combined-ca-bundle\") pod \"keystone-bootstrap-ljct7\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") " pod="openstack/keystone-bootstrap-ljct7" Mar 13 14:20:45 crc 
kubenswrapper[4898]: I0313 14:20:45.280004 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-scripts\") pod \"keystone-bootstrap-ljct7\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") " pod="openstack/keystone-bootstrap-ljct7" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.297516 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-credential-keys\") pod \"keystone-bootstrap-ljct7\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") " pod="openstack/keystone-bootstrap-ljct7" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.297654 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.306038 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.324656 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9s6v\" (UniqueName: \"kubernetes.io/projected/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-kube-api-access-j9s6v\") pod \"keystone-bootstrap-ljct7\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") " pod="openstack/keystone-bootstrap-ljct7" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.328287 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.339131 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.341474 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.344785 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-t4f8x" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.345030 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.345292 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.345560 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.352517 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.366165 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.372205 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.374665 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.374879 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.382187 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.436219 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-ljct7" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.472805 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-logs\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.472874 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-config-data\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.472922 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.472968 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wpcb\" (UniqueName: \"kubernetes.io/projected/f772c247-f65b-4185-9c75-25d5894ada70-kube-api-access-4wpcb\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.473005 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f772c247-f65b-4185-9c75-25d5894ada70-logs\") pod \"glance-default-external-api-0\" (UID: 
\"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.473040 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.473067 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.473103 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.473125 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.473157 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.473188 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.473213 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.473229 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-scripts\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.473245 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9fs8\" (UniqueName: \"kubernetes.io/projected/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-kube-api-access-n9fs8\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.473268 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.473286 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f772c247-f65b-4185-9c75-25d5894ada70-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.574558 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-logs\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.574644 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-config-data\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.574681 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.574738 4898 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-4wpcb\" (UniqueName: \"kubernetes.io/projected/f772c247-f65b-4185-9c75-25d5894ada70-kube-api-access-4wpcb\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.574767 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f772c247-f65b-4185-9c75-25d5894ada70-logs\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.574795 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.574815 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.574843 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.574872 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.574918 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.574948 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.574970 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.574987 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-scripts\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.575004 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9fs8\" (UniqueName: 
\"kubernetes.io/projected/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-kube-api-access-n9fs8\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.575025 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.575046 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f772c247-f65b-4185-9c75-25d5894ada70-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.575828 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f772c247-f65b-4185-9c75-25d5894ada70-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.576409 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-logs\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.577068 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f772c247-f65b-4185-9c75-25d5894ada70-logs\") pod 
\"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.577438 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.580836 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.581521 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.581612 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.581787 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " 
pod="openstack/glance-default-internal-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.581936 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.582015 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e32bfaa798492a506ddfd6dd81603c6b252f4a286c98ba8256226389647f45c3/globalmount\"" pod="openstack/glance-default-internal-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.582023 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.583025 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.583048 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-config-data\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.583057 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1810628263decbcb8d9790a46f0a2a80fe37ecdd6e2a4c05137bd112c0de5f67/globalmount\"" pod="openstack/glance-default-external-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.583308 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.586865 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-scripts\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.596665 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9fs8\" (UniqueName: \"kubernetes.io/projected/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-kube-api-access-n9fs8\") pod 
\"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.601246 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wpcb\" (UniqueName: \"kubernetes.io/projected/f772c247-f65b-4185-9c75-25d5894ada70-kube-api-access-4wpcb\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.626413 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\") pod \"glance-default-external-api-0\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " pod="openstack/glance-default-external-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.646116 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\") pod \"glance-default-internal-api-0\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.667610 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.696125 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.755080 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f9ea679-73ec-46c5-b3b9-25d63398eb35" path="/var/lib/kubelet/pods/7f9ea679-73ec-46c5-b3b9-25d63398eb35/volumes" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.756273 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c" path="/var/lib/kubelet/pods/a92e3d93-9b4b-4e81-84dc-5cf7aa23f96c/volumes" Mar 13 14:20:45 crc kubenswrapper[4898]: I0313 14:20:45.756820 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed76acfb-bb3b-47de-ae85-23de2e792e7b" path="/var/lib/kubelet/pods/ed76acfb-bb3b-47de-ae85-23de2e792e7b/volumes" Mar 13 14:20:46 crc kubenswrapper[4898]: I0313 14:20:46.523788 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:20:46 crc kubenswrapper[4898]: I0313 14:20:46.607680 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-cz6vf"] Mar 13 14:20:46 crc kubenswrapper[4898]: I0313 14:20:46.607977 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" podUID="9f1520e0-d7d9-4992-9ca5-1b2e98313d33" containerName="dnsmasq-dns" containerID="cri-o://ce178a8ab6289ecc30ada8030035df3e75b119b24ae01060c42a05e394597ae3" gracePeriod=10 Mar 13 14:20:46 crc kubenswrapper[4898]: I0313 14:20:46.930043 4898 generic.go:334] "Generic (PLEG): container finished" podID="9f1520e0-d7d9-4992-9ca5-1b2e98313d33" containerID="ce178a8ab6289ecc30ada8030035df3e75b119b24ae01060c42a05e394597ae3" exitCode=0 Mar 13 14:20:46 crc kubenswrapper[4898]: I0313 14:20:46.930085 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" 
event={"ID":"9f1520e0-d7d9-4992-9ca5-1b2e98313d33","Type":"ContainerDied","Data":"ce178a8ab6289ecc30ada8030035df3e75b119b24ae01060c42a05e394597ae3"} Mar 13 14:20:51 crc kubenswrapper[4898]: I0313 14:20:51.002417 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:51 crc kubenswrapper[4898]: I0313 14:20:51.007816 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:52 crc kubenswrapper[4898]: I0313 14:20:52.027603 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 13 14:20:54 crc kubenswrapper[4898]: I0313 14:20:54.086786 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" podUID="9f1520e0-d7d9-4992-9ca5-1b2e98313d33" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.176:5353: i/o timeout" Mar 13 14:20:56 crc kubenswrapper[4898]: E0313 14:20:56.436067 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Mar 13 14:20:56 crc kubenswrapper[4898]: E0313 14:20:56.436621 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cwfsv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-xq6ss_openstack(ac704482-c7a4-471c-b3c1-d1fdd7e0eb83): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:20:56 crc kubenswrapper[4898]: E0313 14:20:56.437802 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-xq6ss" 
podUID="ac704482-c7a4-471c-b3c1-d1fdd7e0eb83" Mar 13 14:20:57 crc kubenswrapper[4898]: E0313 14:20:57.096213 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-xq6ss" podUID="ac704482-c7a4-471c-b3c1-d1fdd7e0eb83" Mar 13 14:20:59 crc kubenswrapper[4898]: I0313 14:20:59.088288 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" podUID="9f1520e0-d7d9-4992-9ca5-1b2e98313d33" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.176:5353: i/o timeout" Mar 13 14:21:03 crc kubenswrapper[4898]: I0313 14:21:03.156313 4898 generic.go:334] "Generic (PLEG): container finished" podID="664deedc-3946-4205-98ad-21759d35d952" containerID="d1d45dfadc513f7c3e24c556ad6a45d1d7e050a0da971045c5c1181fc58bc3d2" exitCode=0 Mar 13 14:21:03 crc kubenswrapper[4898]: I0313 14:21:03.156881 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hm77q" event={"ID":"664deedc-3946-4205-98ad-21759d35d952","Type":"ContainerDied","Data":"d1d45dfadc513f7c3e24c556ad6a45d1d7e050a0da971045c5c1181fc58bc3d2"} Mar 13 14:21:04 crc kubenswrapper[4898]: I0313 14:21:04.089308 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" podUID="9f1520e0-d7d9-4992-9ca5-1b2e98313d33" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.176:5353: i/o timeout" Mar 13 14:21:04 crc kubenswrapper[4898]: I0313 14:21:04.089553 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" Mar 13 14:21:04 crc kubenswrapper[4898]: I0313 14:21:04.588288 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" Mar 13 14:21:04 crc kubenswrapper[4898]: I0313 14:21:04.677321 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dpzg\" (UniqueName: \"kubernetes.io/projected/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-kube-api-access-9dpzg\") pod \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " Mar 13 14:21:04 crc kubenswrapper[4898]: I0313 14:21:04.677420 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-ovsdbserver-sb\") pod \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " Mar 13 14:21:04 crc kubenswrapper[4898]: I0313 14:21:04.677513 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-config\") pod \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " Mar 13 14:21:04 crc kubenswrapper[4898]: I0313 14:21:04.677580 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-dns-svc\") pod \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " Mar 13 14:21:04 crc kubenswrapper[4898]: I0313 14:21:04.677604 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-dns-swift-storage-0\") pod \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " Mar 13 14:21:04 crc kubenswrapper[4898]: I0313 14:21:04.677659 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-ovsdbserver-nb\") pod \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\" (UID: \"9f1520e0-d7d9-4992-9ca5-1b2e98313d33\") " Mar 13 14:21:04 crc kubenswrapper[4898]: I0313 14:21:04.685094 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-kube-api-access-9dpzg" (OuterVolumeSpecName: "kube-api-access-9dpzg") pod "9f1520e0-d7d9-4992-9ca5-1b2e98313d33" (UID: "9f1520e0-d7d9-4992-9ca5-1b2e98313d33"). InnerVolumeSpecName "kube-api-access-9dpzg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:21:04 crc kubenswrapper[4898]: I0313 14:21:04.734955 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9f1520e0-d7d9-4992-9ca5-1b2e98313d33" (UID: "9f1520e0-d7d9-4992-9ca5-1b2e98313d33"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:21:04 crc kubenswrapper[4898]: I0313 14:21:04.736739 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9f1520e0-d7d9-4992-9ca5-1b2e98313d33" (UID: "9f1520e0-d7d9-4992-9ca5-1b2e98313d33"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:21:04 crc kubenswrapper[4898]: I0313 14:21:04.749661 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9f1520e0-d7d9-4992-9ca5-1b2e98313d33" (UID: "9f1520e0-d7d9-4992-9ca5-1b2e98313d33"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:21:04 crc kubenswrapper[4898]: I0313 14:21:04.751943 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9f1520e0-d7d9-4992-9ca5-1b2e98313d33" (UID: "9f1520e0-d7d9-4992-9ca5-1b2e98313d33"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:21:04 crc kubenswrapper[4898]: I0313 14:21:04.763807 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-config" (OuterVolumeSpecName: "config") pod "9f1520e0-d7d9-4992-9ca5-1b2e98313d33" (UID: "9f1520e0-d7d9-4992-9ca5-1b2e98313d33"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:21:04 crc kubenswrapper[4898]: I0313 14:21:04.781623 4898 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:04 crc kubenswrapper[4898]: I0313 14:21:04.781715 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:04 crc kubenswrapper[4898]: I0313 14:21:04.781735 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dpzg\" (UniqueName: \"kubernetes.io/projected/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-kube-api-access-9dpzg\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:04 crc kubenswrapper[4898]: I0313 14:21:04.781792 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-ovsdbserver-sb\") on 
node \"crc\" DevicePath \"\"" Mar 13 14:21:04 crc kubenswrapper[4898]: I0313 14:21:04.781817 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:04 crc kubenswrapper[4898]: I0313 14:21:04.781834 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f1520e0-d7d9-4992-9ca5-1b2e98313d33-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:05 crc kubenswrapper[4898]: E0313 14:21:05.018281 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Mar 13 14:21:05 crc kubenswrapper[4898]: E0313 14:21:05.018501 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nf6h584h58h5b7hbfh547hbbh5bfh5dfh686hfdhcch67dh648hddh75h58h9bhc8h654h6dh686h96h658h5f4h5f9hbbh667hb9hd4hf6h666q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ffnxt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(247749ae-204b-4e9c-ad1c-f5d924b6f211): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:21:05 crc kubenswrapper[4898]: I0313 14:21:05.180882 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" event={"ID":"9f1520e0-d7d9-4992-9ca5-1b2e98313d33","Type":"ContainerDied","Data":"e9c150eab9ccbad529dce767553bfd6f06b4b8420400e4a2a3ec291dc3f4b819"} Mar 13 14:21:05 crc kubenswrapper[4898]: I0313 14:21:05.180970 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" Mar 13 14:21:05 crc kubenswrapper[4898]: I0313 14:21:05.222142 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-cz6vf"] Mar 13 14:21:05 crc kubenswrapper[4898]: I0313 14:21:05.233782 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-cz6vf"] Mar 13 14:21:05 crc kubenswrapper[4898]: E0313 14:21:05.345257 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified" Mar 13 14:21:05 crc kubenswrapper[4898]: E0313 14:21:05.345717 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:
kube-api-access-28mft,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-zgt75_openstack(84a7fd24-4320-4c0e-8ded-0d455252a549): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:21:05 crc kubenswrapper[4898]: E0313 14:21:05.347096 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-zgt75" podUID="84a7fd24-4320-4c0e-8ded-0d455252a549" Mar 13 14:21:05 crc kubenswrapper[4898]: I0313 14:21:05.364252 4898 scope.go:117] "RemoveContainer" containerID="5d3511c79cb347abef153850ae05dbbda4e475dc7ccff6ee262eef01f5d5c250" Mar 13 14:21:05 crc kubenswrapper[4898]: I0313 14:21:05.378558 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-hm77q" Mar 13 14:21:05 crc kubenswrapper[4898]: I0313 14:21:05.499215 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/664deedc-3946-4205-98ad-21759d35d952-config\") pod \"664deedc-3946-4205-98ad-21759d35d952\" (UID: \"664deedc-3946-4205-98ad-21759d35d952\") " Mar 13 14:21:05 crc kubenswrapper[4898]: I0313 14:21:05.499279 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2nsj\" (UniqueName: \"kubernetes.io/projected/664deedc-3946-4205-98ad-21759d35d952-kube-api-access-s2nsj\") pod \"664deedc-3946-4205-98ad-21759d35d952\" (UID: \"664deedc-3946-4205-98ad-21759d35d952\") " Mar 13 14:21:05 crc kubenswrapper[4898]: I0313 14:21:05.499334 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/664deedc-3946-4205-98ad-21759d35d952-combined-ca-bundle\") pod \"664deedc-3946-4205-98ad-21759d35d952\" (UID: \"664deedc-3946-4205-98ad-21759d35d952\") " Mar 13 14:21:05 crc kubenswrapper[4898]: I0313 14:21:05.502920 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/664deedc-3946-4205-98ad-21759d35d952-kube-api-access-s2nsj" (OuterVolumeSpecName: "kube-api-access-s2nsj") pod "664deedc-3946-4205-98ad-21759d35d952" (UID: "664deedc-3946-4205-98ad-21759d35d952"). InnerVolumeSpecName "kube-api-access-s2nsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:21:05 crc kubenswrapper[4898]: I0313 14:21:05.526702 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/664deedc-3946-4205-98ad-21759d35d952-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "664deedc-3946-4205-98ad-21759d35d952" (UID: "664deedc-3946-4205-98ad-21759d35d952"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:05 crc kubenswrapper[4898]: I0313 14:21:05.535364 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/664deedc-3946-4205-98ad-21759d35d952-config" (OuterVolumeSpecName: "config") pod "664deedc-3946-4205-98ad-21759d35d952" (UID: "664deedc-3946-4205-98ad-21759d35d952"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:05 crc kubenswrapper[4898]: I0313 14:21:05.602458 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/664deedc-3946-4205-98ad-21759d35d952-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:05 crc kubenswrapper[4898]: I0313 14:21:05.602496 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2nsj\" (UniqueName: \"kubernetes.io/projected/664deedc-3946-4205-98ad-21759d35d952-kube-api-access-s2nsj\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:05 crc kubenswrapper[4898]: I0313 14:21:05.602508 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/664deedc-3946-4205-98ad-21759d35d952-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:05 crc kubenswrapper[4898]: I0313 14:21:05.757646 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f1520e0-d7d9-4992-9ca5-1b2e98313d33" path="/var/lib/kubelet/pods/9f1520e0-d7d9-4992-9ca5-1b2e98313d33/volumes" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.201006 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-hm77q" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.200987 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-hm77q" event={"ID":"664deedc-3946-4205-98ad-21759d35d952","Type":"ContainerDied","Data":"6d9f47030df3927c7cd3d661b2414ebcd6b45bbd0a337a5460ac32062ae1c8cc"} Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.201394 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d9f47030df3927c7cd3d661b2414ebcd6b45bbd0a337a5460ac32062ae1c8cc" Mar 13 14:21:06 crc kubenswrapper[4898]: E0313 14:21:06.204014 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified\\\"\"" pod="openstack/heat-db-sync-zgt75" podUID="84a7fd24-4320-4c0e-8ded-0d455252a549" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.450258 4898 scope.go:117] "RemoveContainer" containerID="dfc29ad43e0c709dcb1ef1785b269ec83a25e185502f0001b9412c1b48560e60" Mar 13 14:21:06 crc kubenswrapper[4898]: E0313 14:21:06.506049 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 13 14:21:06 crc kubenswrapper[4898]: E0313 14:21:06.506482 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-82hs7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-ztp6c_openstack(193b05da-acb9-4512-a2ae-6c03450e6f05): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:21:06 crc kubenswrapper[4898]: E0313 14:21:06.509146 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-ztp6c" podUID="193b05da-acb9-4512-a2ae-6c03450e6f05" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.593794 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-7gvqf"] Mar 13 14:21:06 crc kubenswrapper[4898]: E0313 14:21:06.594243 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f1520e0-d7d9-4992-9ca5-1b2e98313d33" containerName="dnsmasq-dns" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.594263 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f1520e0-d7d9-4992-9ca5-1b2e98313d33" containerName="dnsmasq-dns" Mar 13 14:21:06 crc kubenswrapper[4898]: E0313 14:21:06.594287 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f1520e0-d7d9-4992-9ca5-1b2e98313d33" containerName="init" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.594294 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f1520e0-d7d9-4992-9ca5-1b2e98313d33" containerName="init" Mar 13 14:21:06 crc kubenswrapper[4898]: E0313 14:21:06.594333 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="664deedc-3946-4205-98ad-21759d35d952" containerName="neutron-db-sync" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.594339 4898 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="664deedc-3946-4205-98ad-21759d35d952" containerName="neutron-db-sync" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.594532 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="664deedc-3946-4205-98ad-21759d35d952" containerName="neutron-db-sync" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.594551 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f1520e0-d7d9-4992-9ca5-1b2e98313d33" containerName="dnsmasq-dns" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.602517 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.625728 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-7gvqf"] Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.693304 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-f97c64464-wmnph"] Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.695514 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f97c64464-wmnph" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.699928 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.700005 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-db2fv" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.700304 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.700789 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.705941 4898 scope.go:117] "RemoveContainer" containerID="0df5bf75681ada8dd8a40c022de81b8eec9f36333dccaa2326ce28596ea32dd8" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.722788 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f97c64464-wmnph"] Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.746630 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-dns-svc\") pod \"dnsmasq-dns-55f844cf75-7gvqf\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.746771 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-7gvqf\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.747277 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-config\") pod \"dnsmasq-dns-55f844cf75-7gvqf\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.747340 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5758\" (UniqueName: \"kubernetes.io/projected/da9c8289-b4cc-4259-a94e-fab15f437c67-kube-api-access-q5758\") pod \"dnsmasq-dns-55f844cf75-7gvqf\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.747453 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-7gvqf\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.747495 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-7gvqf\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.767644 4898 scope.go:117] "RemoveContainer" containerID="ce178a8ab6289ecc30ada8030035df3e75b119b24ae01060c42a05e394597ae3" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.822881 4898 scope.go:117] "RemoveContainer" containerID="d7d9b7775bd2555a7c4636350359292fb8c65661ac123c18de4ec1ec3c6ad5d5" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.849438 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-config\") pod \"dnsmasq-dns-55f844cf75-7gvqf\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.849496 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5758\" (UniqueName: \"kubernetes.io/projected/da9c8289-b4cc-4259-a94e-fab15f437c67-kube-api-access-q5758\") pod \"dnsmasq-dns-55f844cf75-7gvqf\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.849561 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-httpd-config\") pod \"neutron-f97c64464-wmnph\" (UID: \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\") " pod="openstack/neutron-f97c64464-wmnph" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.849582 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-7gvqf\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.849601 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-7gvqf\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.849644 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-config\") pod \"neutron-f97c64464-wmnph\" (UID: \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\") " pod="openstack/neutron-f97c64464-wmnph" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.849666 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rskd\" (UniqueName: \"kubernetes.io/projected/61f1f8bf-63eb-464c-9703-3d3db80ba0df-kube-api-access-6rskd\") pod \"neutron-f97c64464-wmnph\" (UID: \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\") " pod="openstack/neutron-f97c64464-wmnph" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.849739 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-combined-ca-bundle\") pod \"neutron-f97c64464-wmnph\" (UID: \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\") " pod="openstack/neutron-f97c64464-wmnph" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.849805 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-dns-svc\") pod \"dnsmasq-dns-55f844cf75-7gvqf\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.849842 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-7gvqf\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.849863 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-ovndb-tls-certs\") pod \"neutron-f97c64464-wmnph\" (UID: \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\") " pod="openstack/neutron-f97c64464-wmnph" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.852995 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-7gvqf\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.854015 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-7gvqf\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.854015 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-dns-svc\") pod \"dnsmasq-dns-55f844cf75-7gvqf\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.855208 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-config\") pod \"dnsmasq-dns-55f844cf75-7gvqf\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.857397 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-7gvqf\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.872913 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5758\" (UniqueName: \"kubernetes.io/projected/da9c8289-b4cc-4259-a94e-fab15f437c67-kube-api-access-q5758\") pod \"dnsmasq-dns-55f844cf75-7gvqf\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.952391 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-httpd-config\") pod \"neutron-f97c64464-wmnph\" (UID: \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\") " pod="openstack/neutron-f97c64464-wmnph" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.952472 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-config\") pod \"neutron-f97c64464-wmnph\" (UID: \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\") " pod="openstack/neutron-f97c64464-wmnph" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.952508 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rskd\" (UniqueName: \"kubernetes.io/projected/61f1f8bf-63eb-464c-9703-3d3db80ba0df-kube-api-access-6rskd\") pod \"neutron-f97c64464-wmnph\" (UID: \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\") " pod="openstack/neutron-f97c64464-wmnph" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.952568 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-combined-ca-bundle\") pod 
\"neutron-f97c64464-wmnph\" (UID: \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\") " pod="openstack/neutron-f97c64464-wmnph" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.952664 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-ovndb-tls-certs\") pod \"neutron-f97c64464-wmnph\" (UID: \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\") " pod="openstack/neutron-f97c64464-wmnph" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.953581 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.958031 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-ovndb-tls-certs\") pod \"neutron-f97c64464-wmnph\" (UID: \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\") " pod="openstack/neutron-f97c64464-wmnph" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.959692 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-config\") pod \"neutron-f97c64464-wmnph\" (UID: \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\") " pod="openstack/neutron-f97c64464-wmnph" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.961132 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-httpd-config\") pod \"neutron-f97c64464-wmnph\" (UID: \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\") " pod="openstack/neutron-f97c64464-wmnph" Mar 13 14:21:06 crc kubenswrapper[4898]: I0313 14:21:06.971269 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-combined-ca-bundle\") pod \"neutron-f97c64464-wmnph\" (UID: \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\") " pod="openstack/neutron-f97c64464-wmnph" Mar 13 14:21:07 crc kubenswrapper[4898]: I0313 14:21:06.999525 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rskd\" (UniqueName: \"kubernetes.io/projected/61f1f8bf-63eb-464c-9703-3d3db80ba0df-kube-api-access-6rskd\") pod \"neutron-f97c64464-wmnph\" (UID: \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\") " pod="openstack/neutron-f97c64464-wmnph" Mar 13 14:21:07 crc kubenswrapper[4898]: I0313 14:21:07.023869 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f97c64464-wmnph" Mar 13 14:21:07 crc kubenswrapper[4898]: I0313 14:21:07.285593 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-dddqm" event={"ID":"51a3e0c5-0084-4216-a162-3614eafcc162","Type":"ContainerStarted","Data":"27760265b5d44dc57e3a3eecff9d010cc5fc5af8472653848b227f366d4e7a49"} Mar 13 14:21:07 crc kubenswrapper[4898]: E0313 14:21:07.290926 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-ztp6c" podUID="193b05da-acb9-4512-a2ae-6c03450e6f05" Mar 13 14:21:07 crc kubenswrapper[4898]: I0313 14:21:07.327957 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-dddqm" podStartSLOduration=4.672014055 podStartE2EDuration="32.327942241s" podCreationTimestamp="2026-03-13 14:20:35 +0000 UTC" firstStartedPulling="2026-03-13 14:20:37.69029243 +0000 UTC m=+1472.691880669" lastFinishedPulling="2026-03-13 14:21:05.346220616 +0000 UTC m=+1500.347808855" observedRunningTime="2026-03-13 14:21:07.327746385 +0000 UTC 
m=+1502.329334644" watchObservedRunningTime="2026-03-13 14:21:07.327942241 +0000 UTC m=+1502.329530480" Mar 13 14:21:07 crc kubenswrapper[4898]: I0313 14:21:07.407370 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ljct7"] Mar 13 14:21:07 crc kubenswrapper[4898]: W0313 14:21:07.432537 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f68a4dd_fec8_4e60_a89c_69ce09fc5700.slice/crio-b5807007919517093647e5507bfed140e507af6921560a0e8eff012955dab574 WatchSource:0}: Error finding container b5807007919517093647e5507bfed140e507af6921560a0e8eff012955dab574: Status 404 returned error can't find the container with id b5807007919517093647e5507bfed140e507af6921560a0e8eff012955dab574 Mar 13 14:21:07 crc kubenswrapper[4898]: I0313 14:21:07.517033 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 14:21:07 crc kubenswrapper[4898]: I0313 14:21:07.611531 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 14:21:07 crc kubenswrapper[4898]: I0313 14:21:07.654170 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-7gvqf"] Mar 13 14:21:07 crc kubenswrapper[4898]: I0313 14:21:07.974253 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f97c64464-wmnph"] Mar 13 14:21:08 crc kubenswrapper[4898]: W0313 14:21:08.129522 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61f1f8bf_63eb_464c_9703_3d3db80ba0df.slice/crio-0ed18fce368036b02906d995e0994f8f4a7d0e035afb43b24689faf9dc556daf WatchSource:0}: Error finding container 0ed18fce368036b02906d995e0994f8f4a7d0e035afb43b24689faf9dc556daf: Status 404 returned error can't find the container with id 0ed18fce368036b02906d995e0994f8f4a7d0e035afb43b24689faf9dc556daf 
Mar 13 14:21:08 crc kubenswrapper[4898]: W0313 14:21:08.131705 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8312dc9_a2b4_4ee6_b34f_cb984c14ad21.slice/crio-8cae8ab663d1286eb25519e52a961629328c7b2280f2d0209ef1752767b57b37 WatchSource:0}: Error finding container 8cae8ab663d1286eb25519e52a961629328c7b2280f2d0209ef1752767b57b37: Status 404 returned error can't find the container with id 8cae8ab663d1286eb25519e52a961629328c7b2280f2d0209ef1752767b57b37 Mar 13 14:21:08 crc kubenswrapper[4898]: W0313 14:21:08.132968 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda9c8289_b4cc_4259_a94e_fab15f437c67.slice/crio-2e5e30a7ea9bdd40efe150ba12afa0ed754dfac9e9a1ebb3308dac4a35e5fade WatchSource:0}: Error finding container 2e5e30a7ea9bdd40efe150ba12afa0ed754dfac9e9a1ebb3308dac4a35e5fade: Status 404 returned error can't find the container with id 2e5e30a7ea9bdd40efe150ba12afa0ed754dfac9e9a1ebb3308dac4a35e5fade Mar 13 14:21:08 crc kubenswrapper[4898]: I0313 14:21:08.334133 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21","Type":"ContainerStarted","Data":"8cae8ab663d1286eb25519e52a961629328c7b2280f2d0209ef1752767b57b37"} Mar 13 14:21:08 crc kubenswrapper[4898]: I0313 14:21:08.353685 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f97c64464-wmnph" event={"ID":"61f1f8bf-63eb-464c-9703-3d3db80ba0df","Type":"ContainerStarted","Data":"0ed18fce368036b02906d995e0994f8f4a7d0e035afb43b24689faf9dc556daf"} Mar 13 14:21:08 crc kubenswrapper[4898]: I0313 14:21:08.356859 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" 
event={"ID":"da9c8289-b4cc-4259-a94e-fab15f437c67","Type":"ContainerStarted","Data":"2e5e30a7ea9bdd40efe150ba12afa0ed754dfac9e9a1ebb3308dac4a35e5fade"} Mar 13 14:21:08 crc kubenswrapper[4898]: I0313 14:21:08.358378 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ljct7" event={"ID":"0f68a4dd-fec8-4e60-a89c-69ce09fc5700","Type":"ContainerStarted","Data":"b5807007919517093647e5507bfed140e507af6921560a0e8eff012955dab574"} Mar 13 14:21:08 crc kubenswrapper[4898]: I0313 14:21:08.374621 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f772c247-f65b-4185-9c75-25d5894ada70","Type":"ContainerStarted","Data":"ddd1664b14e1ff4c8657d63bc705f6e2cc8530fd54bcfec783c314238117e1e0"} Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.089777 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-cz6vf" podUID="9f1520e0-d7d9-4992-9ca5-1b2e98313d33" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.176:5353: i/o timeout" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.161330 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-z7ldc"] Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.164502 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z7ldc" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.171073 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2469\" (UniqueName: \"kubernetes.io/projected/b38f3681-6f2f-437f-9694-810d43921aa2-kube-api-access-v2469\") pod \"redhat-operators-z7ldc\" (UID: \"b38f3681-6f2f-437f-9694-810d43921aa2\") " pod="openshift-marketplace/redhat-operators-z7ldc" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.171213 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b38f3681-6f2f-437f-9694-810d43921aa2-catalog-content\") pod \"redhat-operators-z7ldc\" (UID: \"b38f3681-6f2f-437f-9694-810d43921aa2\") " pod="openshift-marketplace/redhat-operators-z7ldc" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.171400 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b38f3681-6f2f-437f-9694-810d43921aa2-utilities\") pod \"redhat-operators-z7ldc\" (UID: \"b38f3681-6f2f-437f-9694-810d43921aa2\") " pod="openshift-marketplace/redhat-operators-z7ldc" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.196425 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z7ldc"] Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.258635 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-8495ffcdcc-j7d29"] Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.260959 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.268543 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.268713 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.275790 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-public-tls-certs\") pod \"neutron-8495ffcdcc-j7d29\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.275844 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-config\") pod \"neutron-8495ffcdcc-j7d29\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.275913 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-internal-tls-certs\") pod \"neutron-8495ffcdcc-j7d29\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.275941 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2469\" (UniqueName: \"kubernetes.io/projected/b38f3681-6f2f-437f-9694-810d43921aa2-kube-api-access-v2469\") pod \"redhat-operators-z7ldc\" (UID: \"b38f3681-6f2f-437f-9694-810d43921aa2\") " pod="openshift-marketplace/redhat-operators-z7ldc" Mar 
13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.276005 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvf8x\" (UniqueName: \"kubernetes.io/projected/194cc0b9-5fb1-492c-9df1-002f629cfb90-kube-api-access-pvf8x\") pod \"neutron-8495ffcdcc-j7d29\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.276041 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b38f3681-6f2f-437f-9694-810d43921aa2-catalog-content\") pod \"redhat-operators-z7ldc\" (UID: \"b38f3681-6f2f-437f-9694-810d43921aa2\") " pod="openshift-marketplace/redhat-operators-z7ldc" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.276096 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-combined-ca-bundle\") pod \"neutron-8495ffcdcc-j7d29\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.276133 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-httpd-config\") pod \"neutron-8495ffcdcc-j7d29\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.276181 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b38f3681-6f2f-437f-9694-810d43921aa2-utilities\") pod \"redhat-operators-z7ldc\" (UID: \"b38f3681-6f2f-437f-9694-810d43921aa2\") " pod="openshift-marketplace/redhat-operators-z7ldc" Mar 13 14:21:09 crc 
kubenswrapper[4898]: I0313 14:21:09.276238 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-ovndb-tls-certs\") pod \"neutron-8495ffcdcc-j7d29\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.277922 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b38f3681-6f2f-437f-9694-810d43921aa2-catalog-content\") pod \"redhat-operators-z7ldc\" (UID: \"b38f3681-6f2f-437f-9694-810d43921aa2\") " pod="openshift-marketplace/redhat-operators-z7ldc" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.278065 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b38f3681-6f2f-437f-9694-810d43921aa2-utilities\") pod \"redhat-operators-z7ldc\" (UID: \"b38f3681-6f2f-437f-9694-810d43921aa2\") " pod="openshift-marketplace/redhat-operators-z7ldc" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.294227 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8495ffcdcc-j7d29"] Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.300793 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2469\" (UniqueName: \"kubernetes.io/projected/b38f3681-6f2f-437f-9694-810d43921aa2-kube-api-access-v2469\") pod \"redhat-operators-z7ldc\" (UID: \"b38f3681-6f2f-437f-9694-810d43921aa2\") " pod="openshift-marketplace/redhat-operators-z7ldc" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.379858 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvf8x\" (UniqueName: \"kubernetes.io/projected/194cc0b9-5fb1-492c-9df1-002f629cfb90-kube-api-access-pvf8x\") pod \"neutron-8495ffcdcc-j7d29\" (UID: 
\"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.380114 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-combined-ca-bundle\") pod \"neutron-8495ffcdcc-j7d29\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.380263 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-httpd-config\") pod \"neutron-8495ffcdcc-j7d29\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.380505 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-ovndb-tls-certs\") pod \"neutron-8495ffcdcc-j7d29\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.380608 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-public-tls-certs\") pod \"neutron-8495ffcdcc-j7d29\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.380640 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-config\") pod \"neutron-8495ffcdcc-j7d29\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc 
kubenswrapper[4898]: I0313 14:21:09.380696 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-internal-tls-certs\") pod \"neutron-8495ffcdcc-j7d29\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.386485 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-internal-tls-certs\") pod \"neutron-8495ffcdcc-j7d29\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.392285 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-public-tls-certs\") pod \"neutron-8495ffcdcc-j7d29\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.395083 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-ovndb-tls-certs\") pod \"neutron-8495ffcdcc-j7d29\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.397080 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-combined-ca-bundle\") pod \"neutron-8495ffcdcc-j7d29\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.398226 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-config\") pod \"neutron-8495ffcdcc-j7d29\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.398671 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-httpd-config\") pod \"neutron-8495ffcdcc-j7d29\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.405753 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvf8x\" (UniqueName: \"kubernetes.io/projected/194cc0b9-5fb1-492c-9df1-002f629cfb90-kube-api-access-pvf8x\") pod \"neutron-8495ffcdcc-j7d29\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.417613 4898 generic.go:334] "Generic (PLEG): container finished" podID="da9c8289-b4cc-4259-a94e-fab15f437c67" containerID="ebec6588ea54fc0e12abfc618cba17e32b0384e26af2c4dc5438fd6e04229c34" exitCode=0 Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.417681 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" event={"ID":"da9c8289-b4cc-4259-a94e-fab15f437c67","Type":"ContainerDied","Data":"ebec6588ea54fc0e12abfc618cba17e32b0384e26af2c4dc5438fd6e04229c34"} Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.425103 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ljct7" event={"ID":"0f68a4dd-fec8-4e60-a89c-69ce09fc5700","Type":"ContainerStarted","Data":"ededb2c682deb7693ab4f5295aba59c30d96d3df7afffb98859fdbc9fc5b9e13"} Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.447668 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"247749ae-204b-4e9c-ad1c-f5d924b6f211","Type":"ContainerStarted","Data":"9dee453ab58c346884f20f75be928701596b9ae0bdbbb025979e1e2daaa907c1"} Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.457311 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f772c247-f65b-4185-9c75-25d5894ada70","Type":"ContainerStarted","Data":"3ed25c8fea8488646787dd34274ccdeef192849b7dfb335966843f92351f741e"} Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.459245 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21","Type":"ContainerStarted","Data":"e53933d3c42586f8c8f9ea54060e1af27f64d65b3366e751904358fe342bcf4d"} Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.477082 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f97c64464-wmnph" event={"ID":"61f1f8bf-63eb-464c-9703-3d3db80ba0df","Type":"ContainerStarted","Data":"fc322fcfa0d128d1cbcd5fd7cc8972df0d6b4b808a700c1468ceb21cb71602e4"} Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.477127 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f97c64464-wmnph" event={"ID":"61f1f8bf-63eb-464c-9703-3d3db80ba0df","Type":"ContainerStarted","Data":"6c47c58bce6d5a27ce8f0a9ec972dc3740b57217b054bd621f4319ad64cb9ded"} Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.477991 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-f97c64464-wmnph" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.491404 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-ljct7" podStartSLOduration=25.491388752 podStartE2EDuration="25.491388752s" podCreationTimestamp="2026-03-13 14:20:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-03-13 14:21:09.485413157 +0000 UTC m=+1504.487001406" watchObservedRunningTime="2026-03-13 14:21:09.491388752 +0000 UTC m=+1504.492976991" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.520163 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z7ldc" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.581282 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-f97c64464-wmnph" podStartSLOduration=3.581256105 podStartE2EDuration="3.581256105s" podCreationTimestamp="2026-03-13 14:21:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:21:09.556469572 +0000 UTC m=+1504.558057821" watchObservedRunningTime="2026-03-13 14:21:09.581256105 +0000 UTC m=+1504.582844344" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.607831 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:09 crc kubenswrapper[4898]: I0313 14:21:09.818466 4898 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod10787742-bffc-4545-95cc-8f0354246d7c"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod10787742-bffc-4545-95cc-8f0354246d7c] : Timed out while waiting for systemd to remove kubepods-besteffort-pod10787742_bffc_4545_95cc_8f0354246d7c.slice" Mar 13 14:21:09 crc kubenswrapper[4898]: E0313 14:21:09.818506 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod10787742-bffc-4545-95cc-8f0354246d7c] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod10787742-bffc-4545-95cc-8f0354246d7c] : Timed out while waiting for systemd to remove kubepods-besteffort-pod10787742_bffc_4545_95cc_8f0354246d7c.slice" pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" podUID="10787742-bffc-4545-95cc-8f0354246d7c" Mar 13 14:21:10 crc kubenswrapper[4898]: I0313 14:21:10.328252 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z7ldc"] Mar 13 14:21:10 crc kubenswrapper[4898]: W0313 14:21:10.355517 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb38f3681_6f2f_437f_9694_810d43921aa2.slice/crio-14f6fca3b948b0af4024656eeff03f1e8c8fe427562c97293c95f1bdd3284d24 WatchSource:0}: Error finding container 14f6fca3b948b0af4024656eeff03f1e8c8fe427562c97293c95f1bdd3284d24: Status 404 returned error can't find the container with id 14f6fca3b948b0af4024656eeff03f1e8c8fe427562c97293c95f1bdd3284d24 Mar 13 14:21:10 crc kubenswrapper[4898]: I0313 14:21:10.505102 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" 
event={"ID":"da9c8289-b4cc-4259-a94e-fab15f437c67","Type":"ContainerStarted","Data":"ec23679d2099538d875f32f1740477e86fa4744d0786a72fbd40a7500fbf13f8"} Mar 13 14:21:10 crc kubenswrapper[4898]: I0313 14:21:10.522934 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:10 crc kubenswrapper[4898]: I0313 14:21:10.545742 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" podStartSLOduration=4.545718503 podStartE2EDuration="4.545718503s" podCreationTimestamp="2026-03-13 14:21:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:21:10.522818398 +0000 UTC m=+1505.524406637" watchObservedRunningTime="2026-03-13 14:21:10.545718503 +0000 UTC m=+1505.547306742" Mar 13 14:21:10 crc kubenswrapper[4898]: I0313 14:21:10.558486 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f772c247-f65b-4185-9c75-25d5894ada70","Type":"ContainerStarted","Data":"53ad13c5a81b4a9991a57956cff297d785da3080bb5eafedd89860a32e28cd6a"} Mar 13 14:21:10 crc kubenswrapper[4898]: I0313 14:21:10.571140 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8495ffcdcc-j7d29"] Mar 13 14:21:10 crc kubenswrapper[4898]: I0313 14:21:10.572222 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7ldc" event={"ID":"b38f3681-6f2f-437f-9694-810d43921aa2","Type":"ContainerStarted","Data":"14f6fca3b948b0af4024656eeff03f1e8c8fe427562c97293c95f1bdd3284d24"} Mar 13 14:21:10 crc kubenswrapper[4898]: I0313 14:21:10.595005 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=25.594987401 podStartE2EDuration="25.594987401s" podCreationTimestamp="2026-03-13 14:20:45 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:21:10.58759576 +0000 UTC m=+1505.589184019" watchObservedRunningTime="2026-03-13 14:21:10.594987401 +0000 UTC m=+1505.596575640" Mar 13 14:21:10 crc kubenswrapper[4898]: I0313 14:21:10.603290 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-jcdtp" Mar 13 14:21:10 crc kubenswrapper[4898]: I0313 14:21:10.604456 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21","Type":"ContainerStarted","Data":"58217feb23785534a405127b1afdf5fb04709d2e86145617a9cb9b01bf24630f"} Mar 13 14:21:10 crc kubenswrapper[4898]: I0313 14:21:10.647147 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=25.647123115 podStartE2EDuration="25.647123115s" podCreationTimestamp="2026-03-13 14:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:21:10.635451612 +0000 UTC m=+1505.637039871" watchObservedRunningTime="2026-03-13 14:21:10.647123115 +0000 UTC m=+1505.648711354" Mar 13 14:21:10 crc kubenswrapper[4898]: I0313 14:21:10.861428 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-jcdtp"] Mar 13 14:21:10 crc kubenswrapper[4898]: I0313 14:21:10.897385 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-jcdtp"] Mar 13 14:21:11 crc kubenswrapper[4898]: I0313 14:21:11.615713 4898 generic.go:334] "Generic (PLEG): container finished" podID="b38f3681-6f2f-437f-9694-810d43921aa2" containerID="d6ed263f1fe660123646c8c6128f780dbe747c9b3a543fa08475d3acfc1517d5" exitCode=0 Mar 13 14:21:11 crc kubenswrapper[4898]: I0313 14:21:11.615801 4898 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7ldc" event={"ID":"b38f3681-6f2f-437f-9694-810d43921aa2","Type":"ContainerDied","Data":"d6ed263f1fe660123646c8c6128f780dbe747c9b3a543fa08475d3acfc1517d5"} Mar 13 14:21:11 crc kubenswrapper[4898]: I0313 14:21:11.618232 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8495ffcdcc-j7d29" event={"ID":"194cc0b9-5fb1-492c-9df1-002f629cfb90","Type":"ContainerStarted","Data":"0cd127d8d8dce19d301ba8f94cb9ff0fc6150499598490bd11d503771e98d4fc"} Mar 13 14:21:11 crc kubenswrapper[4898]: I0313 14:21:11.618264 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8495ffcdcc-j7d29" event={"ID":"194cc0b9-5fb1-492c-9df1-002f629cfb90","Type":"ContainerStarted","Data":"0d21f5d009c2fbb3d9136543ddc9edaf66018231eb09d0ea5bf4fa35c2144f9b"} Mar 13 14:21:11 crc kubenswrapper[4898]: I0313 14:21:11.618274 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8495ffcdcc-j7d29" event={"ID":"194cc0b9-5fb1-492c-9df1-002f629cfb90","Type":"ContainerStarted","Data":"675cba3392edee8aa8b36b03aeac2453edffb374fe9e8c521c269c0464cb1478"} Mar 13 14:21:11 crc kubenswrapper[4898]: I0313 14:21:11.618379 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:11 crc kubenswrapper[4898]: I0313 14:21:11.619764 4898 generic.go:334] "Generic (PLEG): container finished" podID="51a3e0c5-0084-4216-a162-3614eafcc162" containerID="27760265b5d44dc57e3a3eecff9d010cc5fc5af8472653848b227f366d4e7a49" exitCode=0 Mar 13 14:21:11 crc kubenswrapper[4898]: I0313 14:21:11.620218 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-dddqm" event={"ID":"51a3e0c5-0084-4216-a162-3614eafcc162","Type":"ContainerDied","Data":"27760265b5d44dc57e3a3eecff9d010cc5fc5af8472653848b227f366d4e7a49"} Mar 13 14:21:11 crc kubenswrapper[4898]: I0313 14:21:11.669954 4898 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-8495ffcdcc-j7d29" podStartSLOduration=2.669626349 podStartE2EDuration="2.669626349s" podCreationTimestamp="2026-03-13 14:21:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:21:11.65426425 +0000 UTC m=+1506.655852489" watchObservedRunningTime="2026-03-13 14:21:11.669626349 +0000 UTC m=+1506.671214588" Mar 13 14:21:11 crc kubenswrapper[4898]: I0313 14:21:11.771135 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10787742-bffc-4545-95cc-8f0354246d7c" path="/var/lib/kubelet/pods/10787742-bffc-4545-95cc-8f0354246d7c/volumes" Mar 13 14:21:12 crc kubenswrapper[4898]: I0313 14:21:12.634454 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xq6ss" event={"ID":"ac704482-c7a4-471c-b3c1-d1fdd7e0eb83","Type":"ContainerStarted","Data":"e195371c387c0ec69bbadee68addfd45715bbfc83433b2ba3c47b307af7325bd"} Mar 13 14:21:12 crc kubenswrapper[4898]: I0313 14:21:12.655079 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-xq6ss" podStartSLOduration=4.005064622 podStartE2EDuration="37.655060551s" podCreationTimestamp="2026-03-13 14:20:35 +0000 UTC" firstStartedPulling="2026-03-13 14:20:37.718328168 +0000 UTC m=+1472.719916407" lastFinishedPulling="2026-03-13 14:21:11.368324107 +0000 UTC m=+1506.369912336" observedRunningTime="2026-03-13 14:21:12.652283228 +0000 UTC m=+1507.653871477" watchObservedRunningTime="2026-03-13 14:21:12.655060551 +0000 UTC m=+1507.656648800" Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.529845 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-dddqm" Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.609671 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51a3e0c5-0084-4216-a162-3614eafcc162-logs\") pod \"51a3e0c5-0084-4216-a162-3614eafcc162\" (UID: \"51a3e0c5-0084-4216-a162-3614eafcc162\") " Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.609746 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-st255\" (UniqueName: \"kubernetes.io/projected/51a3e0c5-0084-4216-a162-3614eafcc162-kube-api-access-st255\") pod \"51a3e0c5-0084-4216-a162-3614eafcc162\" (UID: \"51a3e0c5-0084-4216-a162-3614eafcc162\") " Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.610113 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51a3e0c5-0084-4216-a162-3614eafcc162-scripts\") pod \"51a3e0c5-0084-4216-a162-3614eafcc162\" (UID: \"51a3e0c5-0084-4216-a162-3614eafcc162\") " Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.610208 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51a3e0c5-0084-4216-a162-3614eafcc162-config-data\") pod \"51a3e0c5-0084-4216-a162-3614eafcc162\" (UID: \"51a3e0c5-0084-4216-a162-3614eafcc162\") " Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.610240 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51a3e0c5-0084-4216-a162-3614eafcc162-logs" (OuterVolumeSpecName: "logs") pod "51a3e0c5-0084-4216-a162-3614eafcc162" (UID: "51a3e0c5-0084-4216-a162-3614eafcc162"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.610263 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51a3e0c5-0084-4216-a162-3614eafcc162-combined-ca-bundle\") pod \"51a3e0c5-0084-4216-a162-3614eafcc162\" (UID: \"51a3e0c5-0084-4216-a162-3614eafcc162\") " Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.611033 4898 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51a3e0c5-0084-4216-a162-3614eafcc162-logs\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.616598 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51a3e0c5-0084-4216-a162-3614eafcc162-scripts" (OuterVolumeSpecName: "scripts") pod "51a3e0c5-0084-4216-a162-3614eafcc162" (UID: "51a3e0c5-0084-4216-a162-3614eafcc162"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.622088 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51a3e0c5-0084-4216-a162-3614eafcc162-kube-api-access-st255" (OuterVolumeSpecName: "kube-api-access-st255") pod "51a3e0c5-0084-4216-a162-3614eafcc162" (UID: "51a3e0c5-0084-4216-a162-3614eafcc162"). InnerVolumeSpecName "kube-api-access-st255". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.649574 4898 generic.go:334] "Generic (PLEG): container finished" podID="0f68a4dd-fec8-4e60-a89c-69ce09fc5700" containerID="ededb2c682deb7693ab4f5295aba59c30d96d3df7afffb98859fdbc9fc5b9e13" exitCode=0
Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.649661 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ljct7" event={"ID":"0f68a4dd-fec8-4e60-a89c-69ce09fc5700","Type":"ContainerDied","Data":"ededb2c682deb7693ab4f5295aba59c30d96d3df7afffb98859fdbc9fc5b9e13"}
Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.651776 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51a3e0c5-0084-4216-a162-3614eafcc162-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51a3e0c5-0084-4216-a162-3614eafcc162" (UID: "51a3e0c5-0084-4216-a162-3614eafcc162"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.660789 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-dddqm" event={"ID":"51a3e0c5-0084-4216-a162-3614eafcc162","Type":"ContainerDied","Data":"4656230f194edb795af2740a5c0bb83bbde4d4a8b2fd3cee4accb6bdc572ea5d"}
Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.660837 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4656230f194edb795af2740a5c0bb83bbde4d4a8b2fd3cee4accb6bdc572ea5d"
Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.661179 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-dddqm"
Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.672338 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51a3e0c5-0084-4216-a162-3614eafcc162-config-data" (OuterVolumeSpecName: "config-data") pod "51a3e0c5-0084-4216-a162-3614eafcc162" (UID: "51a3e0c5-0084-4216-a162-3614eafcc162"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.723517 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-st255\" (UniqueName: \"kubernetes.io/projected/51a3e0c5-0084-4216-a162-3614eafcc162-kube-api-access-st255\") on node \"crc\" DevicePath \"\""
Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.723552 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51a3e0c5-0084-4216-a162-3614eafcc162-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.723562 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51a3e0c5-0084-4216-a162-3614eafcc162-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.723573 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51a3e0c5-0084-4216-a162-3614eafcc162-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.844037 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5bf5d8b7d4-4gwxr"]
Mar 13 14:21:13 crc kubenswrapper[4898]: E0313 14:21:13.844589 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51a3e0c5-0084-4216-a162-3614eafcc162" containerName="placement-db-sync"
Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.844610 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="51a3e0c5-0084-4216-a162-3614eafcc162" containerName="placement-db-sync"
Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.844806 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="51a3e0c5-0084-4216-a162-3614eafcc162" containerName="placement-db-sync"
Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.846056 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5bf5d8b7d4-4gwxr"
Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.850378 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.850624 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.876353 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5bf5d8b7d4-4gwxr"]
Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.935765 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-scripts\") pod \"placement-5bf5d8b7d4-4gwxr\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " pod="openstack/placement-5bf5d8b7d4-4gwxr"
Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.936060 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-public-tls-certs\") pod \"placement-5bf5d8b7d4-4gwxr\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " pod="openstack/placement-5bf5d8b7d4-4gwxr"
Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.936240 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-internal-tls-certs\") pod \"placement-5bf5d8b7d4-4gwxr\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " pod="openstack/placement-5bf5d8b7d4-4gwxr"
Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.936286 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-config-data\") pod \"placement-5bf5d8b7d4-4gwxr\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " pod="openstack/placement-5bf5d8b7d4-4gwxr"
Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.936306 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-combined-ca-bundle\") pod \"placement-5bf5d8b7d4-4gwxr\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " pod="openstack/placement-5bf5d8b7d4-4gwxr"
Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.936436 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/604a0205-6c18-4bff-929f-038524d62aeb-logs\") pod \"placement-5bf5d8b7d4-4gwxr\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " pod="openstack/placement-5bf5d8b7d4-4gwxr"
Mar 13 14:21:13 crc kubenswrapper[4898]: I0313 14:21:13.936528 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lcxt\" (UniqueName: \"kubernetes.io/projected/604a0205-6c18-4bff-929f-038524d62aeb-kube-api-access-8lcxt\") pod \"placement-5bf5d8b7d4-4gwxr\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " pod="openstack/placement-5bf5d8b7d4-4gwxr"
Mar 13 14:21:14 crc kubenswrapper[4898]: I0313 14:21:14.037891 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-scripts\") pod \"placement-5bf5d8b7d4-4gwxr\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " pod="openstack/placement-5bf5d8b7d4-4gwxr"
Mar 13 14:21:14 crc kubenswrapper[4898]: I0313 14:21:14.037964 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-public-tls-certs\") pod \"placement-5bf5d8b7d4-4gwxr\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " pod="openstack/placement-5bf5d8b7d4-4gwxr"
Mar 13 14:21:14 crc kubenswrapper[4898]: I0313 14:21:14.038016 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-internal-tls-certs\") pod \"placement-5bf5d8b7d4-4gwxr\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " pod="openstack/placement-5bf5d8b7d4-4gwxr"
Mar 13 14:21:14 crc kubenswrapper[4898]: I0313 14:21:14.038037 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-combined-ca-bundle\") pod \"placement-5bf5d8b7d4-4gwxr\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " pod="openstack/placement-5bf5d8b7d4-4gwxr"
Mar 13 14:21:14 crc kubenswrapper[4898]: I0313 14:21:14.038053 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-config-data\") pod \"placement-5bf5d8b7d4-4gwxr\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " pod="openstack/placement-5bf5d8b7d4-4gwxr"
Mar 13 14:21:14 crc kubenswrapper[4898]: I0313 14:21:14.038103 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/604a0205-6c18-4bff-929f-038524d62aeb-logs\") pod \"placement-5bf5d8b7d4-4gwxr\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " pod="openstack/placement-5bf5d8b7d4-4gwxr"
Mar 13 14:21:14 crc kubenswrapper[4898]: I0313 14:21:14.038139 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lcxt\" (UniqueName: \"kubernetes.io/projected/604a0205-6c18-4bff-929f-038524d62aeb-kube-api-access-8lcxt\") pod \"placement-5bf5d8b7d4-4gwxr\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " pod="openstack/placement-5bf5d8b7d4-4gwxr"
Mar 13 14:21:14 crc kubenswrapper[4898]: I0313 14:21:14.039063 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/604a0205-6c18-4bff-929f-038524d62aeb-logs\") pod \"placement-5bf5d8b7d4-4gwxr\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " pod="openstack/placement-5bf5d8b7d4-4gwxr"
Mar 13 14:21:14 crc kubenswrapper[4898]: I0313 14:21:14.043800 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-config-data\") pod \"placement-5bf5d8b7d4-4gwxr\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " pod="openstack/placement-5bf5d8b7d4-4gwxr"
Mar 13 14:21:14 crc kubenswrapper[4898]: I0313 14:21:14.045008 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-scripts\") pod \"placement-5bf5d8b7d4-4gwxr\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " pod="openstack/placement-5bf5d8b7d4-4gwxr"
Mar 13 14:21:14 crc kubenswrapper[4898]: I0313 14:21:14.045488 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-internal-tls-certs\") pod \"placement-5bf5d8b7d4-4gwxr\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " pod="openstack/placement-5bf5d8b7d4-4gwxr"
Mar 13 14:21:14 crc kubenswrapper[4898]: I0313 14:21:14.045712 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-combined-ca-bundle\") pod \"placement-5bf5d8b7d4-4gwxr\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " pod="openstack/placement-5bf5d8b7d4-4gwxr"
Mar 13 14:21:14 crc kubenswrapper[4898]: I0313 14:21:14.056161 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lcxt\" (UniqueName: \"kubernetes.io/projected/604a0205-6c18-4bff-929f-038524d62aeb-kube-api-access-8lcxt\") pod \"placement-5bf5d8b7d4-4gwxr\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " pod="openstack/placement-5bf5d8b7d4-4gwxr"
Mar 13 14:21:14 crc kubenswrapper[4898]: I0313 14:21:14.057411 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-public-tls-certs\") pod \"placement-5bf5d8b7d4-4gwxr\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " pod="openstack/placement-5bf5d8b7d4-4gwxr"
Mar 13 14:21:14 crc kubenswrapper[4898]: I0313 14:21:14.173421 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5bf5d8b7d4-4gwxr"
Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.668675 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.669020 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.669035 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.669047 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.682036 4898 generic.go:334] "Generic (PLEG): container finished" podID="ac704482-c7a4-471c-b3c1-d1fdd7e0eb83" containerID="e195371c387c0ec69bbadee68addfd45715bbfc83433b2ba3c47b307af7325bd" exitCode=0
Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.682121 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xq6ss" event={"ID":"ac704482-c7a4-471c-b3c1-d1fdd7e0eb83","Type":"ContainerDied","Data":"e195371c387c0ec69bbadee68addfd45715bbfc83433b2ba3c47b307af7325bd"}
Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.684407 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ljct7" event={"ID":"0f68a4dd-fec8-4e60-a89c-69ce09fc5700","Type":"ContainerDied","Data":"b5807007919517093647e5507bfed140e507af6921560a0e8eff012955dab574"}
Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.684456 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5807007919517093647e5507bfed140e507af6921560a0e8eff012955dab574"
Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.703952 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.704457 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.704473 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.704486 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.736496 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.736970 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.776705 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.776929 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.928077 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ljct7"
Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.989435 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9s6v\" (UniqueName: \"kubernetes.io/projected/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-kube-api-access-j9s6v\") pod \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") "
Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.989557 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-combined-ca-bundle\") pod \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") "
Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.989645 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-scripts\") pod \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") "
Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.989716 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-credential-keys\") pod \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") "
Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.989747 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-fernet-keys\") pod \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") "
Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.989770 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-config-data\") pod \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\" (UID: \"0f68a4dd-fec8-4e60-a89c-69ce09fc5700\") "
Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.995341 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0f68a4dd-fec8-4e60-a89c-69ce09fc5700" (UID: "0f68a4dd-fec8-4e60-a89c-69ce09fc5700"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.997084 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "0f68a4dd-fec8-4e60-a89c-69ce09fc5700" (UID: "0f68a4dd-fec8-4e60-a89c-69ce09fc5700"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.997095 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-kube-api-access-j9s6v" (OuterVolumeSpecName: "kube-api-access-j9s6v") pod "0f68a4dd-fec8-4e60-a89c-69ce09fc5700" (UID: "0f68a4dd-fec8-4e60-a89c-69ce09fc5700"). InnerVolumeSpecName "kube-api-access-j9s6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:21:15 crc kubenswrapper[4898]: I0313 14:21:15.997111 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-scripts" (OuterVolumeSpecName: "scripts") pod "0f68a4dd-fec8-4e60-a89c-69ce09fc5700" (UID: "0f68a4dd-fec8-4e60-a89c-69ce09fc5700"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:21:16 crc kubenswrapper[4898]: I0313 14:21:16.067613 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f68a4dd-fec8-4e60-a89c-69ce09fc5700" (UID: "0f68a4dd-fec8-4e60-a89c-69ce09fc5700"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:21:16 crc kubenswrapper[4898]: I0313 14:21:16.078863 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-config-data" (OuterVolumeSpecName: "config-data") pod "0f68a4dd-fec8-4e60-a89c-69ce09fc5700" (UID: "0f68a4dd-fec8-4e60-a89c-69ce09fc5700"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:21:16 crc kubenswrapper[4898]: I0313 14:21:16.092069 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 14:21:16 crc kubenswrapper[4898]: I0313 14:21:16.092100 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9s6v\" (UniqueName: \"kubernetes.io/projected/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-kube-api-access-j9s6v\") on node \"crc\" DevicePath \"\""
Mar 13 14:21:16 crc kubenswrapper[4898]: I0313 14:21:16.092111 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 14:21:16 crc kubenswrapper[4898]: I0313 14:21:16.092120 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 14:21:16 crc kubenswrapper[4898]: I0313 14:21:16.092128 4898 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-credential-keys\") on node \"crc\" DevicePath \"\""
Mar 13 14:21:16 crc kubenswrapper[4898]: I0313 14:21:16.092136 4898 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0f68a4dd-fec8-4e60-a89c-69ce09fc5700-fernet-keys\") on node \"crc\" DevicePath \"\""
Mar 13 14:21:16 crc kubenswrapper[4898]: W0313 14:21:16.238942 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod604a0205_6c18_4bff_929f_038524d62aeb.slice/crio-762f4a73911e1feb65bcc4bfa54920b33c4f644904f274acd49cf5dac053c911 WatchSource:0}: Error finding container 762f4a73911e1feb65bcc4bfa54920b33c4f644904f274acd49cf5dac053c911: Status 404 returned error can't find the container with id 762f4a73911e1feb65bcc4bfa54920b33c4f644904f274acd49cf5dac053c911
Mar 13 14:21:16 crc kubenswrapper[4898]: I0313 14:21:16.245319 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5bf5d8b7d4-4gwxr"]
Mar 13 14:21:16 crc kubenswrapper[4898]: I0313 14:21:16.699262 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"247749ae-204b-4e9c-ad1c-f5d924b6f211","Type":"ContainerStarted","Data":"54431181e3eb3b359d0852277dd7b5d798e5ebd64616fc6928567613ce28f709"}
Mar 13 14:21:16 crc kubenswrapper[4898]: I0313 14:21:16.702061 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5bf5d8b7d4-4gwxr" event={"ID":"604a0205-6c18-4bff-929f-038524d62aeb","Type":"ContainerStarted","Data":"7d8e485964b16b478ba92c0abf89ef5c7fe78d1b940606e5a01f0ef264ccdb32"}
Mar 13 14:21:16 crc kubenswrapper[4898]: I0313 14:21:16.702201 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5bf5d8b7d4-4gwxr" event={"ID":"604a0205-6c18-4bff-929f-038524d62aeb","Type":"ContainerStarted","Data":"762f4a73911e1feb65bcc4bfa54920b33c4f644904f274acd49cf5dac053c911"}
Mar 13 14:21:16 crc kubenswrapper[4898]: I0313 14:21:16.705660 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7ldc" event={"ID":"b38f3681-6f2f-437f-9694-810d43921aa2","Type":"ContainerStarted","Data":"c5557297cdf8c10622794d499e8dd04fb952d7400a53b9ebeaf103b83d901e50"}
Mar 13 14:21:16 crc kubenswrapper[4898]: I0313 14:21:16.706142 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ljct7"
Mar 13 14:21:16 crc kubenswrapper[4898]: I0313 14:21:16.955057 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-7gvqf"
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.043427 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-fbs4f"]
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.043696 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" podUID="04642207-fab0-47bf-9ac4-030bbe91b4f0" containerName="dnsmasq-dns" containerID="cri-o://b87af33e145b04333fd2674b6cd7ae8ffea23dd169547cec058f2a13473c72c5" gracePeriod=10
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.145106 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-87574c74-kqmjb"]
Mar 13 14:21:17 crc kubenswrapper[4898]: E0313 14:21:17.145548 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f68a4dd-fec8-4e60-a89c-69ce09fc5700" containerName="keystone-bootstrap"
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.145560 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f68a4dd-fec8-4e60-a89c-69ce09fc5700" containerName="keystone-bootstrap"
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.145772 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f68a4dd-fec8-4e60-a89c-69ce09fc5700" containerName="keystone-bootstrap"
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.146474 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-87574c74-kqmjb"
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.152310 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.152366 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.152659 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-tdc5n"
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.152783 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.152883 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.153891 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-87574c74-kqmjb"]
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.154862 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.233315 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-credential-keys\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb"
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.233365 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-fernet-keys\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb"
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.233411 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-internal-tls-certs\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb"
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.233432 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-config-data\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb"
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.233471 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4m2t\" (UniqueName: \"kubernetes.io/projected/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-kube-api-access-l4m2t\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb"
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.233499 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-scripts\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb"
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.233515 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-public-tls-certs\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb"
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.233543 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-combined-ca-bundle\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb"
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.258883 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-xq6ss"
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.334429 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac704482-c7a4-471c-b3c1-d1fdd7e0eb83-combined-ca-bundle\") pod \"ac704482-c7a4-471c-b3c1-d1fdd7e0eb83\" (UID: \"ac704482-c7a4-471c-b3c1-d1fdd7e0eb83\") "
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.334605 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwfsv\" (UniqueName: \"kubernetes.io/projected/ac704482-c7a4-471c-b3c1-d1fdd7e0eb83-kube-api-access-cwfsv\") pod \"ac704482-c7a4-471c-b3c1-d1fdd7e0eb83\" (UID: \"ac704482-c7a4-471c-b3c1-d1fdd7e0eb83\") "
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.334736 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ac704482-c7a4-471c-b3c1-d1fdd7e0eb83-db-sync-config-data\") pod \"ac704482-c7a4-471c-b3c1-d1fdd7e0eb83\" (UID: \"ac704482-c7a4-471c-b3c1-d1fdd7e0eb83\") "
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.335022 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-credential-keys\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb"
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.335048 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-fernet-keys\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb"
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.335097 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-internal-tls-certs\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb"
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.335119 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-config-data\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb"
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.335160 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4m2t\" (UniqueName: \"kubernetes.io/projected/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-kube-api-access-l4m2t\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb"
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.335191 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-scripts\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb"
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.335209 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-public-tls-certs\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb"
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.335237 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-combined-ca-bundle\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb"
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.356735 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-public-tls-certs\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb"
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.366518 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4m2t\" (UniqueName: \"kubernetes.io/projected/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-kube-api-access-l4m2t\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb"
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.378492 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-scripts\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb"
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.379761 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-config-data\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb"
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.381359 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-fernet-keys\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb"
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.381610 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-combined-ca-bundle\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb"
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.382104 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac704482-c7a4-471c-b3c1-d1fdd7e0eb83-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ac704482-c7a4-471c-b3c1-d1fdd7e0eb83" (UID: "ac704482-c7a4-471c-b3c1-d1fdd7e0eb83"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.382279 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac704482-c7a4-471c-b3c1-d1fdd7e0eb83-kube-api-access-cwfsv" (OuterVolumeSpecName: "kube-api-access-cwfsv") pod "ac704482-c7a4-471c-b3c1-d1fdd7e0eb83" (UID: "ac704482-c7a4-471c-b3c1-d1fdd7e0eb83"). InnerVolumeSpecName "kube-api-access-cwfsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.386456 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-internal-tls-certs\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb"
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.390401 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d149c7e3-df46-44b5-8a66-8a0fbb5a8554-credential-keys\") pod \"keystone-87574c74-kqmjb\" (UID: \"d149c7e3-df46-44b5-8a66-8a0fbb5a8554\") " pod="openstack/keystone-87574c74-kqmjb"
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.437582 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwfsv\" (UniqueName: \"kubernetes.io/projected/ac704482-c7a4-471c-b3c1-d1fdd7e0eb83-kube-api-access-cwfsv\") on node \"crc\" DevicePath \"\""
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.437828 4898 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ac704482-c7a4-471c-b3c1-d1fdd7e0eb83-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.451093 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/ac704482-c7a4-471c-b3c1-d1fdd7e0eb83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ac704482-c7a4-471c-b3c1-d1fdd7e0eb83" (UID: "ac704482-c7a4-471c-b3c1-d1fdd7e0eb83"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.540276 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac704482-c7a4-471c-b3c1-d1fdd7e0eb83-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.571739 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-87574c74-kqmjb" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.720051 4898 generic.go:334] "Generic (PLEG): container finished" podID="b38f3681-6f2f-437f-9694-810d43921aa2" containerID="c5557297cdf8c10622794d499e8dd04fb952d7400a53b9ebeaf103b83d901e50" exitCode=0 Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.721669 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7ldc" event={"ID":"b38f3681-6f2f-437f-9694-810d43921aa2","Type":"ContainerDied","Data":"c5557297cdf8c10622794d499e8dd04fb952d7400a53b9ebeaf103b83d901e50"} Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.742241 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-xq6ss" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.761062 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xq6ss" event={"ID":"ac704482-c7a4-471c-b3c1-d1fdd7e0eb83","Type":"ContainerDied","Data":"bafefd0ee86b1967f73e5ee3d1256b9f0f1d84430cc94cf6628cfc6e827e8aad"} Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.761105 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bafefd0ee86b1967f73e5ee3d1256b9f0f1d84430cc94cf6628cfc6e827e8aad" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.785076 4898 generic.go:334] "Generic (PLEG): container finished" podID="04642207-fab0-47bf-9ac4-030bbe91b4f0" containerID="b87af33e145b04333fd2674b6cd7ae8ffea23dd169547cec058f2a13473c72c5" exitCode=0 Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.785167 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" event={"ID":"04642207-fab0-47bf-9ac4-030bbe91b4f0","Type":"ContainerDied","Data":"b87af33e145b04333fd2674b6cd7ae8ffea23dd169547cec058f2a13473c72c5"} Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.802838 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5bf5d8b7d4-4gwxr" event={"ID":"604a0205-6c18-4bff-929f-038524d62aeb","Type":"ContainerStarted","Data":"974d31c9fc27c07800ab40a1409496611af9fce07ca6cd7d36cd6abbd352b819"} Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.803467 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5bf5d8b7d4-4gwxr" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.803508 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5bf5d8b7d4-4gwxr" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.932280 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5bf5d8b7d4-4gwxr" 
podStartSLOduration=4.932248543 podStartE2EDuration="4.932248543s" podCreationTimestamp="2026-03-13 14:21:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:21:17.892769448 +0000 UTC m=+1512.894357697" watchObservedRunningTime="2026-03-13 14:21:17.932248543 +0000 UTC m=+1512.933836782" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.950485 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-795749dc8c-sm2hl"] Mar 13 14:21:17 crc kubenswrapper[4898]: E0313 14:21:17.950962 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac704482-c7a4-471c-b3c1-d1fdd7e0eb83" containerName="barbican-db-sync" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.950978 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac704482-c7a4-471c-b3c1-d1fdd7e0eb83" containerName="barbican-db-sync" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.951181 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac704482-c7a4-471c-b3c1-d1fdd7e0eb83" containerName="barbican-db-sync" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.954648 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-795749dc8c-sm2hl" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.959276 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.959499 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-rkpbr" Mar 13 14:21:17 crc kubenswrapper[4898]: I0313 14:21:17.964662 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.001009 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-795749dc8c-sm2hl"] Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.069385 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b16e588-d353-4100-b143-b84420c42e30-logs\") pod \"barbican-worker-795749dc8c-sm2hl\" (UID: \"8b16e588-d353-4100-b143-b84420c42e30\") " pod="openstack/barbican-worker-795749dc8c-sm2hl" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.069431 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b16e588-d353-4100-b143-b84420c42e30-combined-ca-bundle\") pod \"barbican-worker-795749dc8c-sm2hl\" (UID: \"8b16e588-d353-4100-b143-b84420c42e30\") " pod="openstack/barbican-worker-795749dc8c-sm2hl" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.069601 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b16e588-d353-4100-b143-b84420c42e30-config-data-custom\") pod \"barbican-worker-795749dc8c-sm2hl\" (UID: \"8b16e588-d353-4100-b143-b84420c42e30\") " pod="openstack/barbican-worker-795749dc8c-sm2hl" Mar 
13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.069625 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b16e588-d353-4100-b143-b84420c42e30-config-data\") pod \"barbican-worker-795749dc8c-sm2hl\" (UID: \"8b16e588-d353-4100-b143-b84420c42e30\") " pod="openstack/barbican-worker-795749dc8c-sm2hl" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.069647 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f995q\" (UniqueName: \"kubernetes.io/projected/8b16e588-d353-4100-b143-b84420c42e30-kube-api-access-f995q\") pod \"barbican-worker-795749dc8c-sm2hl\" (UID: \"8b16e588-d353-4100-b143-b84420c42e30\") " pod="openstack/barbican-worker-795749dc8c-sm2hl" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.090211 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-fcdc98bd8-xdl6x"] Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.092085 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-fcdc98bd8-xdl6x" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.095231 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.111091 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-fcdc98bd8-xdl6x"] Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.130943 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-8d6wc"] Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.133102 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.141135 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-8d6wc"] Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.172172 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk59q\" (UniqueName: \"kubernetes.io/projected/272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2-kube-api-access-gk59q\") pod \"barbican-keystone-listener-fcdc98bd8-xdl6x\" (UID: \"272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2\") " pod="openstack/barbican-keystone-listener-fcdc98bd8-xdl6x" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.172236 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b16e588-d353-4100-b143-b84420c42e30-logs\") pod \"barbican-worker-795749dc8c-sm2hl\" (UID: \"8b16e588-d353-4100-b143-b84420c42e30\") " pod="openstack/barbican-worker-795749dc8c-sm2hl" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.172263 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b16e588-d353-4100-b143-b84420c42e30-combined-ca-bundle\") pod \"barbican-worker-795749dc8c-sm2hl\" (UID: \"8b16e588-d353-4100-b143-b84420c42e30\") " pod="openstack/barbican-worker-795749dc8c-sm2hl" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.172278 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2-logs\") pod \"barbican-keystone-listener-fcdc98bd8-xdl6x\" (UID: \"272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2\") " pod="openstack/barbican-keystone-listener-fcdc98bd8-xdl6x" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.172299 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2-config-data-custom\") pod \"barbican-keystone-listener-fcdc98bd8-xdl6x\" (UID: \"272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2\") " pod="openstack/barbican-keystone-listener-fcdc98bd8-xdl6x" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.172315 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2-combined-ca-bundle\") pod \"barbican-keystone-listener-fcdc98bd8-xdl6x\" (UID: \"272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2\") " pod="openstack/barbican-keystone-listener-fcdc98bd8-xdl6x" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.172384 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2-config-data\") pod \"barbican-keystone-listener-fcdc98bd8-xdl6x\" (UID: \"272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2\") " pod="openstack/barbican-keystone-listener-fcdc98bd8-xdl6x" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.172462 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b16e588-d353-4100-b143-b84420c42e30-config-data-custom\") pod \"barbican-worker-795749dc8c-sm2hl\" (UID: \"8b16e588-d353-4100-b143-b84420c42e30\") " pod="openstack/barbican-worker-795749dc8c-sm2hl" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.172484 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b16e588-d353-4100-b143-b84420c42e30-config-data\") pod \"barbican-worker-795749dc8c-sm2hl\" (UID: \"8b16e588-d353-4100-b143-b84420c42e30\") " 
pod="openstack/barbican-worker-795749dc8c-sm2hl" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.172506 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f995q\" (UniqueName: \"kubernetes.io/projected/8b16e588-d353-4100-b143-b84420c42e30-kube-api-access-f995q\") pod \"barbican-worker-795749dc8c-sm2hl\" (UID: \"8b16e588-d353-4100-b143-b84420c42e30\") " pod="openstack/barbican-worker-795749dc8c-sm2hl" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.173215 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b16e588-d353-4100-b143-b84420c42e30-logs\") pod \"barbican-worker-795749dc8c-sm2hl\" (UID: \"8b16e588-d353-4100-b143-b84420c42e30\") " pod="openstack/barbican-worker-795749dc8c-sm2hl" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.181527 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b16e588-d353-4100-b143-b84420c42e30-combined-ca-bundle\") pod \"barbican-worker-795749dc8c-sm2hl\" (UID: \"8b16e588-d353-4100-b143-b84420c42e30\") " pod="openstack/barbican-worker-795749dc8c-sm2hl" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.189555 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8b16e588-d353-4100-b143-b84420c42e30-config-data-custom\") pod \"barbican-worker-795749dc8c-sm2hl\" (UID: \"8b16e588-d353-4100-b143-b84420c42e30\") " pod="openstack/barbican-worker-795749dc8c-sm2hl" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.203603 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b16e588-d353-4100-b143-b84420c42e30-config-data\") pod \"barbican-worker-795749dc8c-sm2hl\" (UID: \"8b16e588-d353-4100-b143-b84420c42e30\") " pod="openstack/barbican-worker-795749dc8c-sm2hl" 
Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.218481 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f995q\" (UniqueName: \"kubernetes.io/projected/8b16e588-d353-4100-b143-b84420c42e30-kube-api-access-f995q\") pod \"barbican-worker-795749dc8c-sm2hl\" (UID: \"8b16e588-d353-4100-b143-b84420c42e30\") " pod="openstack/barbican-worker-795749dc8c-sm2hl" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.276321 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-config\") pod \"dnsmasq-dns-85ff748b95-8d6wc\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.276440 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk59q\" (UniqueName: \"kubernetes.io/projected/272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2-kube-api-access-gk59q\") pod \"barbican-keystone-listener-fcdc98bd8-xdl6x\" (UID: \"272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2\") " pod="openstack/barbican-keystone-listener-fcdc98bd8-xdl6x" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.276513 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2-logs\") pod \"barbican-keystone-listener-fcdc98bd8-xdl6x\" (UID: \"272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2\") " pod="openstack/barbican-keystone-listener-fcdc98bd8-xdl6x" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.276532 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-dns-svc\") pod \"dnsmasq-dns-85ff748b95-8d6wc\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " 
pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.276559 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2-config-data-custom\") pod \"barbican-keystone-listener-fcdc98bd8-xdl6x\" (UID: \"272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2\") " pod="openstack/barbican-keystone-listener-fcdc98bd8-xdl6x" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.276575 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2-combined-ca-bundle\") pod \"barbican-keystone-listener-fcdc98bd8-xdl6x\" (UID: \"272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2\") " pod="openstack/barbican-keystone-listener-fcdc98bd8-xdl6x" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.276612 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmplv\" (UniqueName: \"kubernetes.io/projected/45af301d-29c9-474d-be0d-4d91f6d0cb18-kube-api-access-hmplv\") pod \"dnsmasq-dns-85ff748b95-8d6wc\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.276658 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-8d6wc\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.276680 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2-config-data\") pod 
\"barbican-keystone-listener-fcdc98bd8-xdl6x\" (UID: \"272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2\") " pod="openstack/barbican-keystone-listener-fcdc98bd8-xdl6x" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.276719 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-8d6wc\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.276745 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-8d6wc\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.276851 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2-logs\") pod \"barbican-keystone-listener-fcdc98bd8-xdl6x\" (UID: \"272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2\") " pod="openstack/barbican-keystone-listener-fcdc98bd8-xdl6x" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.282191 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-d74d977fd-v5m5s"] Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.286552 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.288449 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2-config-data-custom\") pod \"barbican-keystone-listener-fcdc98bd8-xdl6x\" (UID: \"272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2\") " pod="openstack/barbican-keystone-listener-fcdc98bd8-xdl6x" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.297960 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.300434 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-d74d977fd-v5m5s"] Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.312545 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-795749dc8c-sm2hl" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.321277 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2-combined-ca-bundle\") pod \"barbican-keystone-listener-fcdc98bd8-xdl6x\" (UID: \"272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2\") " pod="openstack/barbican-keystone-listener-fcdc98bd8-xdl6x" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.321749 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2-config-data\") pod \"barbican-keystone-listener-fcdc98bd8-xdl6x\" (UID: \"272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2\") " pod="openstack/barbican-keystone-listener-fcdc98bd8-xdl6x" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.334748 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk59q\" (UniqueName: 
\"kubernetes.io/projected/272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2-kube-api-access-gk59q\") pod \"barbican-keystone-listener-fcdc98bd8-xdl6x\" (UID: \"272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2\") " pod="openstack/barbican-keystone-listener-fcdc98bd8-xdl6x" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.383932 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-8d6wc\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.383994 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-8d6wc\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.384063 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7dcea9de-db8a-42dd-958c-59df43a49ff3-config-data-custom\") pod \"barbican-api-d74d977fd-v5m5s\" (UID: \"7dcea9de-db8a-42dd-958c-59df43a49ff3\") " pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.384086 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-config\") pod \"dnsmasq-dns-85ff748b95-8d6wc\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.384107 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dcea9de-db8a-42dd-958c-59df43a49ff3-combined-ca-bundle\") pod \"barbican-api-d74d977fd-v5m5s\" (UID: \"7dcea9de-db8a-42dd-958c-59df43a49ff3\") " pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.384173 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-dns-svc\") pod \"dnsmasq-dns-85ff748b95-8d6wc\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.384205 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmplv\" (UniqueName: \"kubernetes.io/projected/45af301d-29c9-474d-be0d-4d91f6d0cb18-kube-api-access-hmplv\") pod \"dnsmasq-dns-85ff748b95-8d6wc\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.384250 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-8d6wc\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.384269 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dcea9de-db8a-42dd-958c-59df43a49ff3-config-data\") pod \"barbican-api-d74d977fd-v5m5s\" (UID: \"7dcea9de-db8a-42dd-958c-59df43a49ff3\") " pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.384288 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-vt6km\" (UniqueName: \"kubernetes.io/projected/7dcea9de-db8a-42dd-958c-59df43a49ff3-kube-api-access-vt6km\") pod \"barbican-api-d74d977fd-v5m5s\" (UID: \"7dcea9de-db8a-42dd-958c-59df43a49ff3\") " pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.384321 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7dcea9de-db8a-42dd-958c-59df43a49ff3-logs\") pod \"barbican-api-d74d977fd-v5m5s\" (UID: \"7dcea9de-db8a-42dd-958c-59df43a49ff3\") " pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.385289 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-config\") pod \"dnsmasq-dns-85ff748b95-8d6wc\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.385562 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-8d6wc\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.385859 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-dns-svc\") pod \"dnsmasq-dns-85ff748b95-8d6wc\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.386424 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-8d6wc\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.388253 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-8d6wc\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.410566 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmplv\" (UniqueName: \"kubernetes.io/projected/45af301d-29c9-474d-be0d-4d91f6d0cb18-kube-api-access-hmplv\") pod \"dnsmasq-dns-85ff748b95-8d6wc\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.425277 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-87574c74-kqmjb"] Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.433783 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-fcdc98bd8-xdl6x" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.467748 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.488389 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dcea9de-db8a-42dd-958c-59df43a49ff3-config-data\") pod \"barbican-api-d74d977fd-v5m5s\" (UID: \"7dcea9de-db8a-42dd-958c-59df43a49ff3\") " pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.488443 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt6km\" (UniqueName: \"kubernetes.io/projected/7dcea9de-db8a-42dd-958c-59df43a49ff3-kube-api-access-vt6km\") pod \"barbican-api-d74d977fd-v5m5s\" (UID: \"7dcea9de-db8a-42dd-958c-59df43a49ff3\") " pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.488585 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7dcea9de-db8a-42dd-958c-59df43a49ff3-logs\") pod \"barbican-api-d74d977fd-v5m5s\" (UID: \"7dcea9de-db8a-42dd-958c-59df43a49ff3\") " pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.488825 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7dcea9de-db8a-42dd-958c-59df43a49ff3-config-data-custom\") pod \"barbican-api-d74d977fd-v5m5s\" (UID: \"7dcea9de-db8a-42dd-958c-59df43a49ff3\") " pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.489030 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dcea9de-db8a-42dd-958c-59df43a49ff3-combined-ca-bundle\") pod \"barbican-api-d74d977fd-v5m5s\" (UID: \"7dcea9de-db8a-42dd-958c-59df43a49ff3\") " 
pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.489321 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7dcea9de-db8a-42dd-958c-59df43a49ff3-logs\") pod \"barbican-api-d74d977fd-v5m5s\" (UID: \"7dcea9de-db8a-42dd-958c-59df43a49ff3\") " pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.493141 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dcea9de-db8a-42dd-958c-59df43a49ff3-config-data\") pod \"barbican-api-d74d977fd-v5m5s\" (UID: \"7dcea9de-db8a-42dd-958c-59df43a49ff3\") " pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.494691 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dcea9de-db8a-42dd-958c-59df43a49ff3-combined-ca-bundle\") pod \"barbican-api-d74d977fd-v5m5s\" (UID: \"7dcea9de-db8a-42dd-958c-59df43a49ff3\") " pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.495181 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7dcea9de-db8a-42dd-958c-59df43a49ff3-config-data-custom\") pod \"barbican-api-d74d977fd-v5m5s\" (UID: \"7dcea9de-db8a-42dd-958c-59df43a49ff3\") " pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.506693 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt6km\" (UniqueName: \"kubernetes.io/projected/7dcea9de-db8a-42dd-958c-59df43a49ff3-kube-api-access-vt6km\") pod \"barbican-api-d74d977fd-v5m5s\" (UID: \"7dcea9de-db8a-42dd-958c-59df43a49ff3\") " pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 
14:21:18.514520 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.590068 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-dns-svc\") pod \"04642207-fab0-47bf-9ac4-030bbe91b4f0\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.590286 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqg7r\" (UniqueName: \"kubernetes.io/projected/04642207-fab0-47bf-9ac4-030bbe91b4f0-kube-api-access-cqg7r\") pod \"04642207-fab0-47bf-9ac4-030bbe91b4f0\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.590334 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-ovsdbserver-sb\") pod \"04642207-fab0-47bf-9ac4-030bbe91b4f0\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.590383 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-dns-swift-storage-0\") pod \"04642207-fab0-47bf-9ac4-030bbe91b4f0\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.590433 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-ovsdbserver-nb\") pod \"04642207-fab0-47bf-9ac4-030bbe91b4f0\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.590480 4898 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-config\") pod \"04642207-fab0-47bf-9ac4-030bbe91b4f0\" (UID: \"04642207-fab0-47bf-9ac4-030bbe91b4f0\") " Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.603102 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04642207-fab0-47bf-9ac4-030bbe91b4f0-kube-api-access-cqg7r" (OuterVolumeSpecName: "kube-api-access-cqg7r") pod "04642207-fab0-47bf-9ac4-030bbe91b4f0" (UID: "04642207-fab0-47bf-9ac4-030bbe91b4f0"). InnerVolumeSpecName "kube-api-access-cqg7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.693340 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqg7r\" (UniqueName: \"kubernetes.io/projected/04642207-fab0-47bf-9ac4-030bbe91b4f0-kube-api-access-cqg7r\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.702043 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "04642207-fab0-47bf-9ac4-030bbe91b4f0" (UID: "04642207-fab0-47bf-9ac4-030bbe91b4f0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.703373 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.716313 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "04642207-fab0-47bf-9ac4-030bbe91b4f0" (UID: "04642207-fab0-47bf-9ac4-030bbe91b4f0"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.764965 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "04642207-fab0-47bf-9ac4-030bbe91b4f0" (UID: "04642207-fab0-47bf-9ac4-030bbe91b4f0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.783888 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "04642207-fab0-47bf-9ac4-030bbe91b4f0" (UID: "04642207-fab0-47bf-9ac4-030bbe91b4f0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.795386 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.795416 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.795428 4898 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.795437 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.809403 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-config" (OuterVolumeSpecName: "config") pod "04642207-fab0-47bf-9ac4-030bbe91b4f0" (UID: "04642207-fab0-47bf-9ac4-030bbe91b4f0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.848108 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" event={"ID":"04642207-fab0-47bf-9ac4-030bbe91b4f0","Type":"ContainerDied","Data":"1f8aa6f56c769252ca6f9fa29c34832b03b0bd31e4320e8506b18e27d01c86a7"} Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.848173 4898 scope.go:117] "RemoveContainer" containerID="b87af33e145b04333fd2674b6cd7ae8ffea23dd169547cec058f2a13473c72c5" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.848339 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-fbs4f" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.860560 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-87574c74-kqmjb" event={"ID":"d149c7e3-df46-44b5-8a66-8a0fbb5a8554","Type":"ContainerStarted","Data":"324d55849ecdcac9eb7e6f876adb4165c9370aa4cdec80c831e30064c20235df"} Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.898639 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04642207-fab0-47bf-9ac4-030bbe91b4f0-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.933352 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-795749dc8c-sm2hl"] Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.946101 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-fbs4f"] Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.956196 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-fbs4f"] Mar 13 14:21:18 crc kubenswrapper[4898]: I0313 14:21:18.962427 4898 scope.go:117] "RemoveContainer" containerID="a566cd3c2104d4cf2d4de94c8ef5556830e27cbf9385397fa0c4b4a48c1b947c" Mar 13 14:21:19 crc kubenswrapper[4898]: I0313 14:21:19.134475 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:21:19 crc kubenswrapper[4898]: I0313 14:21:19.134738 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:21:19 crc kubenswrapper[4898]: I0313 14:21:19.246690 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-fcdc98bd8-xdl6x"] Mar 13 14:21:19 crc kubenswrapper[4898]: W0313 14:21:19.304793 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod272aa2e8_f1ed_4a08_b5a3_aecd06c4c6d2.slice/crio-c487f36110bc4a44edbb608c4532dc8ca0f578132086cb593f98d555564c06a7 WatchSource:0}: Error finding container c487f36110bc4a44edbb608c4532dc8ca0f578132086cb593f98d555564c06a7: Status 404 returned error can't find the container with id c487f36110bc4a44edbb608c4532dc8ca0f578132086cb593f98d555564c06a7 Mar 13 14:21:19 crc kubenswrapper[4898]: I0313 14:21:19.340750 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-8d6wc"] Mar 13 14:21:19 crc kubenswrapper[4898]: W0313 14:21:19.378151 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45af301d_29c9_474d_be0d_4d91f6d0cb18.slice/crio-54b39114a6c35a297bd4da40e16643230d03c158b2784fe677b7a7dcda81e6ec WatchSource:0}: Error finding container 54b39114a6c35a297bd4da40e16643230d03c158b2784fe677b7a7dcda81e6ec: Status 404 returned error can't find the container with id 54b39114a6c35a297bd4da40e16643230d03c158b2784fe677b7a7dcda81e6ec Mar 13 14:21:19 crc kubenswrapper[4898]: I0313 14:21:19.590479 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-d74d977fd-v5m5s"] Mar 13 14:21:19 crc kubenswrapper[4898]: W0313 14:21:19.605744 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7dcea9de_db8a_42dd_958c_59df43a49ff3.slice/crio-704f9e3ccfefb3b8a00bd2333c1274e405dc0d43a2c31a48365c27cab56dfc29 WatchSource:0}: Error finding container 
704f9e3ccfefb3b8a00bd2333c1274e405dc0d43a2c31a48365c27cab56dfc29: Status 404 returned error can't find the container with id 704f9e3ccfefb3b8a00bd2333c1274e405dc0d43a2c31a48365c27cab56dfc29 Mar 13 14:21:19 crc kubenswrapper[4898]: I0313 14:21:19.761729 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04642207-fab0-47bf-9ac4-030bbe91b4f0" path="/var/lib/kubelet/pods/04642207-fab0-47bf-9ac4-030bbe91b4f0/volumes" Mar 13 14:21:19 crc kubenswrapper[4898]: I0313 14:21:19.888609 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-795749dc8c-sm2hl" event={"ID":"8b16e588-d353-4100-b143-b84420c42e30","Type":"ContainerStarted","Data":"84f09917c0d799d211cb85a17e721fb1eb7ed97fe7a2778a5684cc54c1a89bb8"} Mar 13 14:21:19 crc kubenswrapper[4898]: I0313 14:21:19.897742 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d74d977fd-v5m5s" event={"ID":"7dcea9de-db8a-42dd-958c-59df43a49ff3","Type":"ContainerStarted","Data":"704f9e3ccfefb3b8a00bd2333c1274e405dc0d43a2c31a48365c27cab56dfc29"} Mar 13 14:21:19 crc kubenswrapper[4898]: I0313 14:21:19.900133 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" event={"ID":"45af301d-29c9-474d-be0d-4d91f6d0cb18","Type":"ContainerStarted","Data":"54b39114a6c35a297bd4da40e16643230d03c158b2784fe677b7a7dcda81e6ec"} Mar 13 14:21:19 crc kubenswrapper[4898]: I0313 14:21:19.903945 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-fcdc98bd8-xdl6x" event={"ID":"272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2","Type":"ContainerStarted","Data":"c487f36110bc4a44edbb608c4532dc8ca0f578132086cb593f98d555564c06a7"} Mar 13 14:21:19 crc kubenswrapper[4898]: I0313 14:21:19.908210 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-87574c74-kqmjb" 
event={"ID":"d149c7e3-df46-44b5-8a66-8a0fbb5a8554","Type":"ContainerStarted","Data":"f93b65d81f70cb8f8e5f58a2832c1ab7b3a3ee434167e6ca4438f287c2a4218e"} Mar 13 14:21:19 crc kubenswrapper[4898]: I0313 14:21:19.908503 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-87574c74-kqmjb" Mar 13 14:21:19 crc kubenswrapper[4898]: I0313 14:21:19.936236 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-87574c74-kqmjb" podStartSLOduration=2.936214345 podStartE2EDuration="2.936214345s" podCreationTimestamp="2026-03-13 14:21:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:21:19.925188019 +0000 UTC m=+1514.926776258" watchObservedRunningTime="2026-03-13 14:21:19.936214345 +0000 UTC m=+1514.937802584" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.168060 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-b9dc95d4b-bvhlz"] Mar 13 14:21:21 crc kubenswrapper[4898]: E0313 14:21:21.169307 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04642207-fab0-47bf-9ac4-030bbe91b4f0" containerName="init" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.169329 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="04642207-fab0-47bf-9ac4-030bbe91b4f0" containerName="init" Mar 13 14:21:21 crc kubenswrapper[4898]: E0313 14:21:21.169385 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04642207-fab0-47bf-9ac4-030bbe91b4f0" containerName="dnsmasq-dns" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.169393 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="04642207-fab0-47bf-9ac4-030bbe91b4f0" containerName="dnsmasq-dns" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.169649 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="04642207-fab0-47bf-9ac4-030bbe91b4f0" 
containerName="dnsmasq-dns" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.171361 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.173882 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.174987 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.194069 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-b9dc95d4b-bvhlz"] Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.268346 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd4bf680-c8b7-4721-9595-9a8ed40410d2-logs\") pod \"barbican-api-b9dc95d4b-bvhlz\" (UID: \"fd4bf680-c8b7-4721-9595-9a8ed40410d2\") " pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.268404 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hn49\" (UniqueName: \"kubernetes.io/projected/fd4bf680-c8b7-4721-9595-9a8ed40410d2-kube-api-access-5hn49\") pod \"barbican-api-b9dc95d4b-bvhlz\" (UID: \"fd4bf680-c8b7-4721-9595-9a8ed40410d2\") " pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.268464 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd4bf680-c8b7-4721-9595-9a8ed40410d2-internal-tls-certs\") pod \"barbican-api-b9dc95d4b-bvhlz\" (UID: \"fd4bf680-c8b7-4721-9595-9a8ed40410d2\") " pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 
14:21:21.268495 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd4bf680-c8b7-4721-9595-9a8ed40410d2-public-tls-certs\") pod \"barbican-api-b9dc95d4b-bvhlz\" (UID: \"fd4bf680-c8b7-4721-9595-9a8ed40410d2\") " pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.268570 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd4bf680-c8b7-4721-9595-9a8ed40410d2-config-data\") pod \"barbican-api-b9dc95d4b-bvhlz\" (UID: \"fd4bf680-c8b7-4721-9595-9a8ed40410d2\") " pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.268610 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd4bf680-c8b7-4721-9595-9a8ed40410d2-config-data-custom\") pod \"barbican-api-b9dc95d4b-bvhlz\" (UID: \"fd4bf680-c8b7-4721-9595-9a8ed40410d2\") " pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.268748 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd4bf680-c8b7-4721-9595-9a8ed40410d2-combined-ca-bundle\") pod \"barbican-api-b9dc95d4b-bvhlz\" (UID: \"fd4bf680-c8b7-4721-9595-9a8ed40410d2\") " pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.370590 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hn49\" (UniqueName: \"kubernetes.io/projected/fd4bf680-c8b7-4721-9595-9a8ed40410d2-kube-api-access-5hn49\") pod \"barbican-api-b9dc95d4b-bvhlz\" (UID: \"fd4bf680-c8b7-4721-9595-9a8ed40410d2\") " pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 
14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.370646 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd4bf680-c8b7-4721-9595-9a8ed40410d2-internal-tls-certs\") pod \"barbican-api-b9dc95d4b-bvhlz\" (UID: \"fd4bf680-c8b7-4721-9595-9a8ed40410d2\") " pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.370673 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd4bf680-c8b7-4721-9595-9a8ed40410d2-public-tls-certs\") pod \"barbican-api-b9dc95d4b-bvhlz\" (UID: \"fd4bf680-c8b7-4721-9595-9a8ed40410d2\") " pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.370752 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd4bf680-c8b7-4721-9595-9a8ed40410d2-config-data\") pod \"barbican-api-b9dc95d4b-bvhlz\" (UID: \"fd4bf680-c8b7-4721-9595-9a8ed40410d2\") " pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.370814 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd4bf680-c8b7-4721-9595-9a8ed40410d2-config-data-custom\") pod \"barbican-api-b9dc95d4b-bvhlz\" (UID: \"fd4bf680-c8b7-4721-9595-9a8ed40410d2\") " pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.370947 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd4bf680-c8b7-4721-9595-9a8ed40410d2-combined-ca-bundle\") pod \"barbican-api-b9dc95d4b-bvhlz\" (UID: \"fd4bf680-c8b7-4721-9595-9a8ed40410d2\") " pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 
14:21:21.371046 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd4bf680-c8b7-4721-9595-9a8ed40410d2-logs\") pod \"barbican-api-b9dc95d4b-bvhlz\" (UID: \"fd4bf680-c8b7-4721-9595-9a8ed40410d2\") " pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.371635 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd4bf680-c8b7-4721-9595-9a8ed40410d2-logs\") pod \"barbican-api-b9dc95d4b-bvhlz\" (UID: \"fd4bf680-c8b7-4721-9595-9a8ed40410d2\") " pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.377300 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd4bf680-c8b7-4721-9595-9a8ed40410d2-combined-ca-bundle\") pod \"barbican-api-b9dc95d4b-bvhlz\" (UID: \"fd4bf680-c8b7-4721-9595-9a8ed40410d2\") " pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.377600 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd4bf680-c8b7-4721-9595-9a8ed40410d2-public-tls-certs\") pod \"barbican-api-b9dc95d4b-bvhlz\" (UID: \"fd4bf680-c8b7-4721-9595-9a8ed40410d2\") " pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.377634 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd4bf680-c8b7-4721-9595-9a8ed40410d2-config-data\") pod \"barbican-api-b9dc95d4b-bvhlz\" (UID: \"fd4bf680-c8b7-4721-9595-9a8ed40410d2\") " pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.378047 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fd4bf680-c8b7-4721-9595-9a8ed40410d2-internal-tls-certs\") pod \"barbican-api-b9dc95d4b-bvhlz\" (UID: \"fd4bf680-c8b7-4721-9595-9a8ed40410d2\") " pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.379798 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd4bf680-c8b7-4721-9595-9a8ed40410d2-config-data-custom\") pod \"barbican-api-b9dc95d4b-bvhlz\" (UID: \"fd4bf680-c8b7-4721-9595-9a8ed40410d2\") " pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.393278 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hn49\" (UniqueName: \"kubernetes.io/projected/fd4bf680-c8b7-4721-9595-9a8ed40410d2-kube-api-access-5hn49\") pod \"barbican-api-b9dc95d4b-bvhlz\" (UID: \"fd4bf680-c8b7-4721-9595-9a8ed40410d2\") " pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:21 crc kubenswrapper[4898]: I0313 14:21:21.494250 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:22 crc kubenswrapper[4898]: I0313 14:21:22.962051 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-b9dc95d4b-bvhlz"] Mar 13 14:21:22 crc kubenswrapper[4898]: I0313 14:21:22.967181 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d74d977fd-v5m5s" event={"ID":"7dcea9de-db8a-42dd-958c-59df43a49ff3","Type":"ContainerStarted","Data":"8d6bd5023b5a1087811735b70ac1c3323bdc9e802f224e7ede20161093a84221"} Mar 13 14:21:22 crc kubenswrapper[4898]: I0313 14:21:22.969235 4898 generic.go:334] "Generic (PLEG): container finished" podID="45af301d-29c9-474d-be0d-4d91f6d0cb18" containerID="686c9d5260a554140660fb899d995f27d4b2bd420d76c710ada3057e3122cfaf" exitCode=0 Mar 13 14:21:22 crc kubenswrapper[4898]: I0313 14:21:22.969287 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" event={"ID":"45af301d-29c9-474d-be0d-4d91f6d0cb18","Type":"ContainerDied","Data":"686c9d5260a554140660fb899d995f27d4b2bd420d76c710ada3057e3122cfaf"} Mar 13 14:21:25 crc kubenswrapper[4898]: I0313 14:21:25.231827 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 13 14:21:25 crc kubenswrapper[4898]: I0313 14:21:25.234805 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 13 14:21:25 crc kubenswrapper[4898]: I0313 14:21:25.243251 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 13 14:21:25 crc kubenswrapper[4898]: I0313 14:21:25.255471 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 13 14:21:28 crc kubenswrapper[4898]: I0313 14:21:28.036872 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-z7ldc" event={"ID":"b38f3681-6f2f-437f-9694-810d43921aa2","Type":"ContainerStarted","Data":"3cb8ede7b2e1e9a6a6b4976b023c84903fe921e5d4e530d62aef69fe59b03a0a"} Mar 13 14:21:28 crc kubenswrapper[4898]: I0313 14:21:28.067941 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-z7ldc" podStartSLOduration=8.270604717 podStartE2EDuration="19.06792003s" podCreationTimestamp="2026-03-13 14:21:09 +0000 UTC" firstStartedPulling="2026-03-13 14:21:11.620929625 +0000 UTC m=+1506.622517864" lastFinishedPulling="2026-03-13 14:21:22.418244948 +0000 UTC m=+1517.419833177" observedRunningTime="2026-03-13 14:21:28.0636582 +0000 UTC m=+1523.065246459" watchObservedRunningTime="2026-03-13 14:21:28.06792003 +0000 UTC m=+1523.069508269" Mar 13 14:21:28 crc kubenswrapper[4898]: W0313 14:21:28.640954 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd4bf680_c8b7_4721_9595_9a8ed40410d2.slice/crio-19f1a84b67106ffd000b350d027a716a224702ac470d9c975f0fe16d4891e019 WatchSource:0}: Error finding container 19f1a84b67106ffd000b350d027a716a224702ac470d9c975f0fe16d4891e019: Status 404 returned error can't find the container with id 19f1a84b67106ffd000b350d027a716a224702ac470d9c975f0fe16d4891e019 Mar 13 14:21:29 crc kubenswrapper[4898]: I0313 14:21:29.053226 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b9dc95d4b-bvhlz" event={"ID":"fd4bf680-c8b7-4721-9595-9a8ed40410d2","Type":"ContainerStarted","Data":"19f1a84b67106ffd000b350d027a716a224702ac470d9c975f0fe16d4891e019"} Mar 13 14:21:29 crc kubenswrapper[4898]: I0313 14:21:29.522067 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-z7ldc" Mar 13 14:21:29 crc kubenswrapper[4898]: I0313 14:21:29.522426 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-operators-z7ldc" Mar 13 14:21:30 crc kubenswrapper[4898]: I0313 14:21:30.068915 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d74d977fd-v5m5s" event={"ID":"7dcea9de-db8a-42dd-958c-59df43a49ff3","Type":"ContainerStarted","Data":"e2a10450307b6355906b72fd4b0a882c5720f8e92bcad91beb1384ffe656972d"} Mar 13 14:21:30 crc kubenswrapper[4898]: I0313 14:21:30.098455 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-d74d977fd-v5m5s" podStartSLOduration=12.098435932 podStartE2EDuration="12.098435932s" podCreationTimestamp="2026-03-13 14:21:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:21:30.093451213 +0000 UTC m=+1525.095039462" watchObservedRunningTime="2026-03-13 14:21:30.098435932 +0000 UTC m=+1525.100024171" Mar 13 14:21:30 crc kubenswrapper[4898]: E0313 14:21:30.449418 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="247749ae-204b-4e9c-ad1c-f5d924b6f211" Mar 13 14:21:30 crc kubenswrapper[4898]: I0313 14:21:30.598882 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-z7ldc" podUID="b38f3681-6f2f-437f-9694-810d43921aa2" containerName="registry-server" probeResult="failure" output=< Mar 13 14:21:30 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 14:21:30 crc kubenswrapper[4898]: > Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.082989 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b9dc95d4b-bvhlz" 
event={"ID":"fd4bf680-c8b7-4721-9595-9a8ed40410d2","Type":"ContainerStarted","Data":"c626c10cd8100c4d3677f5d89f5e1aa068944a1acb09f9ce367478b3bbe23a6d"} Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.083529 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b9dc95d4b-bvhlz" event={"ID":"fd4bf680-c8b7-4721-9595-9a8ed40410d2","Type":"ContainerStarted","Data":"a7d90cf8e037e0421e0a6825fa627d8295d1c48cda92e5ed34783a1fd67ad1e5"} Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.086015 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.086054 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.089344 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-795749dc8c-sm2hl" event={"ID":"8b16e588-d353-4100-b143-b84420c42e30","Type":"ContainerStarted","Data":"d22f944ac54c3ae0fe8ec3e0ace87927fe4fb2405e1116807ca8c64712a38e97"} Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.089423 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-795749dc8c-sm2hl" event={"ID":"8b16e588-d353-4100-b143-b84420c42e30","Type":"ContainerStarted","Data":"e2eac0cc0a49aecc401f83d678bdc6912eb6f763fddf368e17e836b67cdbc37a"} Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.092994 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" event={"ID":"45af301d-29c9-474d-be0d-4d91f6d0cb18","Type":"ContainerStarted","Data":"6584acdbfa3b269b10be5eacbee652dc5b87853d5dd4647683e70850466d55d5"} Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.093168 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 
14:21:31.096069 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-zgt75" event={"ID":"84a7fd24-4320-4c0e-8ded-0d455252a549","Type":"ContainerStarted","Data":"213db1fb491a2ed6d8dc3d15759978456b725ffef29e1be38661bf279db1daf8"} Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.098829 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-ztp6c" event={"ID":"193b05da-acb9-4512-a2ae-6c03450e6f05","Type":"ContainerStarted","Data":"c911be7c2d6d8f32598481ced3a29ce9fc65efe653b0696b937918e79b814d51"} Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.101543 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-fcdc98bd8-xdl6x" event={"ID":"272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2","Type":"ContainerStarted","Data":"7832e82576d1106230e9821b7a44384ec0152369113aeae90ea86d232dae5b4e"} Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.101579 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-fcdc98bd8-xdl6x" event={"ID":"272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2","Type":"ContainerStarted","Data":"5a677d66116ea7f14947dff0fd6386945680544d68635c360dac0097cbf6589a"} Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.105666 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"247749ae-204b-4e9c-ad1c-f5d924b6f211","Type":"ContainerStarted","Data":"8699a974ed046c0e546ec06ad74b2baeebb668b44968353911ae1775a60f7c87"} Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.105707 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.105735 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="247749ae-204b-4e9c-ad1c-f5d924b6f211" containerName="ceilometer-notification-agent" 
containerID="cri-o://9dee453ab58c346884f20f75be928701596b9ae0bdbbb025979e1e2daaa907c1" gracePeriod=30 Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.105822 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="247749ae-204b-4e9c-ad1c-f5d924b6f211" containerName="proxy-httpd" containerID="cri-o://8699a974ed046c0e546ec06ad74b2baeebb668b44968353911ae1775a60f7c87" gracePeriod=30 Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.105762 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.105971 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="247749ae-204b-4e9c-ad1c-f5d924b6f211" containerName="sg-core" containerID="cri-o://54431181e3eb3b359d0852277dd7b5d798e5ebd64616fc6928567613ce28f709" gracePeriod=30 Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.106005 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.132368 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-b9dc95d4b-bvhlz" podStartSLOduration=10.132339581 podStartE2EDuration="10.132339581s" podCreationTimestamp="2026-03-13 14:21:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:21:31.11151777 +0000 UTC m=+1526.113106009" watchObservedRunningTime="2026-03-13 14:21:31.132339581 +0000 UTC m=+1526.133927820" Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.156961 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-ztp6c" podStartSLOduration=3.583169798 podStartE2EDuration="56.156925119s" podCreationTimestamp="2026-03-13 14:20:35 +0000 UTC" 
firstStartedPulling="2026-03-13 14:20:37.31855512 +0000 UTC m=+1472.320143369" lastFinishedPulling="2026-03-13 14:21:29.892310441 +0000 UTC m=+1524.893898690" observedRunningTime="2026-03-13 14:21:31.144265 +0000 UTC m=+1526.145853239" watchObservedRunningTime="2026-03-13 14:21:31.156925119 +0000 UTC m=+1526.158513348" Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.178761 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" podStartSLOduration=13.178735275 podStartE2EDuration="13.178735275s" podCreationTimestamp="2026-03-13 14:21:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:21:31.166418566 +0000 UTC m=+1526.168006805" watchObservedRunningTime="2026-03-13 14:21:31.178735275 +0000 UTC m=+1526.180323514" Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.202634 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-zgt75" podStartSLOduration=3.215469283 podStartE2EDuration="56.202609775s" podCreationTimestamp="2026-03-13 14:20:35 +0000 UTC" firstStartedPulling="2026-03-13 14:20:37.161878893 +0000 UTC m=+1472.163467132" lastFinishedPulling="2026-03-13 14:21:30.149019385 +0000 UTC m=+1525.150607624" observedRunningTime="2026-03-13 14:21:31.192706418 +0000 UTC m=+1526.194294677" watchObservedRunningTime="2026-03-13 14:21:31.202609775 +0000 UTC m=+1526.204198014" Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.224095 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-795749dc8c-sm2hl" podStartSLOduration=3.313369985 podStartE2EDuration="14.224066532s" podCreationTimestamp="2026-03-13 14:21:17 +0000 UTC" firstStartedPulling="2026-03-13 14:21:19.001986453 +0000 UTC m=+1514.003574692" lastFinishedPulling="2026-03-13 14:21:29.912683 +0000 UTC m=+1524.914271239" 
observedRunningTime="2026-03-13 14:21:31.214386751 +0000 UTC m=+1526.215974990" watchObservedRunningTime="2026-03-13 14:21:31.224066532 +0000 UTC m=+1526.225654771" Mar 13 14:21:31 crc kubenswrapper[4898]: I0313 14:21:31.290939 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-fcdc98bd8-xdl6x" podStartSLOduration=2.452733054 podStartE2EDuration="13.290850086s" podCreationTimestamp="2026-03-13 14:21:18 +0000 UTC" firstStartedPulling="2026-03-13 14:21:19.32003864 +0000 UTC m=+1514.321626879" lastFinishedPulling="2026-03-13 14:21:30.158155672 +0000 UTC m=+1525.159743911" observedRunningTime="2026-03-13 14:21:31.282721595 +0000 UTC m=+1526.284309834" watchObservedRunningTime="2026-03-13 14:21:31.290850086 +0000 UTC m=+1526.292438345" Mar 13 14:21:32 crc kubenswrapper[4898]: I0313 14:21:32.119945 4898 generic.go:334] "Generic (PLEG): container finished" podID="247749ae-204b-4e9c-ad1c-f5d924b6f211" containerID="8699a974ed046c0e546ec06ad74b2baeebb668b44968353911ae1775a60f7c87" exitCode=0 Mar 13 14:21:32 crc kubenswrapper[4898]: I0313 14:21:32.119988 4898 generic.go:334] "Generic (PLEG): container finished" podID="247749ae-204b-4e9c-ad1c-f5d924b6f211" containerID="54431181e3eb3b359d0852277dd7b5d798e5ebd64616fc6928567613ce28f709" exitCode=2 Mar 13 14:21:32 crc kubenswrapper[4898]: I0313 14:21:32.119999 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"247749ae-204b-4e9c-ad1c-f5d924b6f211","Type":"ContainerDied","Data":"8699a974ed046c0e546ec06ad74b2baeebb668b44968353911ae1775a60f7c87"} Mar 13 14:21:32 crc kubenswrapper[4898]: I0313 14:21:32.120072 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"247749ae-204b-4e9c-ad1c-f5d924b6f211","Type":"ContainerDied","Data":"54431181e3eb3b359d0852277dd7b5d798e5ebd64616fc6928567613ce28f709"} Mar 13 14:21:33 crc kubenswrapper[4898]: I0313 14:21:33.134907 4898 prober_manager.go:312] 
"Failed to trigger a manual run" probe="Readiness" Mar 13 14:21:33 crc kubenswrapper[4898]: I0313 14:21:33.588964 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.164505 4898 generic.go:334] "Generic (PLEG): container finished" podID="247749ae-204b-4e9c-ad1c-f5d924b6f211" containerID="9dee453ab58c346884f20f75be928701596b9ae0bdbbb025979e1e2daaa907c1" exitCode=0 Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.164888 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"247749ae-204b-4e9c-ad1c-f5d924b6f211","Type":"ContainerDied","Data":"9dee453ab58c346884f20f75be928701596b9ae0bdbbb025979e1e2daaa907c1"} Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.165497 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"247749ae-204b-4e9c-ad1c-f5d924b6f211","Type":"ContainerDied","Data":"5fd3c6beb630c0a00d78ffbb0eaf96e2717f61918bd632f3618c99bd721d1714"} Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.165525 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fd3c6beb630c0a00d78ffbb0eaf96e2717f61918bd632f3618c99bd721d1714" Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.271686 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.339411 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-config-data\") pod \"247749ae-204b-4e9c-ad1c-f5d924b6f211\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.339732 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffnxt\" (UniqueName: \"kubernetes.io/projected/247749ae-204b-4e9c-ad1c-f5d924b6f211-kube-api-access-ffnxt\") pod \"247749ae-204b-4e9c-ad1c-f5d924b6f211\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.339853 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/247749ae-204b-4e9c-ad1c-f5d924b6f211-run-httpd\") pod \"247749ae-204b-4e9c-ad1c-f5d924b6f211\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.340021 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-sg-core-conf-yaml\") pod \"247749ae-204b-4e9c-ad1c-f5d924b6f211\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.340130 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-scripts\") pod \"247749ae-204b-4e9c-ad1c-f5d924b6f211\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.340317 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-combined-ca-bundle\") pod \"247749ae-204b-4e9c-ad1c-f5d924b6f211\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.340421 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/247749ae-204b-4e9c-ad1c-f5d924b6f211-log-httpd\") pod \"247749ae-204b-4e9c-ad1c-f5d924b6f211\" (UID: \"247749ae-204b-4e9c-ad1c-f5d924b6f211\") " Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.340332 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/247749ae-204b-4e9c-ad1c-f5d924b6f211-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "247749ae-204b-4e9c-ad1c-f5d924b6f211" (UID: "247749ae-204b-4e9c-ad1c-f5d924b6f211"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.341347 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/247749ae-204b-4e9c-ad1c-f5d924b6f211-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "247749ae-204b-4e9c-ad1c-f5d924b6f211" (UID: "247749ae-204b-4e9c-ad1c-f5d924b6f211"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.345867 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/247749ae-204b-4e9c-ad1c-f5d924b6f211-kube-api-access-ffnxt" (OuterVolumeSpecName: "kube-api-access-ffnxt") pod "247749ae-204b-4e9c-ad1c-f5d924b6f211" (UID: "247749ae-204b-4e9c-ad1c-f5d924b6f211"). InnerVolumeSpecName "kube-api-access-ffnxt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.361160 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-scripts" (OuterVolumeSpecName: "scripts") pod "247749ae-204b-4e9c-ad1c-f5d924b6f211" (UID: "247749ae-204b-4e9c-ad1c-f5d924b6f211"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.395868 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "247749ae-204b-4e9c-ad1c-f5d924b6f211" (UID: "247749ae-204b-4e9c-ad1c-f5d924b6f211"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.425432 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "247749ae-204b-4e9c-ad1c-f5d924b6f211" (UID: "247749ae-204b-4e9c-ad1c-f5d924b6f211"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.444462 4898 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.444503 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.444513 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.444521 4898 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/247749ae-204b-4e9c-ad1c-f5d924b6f211-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.444530 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffnxt\" (UniqueName: \"kubernetes.io/projected/247749ae-204b-4e9c-ad1c-f5d924b6f211-kube-api-access-ffnxt\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.444541 4898 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/247749ae-204b-4e9c-ad1c-f5d924b6f211-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.475784 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-config-data" (OuterVolumeSpecName: "config-data") pod "247749ae-204b-4e9c-ad1c-f5d924b6f211" (UID: "247749ae-204b-4e9c-ad1c-f5d924b6f211"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:35 crc kubenswrapper[4898]: I0313 14:21:35.546346 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/247749ae-204b-4e9c-ad1c-f5d924b6f211-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.098557 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.194543 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.298169 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.323953 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.342299 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:21:36 crc kubenswrapper[4898]: E0313 14:21:36.342926 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="247749ae-204b-4e9c-ad1c-f5d924b6f211" containerName="sg-core" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.342951 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="247749ae-204b-4e9c-ad1c-f5d924b6f211" containerName="sg-core" Mar 13 14:21:36 crc kubenswrapper[4898]: E0313 14:21:36.342990 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="247749ae-204b-4e9c-ad1c-f5d924b6f211" containerName="proxy-httpd" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.342999 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="247749ae-204b-4e9c-ad1c-f5d924b6f211" containerName="proxy-httpd" Mar 13 14:21:36 crc kubenswrapper[4898]: E0313 14:21:36.343033 4898 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="247749ae-204b-4e9c-ad1c-f5d924b6f211" containerName="ceilometer-notification-agent" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.343043 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="247749ae-204b-4e9c-ad1c-f5d924b6f211" containerName="ceilometer-notification-agent" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.343296 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="247749ae-204b-4e9c-ad1c-f5d924b6f211" containerName="ceilometer-notification-agent" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.343333 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="247749ae-204b-4e9c-ad1c-f5d924b6f211" containerName="sg-core" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.343364 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="247749ae-204b-4e9c-ad1c-f5d924b6f211" containerName="proxy-httpd" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.345592 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.348385 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.348396 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.359270 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.474285 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86c6c495-884b-4c92-949f-0159eb17e6a5-run-httpd\") pod \"ceilometer-0\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.474439 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tvsh\" (UniqueName: \"kubernetes.io/projected/86c6c495-884b-4c92-949f-0159eb17e6a5-kube-api-access-8tvsh\") pod \"ceilometer-0\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.474496 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86c6c495-884b-4c92-949f-0159eb17e6a5-log-httpd\") pod \"ceilometer-0\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.474538 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"86c6c495-884b-4c92-949f-0159eb17e6a5\") " pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.474608 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.474689 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-scripts\") pod \"ceilometer-0\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.474728 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-config-data\") pod \"ceilometer-0\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.576470 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-config-data\") pod \"ceilometer-0\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.576605 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86c6c495-884b-4c92-949f-0159eb17e6a5-run-httpd\") pod \"ceilometer-0\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.576711 4898 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-8tvsh\" (UniqueName: \"kubernetes.io/projected/86c6c495-884b-4c92-949f-0159eb17e6a5-kube-api-access-8tvsh\") pod \"ceilometer-0\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.576747 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86c6c495-884b-4c92-949f-0159eb17e6a5-log-httpd\") pod \"ceilometer-0\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.576781 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.576810 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.576874 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-scripts\") pod \"ceilometer-0\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.577732 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86c6c495-884b-4c92-949f-0159eb17e6a5-run-httpd\") pod \"ceilometer-0\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " pod="openstack/ceilometer-0" Mar 
13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.579920 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86c6c495-884b-4c92-949f-0159eb17e6a5-log-httpd\") pod \"ceilometer-0\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.583117 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.588653 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-scripts\") pod \"ceilometer-0\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.589162 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-config-data\") pod \"ceilometer-0\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.589720 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.611189 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tvsh\" (UniqueName: \"kubernetes.io/projected/86c6c495-884b-4c92-949f-0159eb17e6a5-kube-api-access-8tvsh\") pod \"ceilometer-0\" 
(UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " pod="openstack/ceilometer-0" Mar 13 14:21:36 crc kubenswrapper[4898]: I0313 14:21:36.674163 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.049771 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-f97c64464-wmnph" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.331694 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-8495ffcdcc-j7d29"] Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.332092 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-8495ffcdcc-j7d29" podUID="194cc0b9-5fb1-492c-9df1-002f629cfb90" containerName="neutron-api" containerID="cri-o://0d21f5d009c2fbb3d9136543ddc9edaf66018231eb09d0ea5bf4fa35c2144f9b" gracePeriod=30 Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.332820 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-8495ffcdcc-j7d29" podUID="194cc0b9-5fb1-492c-9df1-002f629cfb90" containerName="neutron-httpd" containerID="cri-o://0cd127d8d8dce19d301ba8f94cb9ff0fc6150499598490bd11d503771e98d4fc" gracePeriod=30 Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.366619 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.396971 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-776df44c77-g64lv"] Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.400757 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.410923 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-776df44c77-g64lv"] Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.441884 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-8495ffcdcc-j7d29" podUID="194cc0b9-5fb1-492c-9df1-002f629cfb90" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.205:9696/\": read tcp 10.217.0.2:55216->10.217.0.205:9696: read: connection reset by peer" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.516786 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a679fb4-8d85-4835-a048-08c4b61aa158-ovndb-tls-certs\") pod \"neutron-776df44c77-g64lv\" (UID: \"4a679fb4-8d85-4835-a048-08c4b61aa158\") " pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.516844 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4a679fb4-8d85-4835-a048-08c4b61aa158-httpd-config\") pod \"neutron-776df44c77-g64lv\" (UID: \"4a679fb4-8d85-4835-a048-08c4b61aa158\") " pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.516910 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a679fb4-8d85-4835-a048-08c4b61aa158-internal-tls-certs\") pod \"neutron-776df44c77-g64lv\" (UID: \"4a679fb4-8d85-4835-a048-08c4b61aa158\") " pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.516944 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/4a679fb4-8d85-4835-a048-08c4b61aa158-public-tls-certs\") pod \"neutron-776df44c77-g64lv\" (UID: \"4a679fb4-8d85-4835-a048-08c4b61aa158\") " pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.517036 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a679fb4-8d85-4835-a048-08c4b61aa158-combined-ca-bundle\") pod \"neutron-776df44c77-g64lv\" (UID: \"4a679fb4-8d85-4835-a048-08c4b61aa158\") " pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.517063 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4a679fb4-8d85-4835-a048-08c4b61aa158-config\") pod \"neutron-776df44c77-g64lv\" (UID: \"4a679fb4-8d85-4835-a048-08c4b61aa158\") " pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.517088 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d79cj\" (UniqueName: \"kubernetes.io/projected/4a679fb4-8d85-4835-a048-08c4b61aa158-kube-api-access-d79cj\") pod \"neutron-776df44c77-g64lv\" (UID: \"4a679fb4-8d85-4835-a048-08c4b61aa158\") " pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.619381 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a679fb4-8d85-4835-a048-08c4b61aa158-public-tls-certs\") pod \"neutron-776df44c77-g64lv\" (UID: \"4a679fb4-8d85-4835-a048-08c4b61aa158\") " pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.619529 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4a679fb4-8d85-4835-a048-08c4b61aa158-combined-ca-bundle\") pod \"neutron-776df44c77-g64lv\" (UID: \"4a679fb4-8d85-4835-a048-08c4b61aa158\") " pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.619570 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4a679fb4-8d85-4835-a048-08c4b61aa158-config\") pod \"neutron-776df44c77-g64lv\" (UID: \"4a679fb4-8d85-4835-a048-08c4b61aa158\") " pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.619602 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d79cj\" (UniqueName: \"kubernetes.io/projected/4a679fb4-8d85-4835-a048-08c4b61aa158-kube-api-access-d79cj\") pod \"neutron-776df44c77-g64lv\" (UID: \"4a679fb4-8d85-4835-a048-08c4b61aa158\") " pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.619665 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a679fb4-8d85-4835-a048-08c4b61aa158-ovndb-tls-certs\") pod \"neutron-776df44c77-g64lv\" (UID: \"4a679fb4-8d85-4835-a048-08c4b61aa158\") " pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.619695 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4a679fb4-8d85-4835-a048-08c4b61aa158-httpd-config\") pod \"neutron-776df44c77-g64lv\" (UID: \"4a679fb4-8d85-4835-a048-08c4b61aa158\") " pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.619741 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a679fb4-8d85-4835-a048-08c4b61aa158-internal-tls-certs\") pod 
\"neutron-776df44c77-g64lv\" (UID: \"4a679fb4-8d85-4835-a048-08c4b61aa158\") " pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.628881 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a679fb4-8d85-4835-a048-08c4b61aa158-internal-tls-certs\") pod \"neutron-776df44c77-g64lv\" (UID: \"4a679fb4-8d85-4835-a048-08c4b61aa158\") " pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.630558 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a679fb4-8d85-4835-a048-08c4b61aa158-ovndb-tls-certs\") pod \"neutron-776df44c77-g64lv\" (UID: \"4a679fb4-8d85-4835-a048-08c4b61aa158\") " pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.631397 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a679fb4-8d85-4835-a048-08c4b61aa158-combined-ca-bundle\") pod \"neutron-776df44c77-g64lv\" (UID: \"4a679fb4-8d85-4835-a048-08c4b61aa158\") " pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.631731 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a679fb4-8d85-4835-a048-08c4b61aa158-public-tls-certs\") pod \"neutron-776df44c77-g64lv\" (UID: \"4a679fb4-8d85-4835-a048-08c4b61aa158\") " pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.633072 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4a679fb4-8d85-4835-a048-08c4b61aa158-httpd-config\") pod \"neutron-776df44c77-g64lv\" (UID: \"4a679fb4-8d85-4835-a048-08c4b61aa158\") " pod="openstack/neutron-776df44c77-g64lv" Mar 13 
14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.633646 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4a679fb4-8d85-4835-a048-08c4b61aa158-config\") pod \"neutron-776df44c77-g64lv\" (UID: \"4a679fb4-8d85-4835-a048-08c4b61aa158\") " pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.647874 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d79cj\" (UniqueName: \"kubernetes.io/projected/4a679fb4-8d85-4835-a048-08c4b61aa158-kube-api-access-d79cj\") pod \"neutron-776df44c77-g64lv\" (UID: \"4a679fb4-8d85-4835-a048-08c4b61aa158\") " pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.763809 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:37 crc kubenswrapper[4898]: I0313 14:21:37.764475 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="247749ae-204b-4e9c-ad1c-f5d924b6f211" path="/var/lib/kubelet/pods/247749ae-204b-4e9c-ad1c-f5d924b6f211/volumes" Mar 13 14:21:38 crc kubenswrapper[4898]: I0313 14:21:38.256062 4898 generic.go:334] "Generic (PLEG): container finished" podID="194cc0b9-5fb1-492c-9df1-002f629cfb90" containerID="0cd127d8d8dce19d301ba8f94cb9ff0fc6150499598490bd11d503771e98d4fc" exitCode=0 Mar 13 14:21:38 crc kubenswrapper[4898]: I0313 14:21:38.256127 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8495ffcdcc-j7d29" event={"ID":"194cc0b9-5fb1-492c-9df1-002f629cfb90","Type":"ContainerDied","Data":"0cd127d8d8dce19d301ba8f94cb9ff0fc6150499598490bd11d503771e98d4fc"} Mar 13 14:21:38 crc kubenswrapper[4898]: I0313 14:21:38.278180 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"86c6c495-884b-4c92-949f-0159eb17e6a5","Type":"ContainerStarted","Data":"2b43c112fe6b642ffc81d63835d5208293191491d23e2c20e5ef660540956b7c"} Mar 13 14:21:38 crc kubenswrapper[4898]: I0313 14:21:38.280987 4898 generic.go:334] "Generic (PLEG): container finished" podID="84a7fd24-4320-4c0e-8ded-0d455252a549" containerID="213db1fb491a2ed6d8dc3d15759978456b725ffef29e1be38661bf279db1daf8" exitCode=0 Mar 13 14:21:38 crc kubenswrapper[4898]: I0313 14:21:38.281053 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-zgt75" event={"ID":"84a7fd24-4320-4c0e-8ded-0d455252a549","Type":"ContainerDied","Data":"213db1fb491a2ed6d8dc3d15759978456b725ffef29e1be38661bf279db1daf8"} Mar 13 14:21:38 crc kubenswrapper[4898]: I0313 14:21:38.472209 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:38 crc kubenswrapper[4898]: I0313 14:21:38.655629 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-7gvqf"] Mar 13 14:21:38 crc kubenswrapper[4898]: I0313 14:21:38.666115 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" podUID="da9c8289-b4cc-4259-a94e-fab15f437c67" containerName="dnsmasq-dns" containerID="cri-o://ec23679d2099538d875f32f1740477e86fa4744d0786a72fbd40a7500fbf13f8" gracePeriod=10 Mar 13 14:21:38 crc kubenswrapper[4898]: I0313 14:21:38.700977 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-776df44c77-g64lv"] Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.322075 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-776df44c77-g64lv" event={"ID":"4a679fb4-8d85-4835-a048-08c4b61aa158","Type":"ContainerStarted","Data":"de506e417eaaea0c36429e77dfadc3b7be94f0b45c620428caf6a91bf38f1094"} Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.322690 4898 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/neutron-776df44c77-g64lv" event={"ID":"4a679fb4-8d85-4835-a048-08c4b61aa158","Type":"ContainerStarted","Data":"b15282fd3af93bf6b4e8c3ab1dabdabfdbf27142041e4ad1d14fe6b0ae8e170d"} Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.347318 4898 generic.go:334] "Generic (PLEG): container finished" podID="da9c8289-b4cc-4259-a94e-fab15f437c67" containerID="ec23679d2099538d875f32f1740477e86fa4744d0786a72fbd40a7500fbf13f8" exitCode=0 Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.347435 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" event={"ID":"da9c8289-b4cc-4259-a94e-fab15f437c67","Type":"ContainerDied","Data":"ec23679d2099538d875f32f1740477e86fa4744d0786a72fbd40a7500fbf13f8"} Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.360062 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86c6c495-884b-4c92-949f-0159eb17e6a5","Type":"ContainerStarted","Data":"b62d7bd0ca3497c43d915b6212935946bc82ac1a3defe8c89eeb3779d6ce9770"} Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.452602 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.477395 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-dns-svc\") pod \"da9c8289-b4cc-4259-a94e-fab15f437c67\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.477682 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5758\" (UniqueName: \"kubernetes.io/projected/da9c8289-b4cc-4259-a94e-fab15f437c67-kube-api-access-q5758\") pod \"da9c8289-b4cc-4259-a94e-fab15f437c67\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.477799 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-dns-swift-storage-0\") pod \"da9c8289-b4cc-4259-a94e-fab15f437c67\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.477849 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-config\") pod \"da9c8289-b4cc-4259-a94e-fab15f437c67\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.481154 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-ovsdbserver-sb\") pod \"da9c8289-b4cc-4259-a94e-fab15f437c67\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.481236 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-ovsdbserver-nb\") pod \"da9c8289-b4cc-4259-a94e-fab15f437c67\" (UID: \"da9c8289-b4cc-4259-a94e-fab15f437c67\") " Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.485979 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da9c8289-b4cc-4259-a94e-fab15f437c67-kube-api-access-q5758" (OuterVolumeSpecName: "kube-api-access-q5758") pod "da9c8289-b4cc-4259-a94e-fab15f437c67" (UID: "da9c8289-b4cc-4259-a94e-fab15f437c67"). InnerVolumeSpecName "kube-api-access-q5758". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.599795 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5758\" (UniqueName: \"kubernetes.io/projected/da9c8289-b4cc-4259-a94e-fab15f437c67-kube-api-access-q5758\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.609569 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-8495ffcdcc-j7d29" podUID="194cc0b9-5fb1-492c-9df1-002f629cfb90" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.205:9696/\": dial tcp 10.217.0.205:9696: connect: connection refused" Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.665563 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "da9c8289-b4cc-4259-a94e-fab15f437c67" (UID: "da9c8289-b4cc-4259-a94e-fab15f437c67"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.678442 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "da9c8289-b4cc-4259-a94e-fab15f437c67" (UID: "da9c8289-b4cc-4259-a94e-fab15f437c67"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.726228 4898 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.726272 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.766630 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "da9c8289-b4cc-4259-a94e-fab15f437c67" (UID: "da9c8289-b4cc-4259-a94e-fab15f437c67"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.769701 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "da9c8289-b4cc-4259-a94e-fab15f437c67" (UID: "da9c8289-b4cc-4259-a94e-fab15f437c67"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.828952 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.828984 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.915190 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-config" (OuterVolumeSpecName: "config") pod "da9c8289-b4cc-4259-a94e-fab15f437c67" (UID: "da9c8289-b4cc-4259-a94e-fab15f437c67"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:21:39 crc kubenswrapper[4898]: I0313 14:21:39.933921 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da9c8289-b4cc-4259-a94e-fab15f437c67-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.232768 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-zgt75" Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.257636 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.348819 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28mft\" (UniqueName: \"kubernetes.io/projected/84a7fd24-4320-4c0e-8ded-0d455252a549-kube-api-access-28mft\") pod \"84a7fd24-4320-4c0e-8ded-0d455252a549\" (UID: \"84a7fd24-4320-4c0e-8ded-0d455252a549\") " Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.348891 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84a7fd24-4320-4c0e-8ded-0d455252a549-combined-ca-bundle\") pod \"84a7fd24-4320-4c0e-8ded-0d455252a549\" (UID: \"84a7fd24-4320-4c0e-8ded-0d455252a549\") " Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.349023 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84a7fd24-4320-4c0e-8ded-0d455252a549-config-data\") pod \"84a7fd24-4320-4c0e-8ded-0d455252a549\" (UID: \"84a7fd24-4320-4c0e-8ded-0d455252a549\") " Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.490134 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84a7fd24-4320-4c0e-8ded-0d455252a549-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84a7fd24-4320-4c0e-8ded-0d455252a549" (UID: "84a7fd24-4320-4c0e-8ded-0d455252a549"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.497098 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" event={"ID":"da9c8289-b4cc-4259-a94e-fab15f437c67","Type":"ContainerDied","Data":"2e5e30a7ea9bdd40efe150ba12afa0ed754dfac9e9a1ebb3308dac4a35e5fade"} Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.497403 4898 scope.go:117] "RemoveContainer" containerID="ec23679d2099538d875f32f1740477e86fa4744d0786a72fbd40a7500fbf13f8" Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.497647 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-7gvqf" Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.499130 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84a7fd24-4320-4c0e-8ded-0d455252a549-kube-api-access-28mft" (OuterVolumeSpecName: "kube-api-access-28mft") pod "84a7fd24-4320-4c0e-8ded-0d455252a549" (UID: "84a7fd24-4320-4c0e-8ded-0d455252a549"). InnerVolumeSpecName "kube-api-access-28mft". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.525459 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-zgt75" event={"ID":"84a7fd24-4320-4c0e-8ded-0d455252a549","Type":"ContainerDied","Data":"ff8a73d5234eb1ed4542baaf925a5bae9eff511012c73d58ac5c330a7c07d613"} Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.525537 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff8a73d5234eb1ed4542baaf925a5bae9eff511012c73d58ac5c330a7c07d613" Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.525659 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-zgt75" Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.562840 4898 generic.go:334] "Generic (PLEG): container finished" podID="194cc0b9-5fb1-492c-9df1-002f629cfb90" containerID="0d21f5d009c2fbb3d9136543ddc9edaf66018231eb09d0ea5bf4fa35c2144f9b" exitCode=0 Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.563581 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8495ffcdcc-j7d29" event={"ID":"194cc0b9-5fb1-492c-9df1-002f629cfb90","Type":"ContainerDied","Data":"0d21f5d009c2fbb3d9136543ddc9edaf66018231eb09d0ea5bf4fa35c2144f9b"} Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.566776 4898 scope.go:117] "RemoveContainer" containerID="ebec6588ea54fc0e12abfc618cba17e32b0384e26af2c4dc5438fd6e04229c34" Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.582041 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28mft\" (UniqueName: \"kubernetes.io/projected/84a7fd24-4320-4c0e-8ded-0d455252a549-kube-api-access-28mft\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.582087 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84a7fd24-4320-4c0e-8ded-0d455252a549-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.589382 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-7gvqf"] Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.593762 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-776df44c77-g64lv" event={"ID":"4a679fb4-8d85-4835-a048-08c4b61aa158","Type":"ContainerStarted","Data":"3df3fcd326f6b287405d8ae708899fcbf8ba1334da4351d589c918d2c483aab6"} Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.595592 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.643825 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-7gvqf"] Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.652053 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-z7ldc" podUID="b38f3681-6f2f-437f-9694-810d43921aa2" containerName="registry-server" probeResult="failure" output=< Mar 13 14:21:40 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 14:21:40 crc kubenswrapper[4898]: > Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.657272 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-776df44c77-g64lv" podStartSLOduration=3.657255404 podStartE2EDuration="3.657255404s" podCreationTimestamp="2026-03-13 14:21:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:21:40.619815242 +0000 UTC m=+1535.621403491" watchObservedRunningTime="2026-03-13 14:21:40.657255404 +0000 UTC m=+1535.658843633" Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.699190 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-b9dc95d4b-bvhlz" Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.791923 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84a7fd24-4320-4c0e-8ded-0d455252a549-config-data" (OuterVolumeSpecName: "config-data") pod "84a7fd24-4320-4c0e-8ded-0d455252a549" (UID: "84a7fd24-4320-4c0e-8ded-0d455252a549"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.801032 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84a7fd24-4320-4c0e-8ded-0d455252a549-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.815924 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-d74d977fd-v5m5s"] Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.816142 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-d74d977fd-v5m5s" podUID="7dcea9de-db8a-42dd-958c-59df43a49ff3" containerName="barbican-api-log" containerID="cri-o://8d6bd5023b5a1087811735b70ac1c3323bdc9e802f224e7ede20161093a84221" gracePeriod=30 Mar 13 14:21:40 crc kubenswrapper[4898]: I0313 14:21:40.816633 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-d74d977fd-v5m5s" podUID="7dcea9de-db8a-42dd-958c-59df43a49ff3" containerName="barbican-api" containerID="cri-o://e2a10450307b6355906b72fd4b0a882c5720f8e92bcad91beb1384ffe656972d" gracePeriod=30 Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.056797 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.212757 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-httpd-config\") pod \"194cc0b9-5fb1-492c-9df1-002f629cfb90\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.212845 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-public-tls-certs\") pod \"194cc0b9-5fb1-492c-9df1-002f629cfb90\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.213140 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-config\") pod \"194cc0b9-5fb1-492c-9df1-002f629cfb90\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.213210 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-internal-tls-certs\") pod \"194cc0b9-5fb1-492c-9df1-002f629cfb90\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.213295 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-combined-ca-bundle\") pod \"194cc0b9-5fb1-492c-9df1-002f629cfb90\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.213375 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvf8x\" 
(UniqueName: \"kubernetes.io/projected/194cc0b9-5fb1-492c-9df1-002f629cfb90-kube-api-access-pvf8x\") pod \"194cc0b9-5fb1-492c-9df1-002f629cfb90\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.213538 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-ovndb-tls-certs\") pod \"194cc0b9-5fb1-492c-9df1-002f629cfb90\" (UID: \"194cc0b9-5fb1-492c-9df1-002f629cfb90\") " Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.229663 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "194cc0b9-5fb1-492c-9df1-002f629cfb90" (UID: "194cc0b9-5fb1-492c-9df1-002f629cfb90"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.237643 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/194cc0b9-5fb1-492c-9df1-002f629cfb90-kube-api-access-pvf8x" (OuterVolumeSpecName: "kube-api-access-pvf8x") pod "194cc0b9-5fb1-492c-9df1-002f629cfb90" (UID: "194cc0b9-5fb1-492c-9df1-002f629cfb90"). InnerVolumeSpecName "kube-api-access-pvf8x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.317881 4898 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.317946 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvf8x\" (UniqueName: \"kubernetes.io/projected/194cc0b9-5fb1-492c-9df1-002f629cfb90-kube-api-access-pvf8x\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.330125 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "194cc0b9-5fb1-492c-9df1-002f629cfb90" (UID: "194cc0b9-5fb1-492c-9df1-002f629cfb90"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.333008 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "194cc0b9-5fb1-492c-9df1-002f629cfb90" (UID: "194cc0b9-5fb1-492c-9df1-002f629cfb90"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.363497 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "194cc0b9-5fb1-492c-9df1-002f629cfb90" (UID: "194cc0b9-5fb1-492c-9df1-002f629cfb90"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.392129 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-config" (OuterVolumeSpecName: "config") pod "194cc0b9-5fb1-492c-9df1-002f629cfb90" (UID: "194cc0b9-5fb1-492c-9df1-002f629cfb90"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.421881 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.422705 4898 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.422729 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.422741 4898 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.523076 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "194cc0b9-5fb1-492c-9df1-002f629cfb90" (UID: "194cc0b9-5fb1-492c-9df1-002f629cfb90"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.525048 4898 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/194cc0b9-5fb1-492c-9df1-002f629cfb90-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.622522 4898 generic.go:334] "Generic (PLEG): container finished" podID="193b05da-acb9-4512-a2ae-6c03450e6f05" containerID="c911be7c2d6d8f32598481ced3a29ce9fc65efe653b0696b937918e79b814d51" exitCode=0 Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.622659 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-ztp6c" event={"ID":"193b05da-acb9-4512-a2ae-6c03450e6f05","Type":"ContainerDied","Data":"c911be7c2d6d8f32598481ced3a29ce9fc65efe653b0696b937918e79b814d51"} Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.628105 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8495ffcdcc-j7d29" event={"ID":"194cc0b9-5fb1-492c-9df1-002f629cfb90","Type":"ContainerDied","Data":"675cba3392edee8aa8b36b03aeac2453edffb374fe9e8c521c269c0464cb1478"} Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.628177 4898 scope.go:117] "RemoveContainer" containerID="0cd127d8d8dce19d301ba8f94cb9ff0fc6150499598490bd11d503771e98d4fc" Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.628335 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8495ffcdcc-j7d29" Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.667688 4898 generic.go:334] "Generic (PLEG): container finished" podID="7dcea9de-db8a-42dd-958c-59df43a49ff3" containerID="8d6bd5023b5a1087811735b70ac1c3323bdc9e802f224e7ede20161093a84221" exitCode=143 Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.667756 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d74d977fd-v5m5s" event={"ID":"7dcea9de-db8a-42dd-958c-59df43a49ff3","Type":"ContainerDied","Data":"8d6bd5023b5a1087811735b70ac1c3323bdc9e802f224e7ede20161093a84221"} Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.718333 4898 scope.go:117] "RemoveContainer" containerID="0d21f5d009c2fbb3d9136543ddc9edaf66018231eb09d0ea5bf4fa35c2144f9b" Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.718725 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-8495ffcdcc-j7d29"] Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.719800 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86c6c495-884b-4c92-949f-0159eb17e6a5","Type":"ContainerStarted","Data":"ba94a825cfb36ee16c3e15907274f9276083ba448d310d471374f19c54cc116c"} Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.719947 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86c6c495-884b-4c92-949f-0159eb17e6a5","Type":"ContainerStarted","Data":"cea1936f2758016544cbefa24e4ca686c3e33acfdaf019898c501a90320d0242"} Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.812254 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da9c8289-b4cc-4259-a94e-fab15f437c67" path="/var/lib/kubelet/pods/da9c8289-b4cc-4259-a94e-fab15f437c67/volumes" Mar 13 14:21:41 crc kubenswrapper[4898]: I0313 14:21:41.813620 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-8495ffcdcc-j7d29"] Mar 13 14:21:43 
crc kubenswrapper[4898]: I0313 14:21:43.328302 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.406932 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-config-data\") pod \"193b05da-acb9-4512-a2ae-6c03450e6f05\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.407019 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-db-sync-config-data\") pod \"193b05da-acb9-4512-a2ae-6c03450e6f05\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.407166 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/193b05da-acb9-4512-a2ae-6c03450e6f05-etc-machine-id\") pod \"193b05da-acb9-4512-a2ae-6c03450e6f05\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.407195 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-scripts\") pod \"193b05da-acb9-4512-a2ae-6c03450e6f05\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.407222 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-combined-ca-bundle\") pod \"193b05da-acb9-4512-a2ae-6c03450e6f05\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.407247 4898 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82hs7\" (UniqueName: \"kubernetes.io/projected/193b05da-acb9-4512-a2ae-6c03450e6f05-kube-api-access-82hs7\") pod \"193b05da-acb9-4512-a2ae-6c03450e6f05\" (UID: \"193b05da-acb9-4512-a2ae-6c03450e6f05\") " Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.407893 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/193b05da-acb9-4512-a2ae-6c03450e6f05-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "193b05da-acb9-4512-a2ae-6c03450e6f05" (UID: "193b05da-acb9-4512-a2ae-6c03450e6f05"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.415067 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-scripts" (OuterVolumeSpecName: "scripts") pod "193b05da-acb9-4512-a2ae-6c03450e6f05" (UID: "193b05da-acb9-4512-a2ae-6c03450e6f05"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.419028 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "193b05da-acb9-4512-a2ae-6c03450e6f05" (UID: "193b05da-acb9-4512-a2ae-6c03450e6f05"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.428160 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/193b05da-acb9-4512-a2ae-6c03450e6f05-kube-api-access-82hs7" (OuterVolumeSpecName: "kube-api-access-82hs7") pod "193b05da-acb9-4512-a2ae-6c03450e6f05" (UID: "193b05da-acb9-4512-a2ae-6c03450e6f05"). 
InnerVolumeSpecName "kube-api-access-82hs7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.465571 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "193b05da-acb9-4512-a2ae-6c03450e6f05" (UID: "193b05da-acb9-4512-a2ae-6c03450e6f05"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.509779 4898 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/193b05da-acb9-4512-a2ae-6c03450e6f05-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.510133 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.510145 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.510154 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82hs7\" (UniqueName: \"kubernetes.io/projected/193b05da-acb9-4512-a2ae-6c03450e6f05-kube-api-access-82hs7\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.510165 4898 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.525025 4898 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-config-data" (OuterVolumeSpecName: "config-data") pod "193b05da-acb9-4512-a2ae-6c03450e6f05" (UID: "193b05da-acb9-4512-a2ae-6c03450e6f05"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.612667 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/193b05da-acb9-4512-a2ae-6c03450e6f05-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.762176 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="194cc0b9-5fb1-492c-9df1-002f629cfb90" path="/var/lib/kubelet/pods/194cc0b9-5fb1-492c-9df1-002f629cfb90/volumes" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.766841 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86c6c495-884b-4c92-949f-0159eb17e6a5","Type":"ContainerStarted","Data":"a55576c9a44e83505bf8757afc0e1e19424b4717e80f08c508a794c81f2cfdb0"} Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.769009 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.771691 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-ztp6c" event={"ID":"193b05da-acb9-4512-a2ae-6c03450e6f05","Type":"ContainerDied","Data":"bfe3b6cf0e5928312929ea860aeb7b7f643553f3479a3beac4f364f3ff4502ae"} Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.771722 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfe3b6cf0e5928312929ea860aeb7b7f643553f3479a3beac4f364f3ff4502ae" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.771776 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-ztp6c" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.803456 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.017630371 podStartE2EDuration="7.803431187s" podCreationTimestamp="2026-03-13 14:21:36 +0000 UTC" firstStartedPulling="2026-03-13 14:21:37.372052012 +0000 UTC m=+1532.373640251" lastFinishedPulling="2026-03-13 14:21:43.157852838 +0000 UTC m=+1538.159441067" observedRunningTime="2026-03-13 14:21:43.797346459 +0000 UTC m=+1538.798934708" watchObservedRunningTime="2026-03-13 14:21:43.803431187 +0000 UTC m=+1538.805019426" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.949433 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 14:21:43 crc kubenswrapper[4898]: E0313 14:21:43.950219 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="194cc0b9-5fb1-492c-9df1-002f629cfb90" containerName="neutron-api" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.950246 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="194cc0b9-5fb1-492c-9df1-002f629cfb90" containerName="neutron-api" Mar 13 14:21:43 crc kubenswrapper[4898]: E0313 14:21:43.950268 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84a7fd24-4320-4c0e-8ded-0d455252a549" containerName="heat-db-sync" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.950279 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="84a7fd24-4320-4c0e-8ded-0d455252a549" containerName="heat-db-sync" Mar 13 14:21:43 crc kubenswrapper[4898]: E0313 14:21:43.950307 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="194cc0b9-5fb1-492c-9df1-002f629cfb90" containerName="neutron-httpd" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.950314 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="194cc0b9-5fb1-492c-9df1-002f629cfb90" containerName="neutron-httpd" Mar 
13 14:21:43 crc kubenswrapper[4898]: E0313 14:21:43.950340 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da9c8289-b4cc-4259-a94e-fab15f437c67" containerName="init" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.950359 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="da9c8289-b4cc-4259-a94e-fab15f437c67" containerName="init" Mar 13 14:21:43 crc kubenswrapper[4898]: E0313 14:21:43.950382 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="193b05da-acb9-4512-a2ae-6c03450e6f05" containerName="cinder-db-sync" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.950389 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="193b05da-acb9-4512-a2ae-6c03450e6f05" containerName="cinder-db-sync" Mar 13 14:21:43 crc kubenswrapper[4898]: E0313 14:21:43.950407 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da9c8289-b4cc-4259-a94e-fab15f437c67" containerName="dnsmasq-dns" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.950414 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="da9c8289-b4cc-4259-a94e-fab15f437c67" containerName="dnsmasq-dns" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.950708 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="194cc0b9-5fb1-492c-9df1-002f629cfb90" containerName="neutron-api" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.950734 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="84a7fd24-4320-4c0e-8ded-0d455252a549" containerName="heat-db-sync" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.950747 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="194cc0b9-5fb1-492c-9df1-002f629cfb90" containerName="neutron-httpd" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.950767 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="da9c8289-b4cc-4259-a94e-fab15f437c67" containerName="dnsmasq-dns" Mar 13 14:21:43 crc kubenswrapper[4898]: 
I0313 14:21:43.950779 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="193b05da-acb9-4512-a2ae-6c03450e6f05" containerName="cinder-db-sync" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.952514 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.956018 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-dcn2n" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.956314 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.956502 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.959697 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 13 14:21:43 crc kubenswrapper[4898]: I0313 14:21:43.988198 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.028007 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-scripts\") pod \"cinder-scheduler-0\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") " pod="openstack/cinder-scheduler-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.028142 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") " pod="openstack/cinder-scheduler-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.028199 4898 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") " pod="openstack/cinder-scheduler-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.028232 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") " pod="openstack/cinder-scheduler-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.028339 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wbnw\" (UniqueName: \"kubernetes.io/projected/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-kube-api-access-8wbnw\") pod \"cinder-scheduler-0\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") " pod="openstack/cinder-scheduler-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.028370 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-config-data\") pod \"cinder-scheduler-0\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") " pod="openstack/cinder-scheduler-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.047108 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-gtnnh"] Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.049988 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.083697 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-gtnnh"] Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.173249 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") " pod="openstack/cinder-scheduler-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.174385 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-config\") pod \"dnsmasq-dns-5c9776ccc5-gtnnh\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.174670 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-gtnnh\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.174732 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wbnw\" (UniqueName: \"kubernetes.io/projected/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-kube-api-access-8wbnw\") pod \"cinder-scheduler-0\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") " pod="openstack/cinder-scheduler-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.174765 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-config-data\") pod \"cinder-scheduler-0\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") " pod="openstack/cinder-scheduler-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.175007 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-scripts\") pod \"cinder-scheduler-0\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") " pod="openstack/cinder-scheduler-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.175104 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktnr4\" (UniqueName: \"kubernetes.io/projected/99ea68d3-f555-4779-90d0-d1f136ddadd2-kube-api-access-ktnr4\") pod \"dnsmasq-dns-5c9776ccc5-gtnnh\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.175188 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-gtnnh\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.175293 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") " pod="openstack/cinder-scheduler-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.175412 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-gtnnh\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.175454 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") " pod="openstack/cinder-scheduler-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.175491 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-gtnnh\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.176029 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") " pod="openstack/cinder-scheduler-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.218072 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") " pod="openstack/cinder-scheduler-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.223785 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") " pod="openstack/cinder-scheduler-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.254807 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wbnw\" (UniqueName: \"kubernetes.io/projected/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-kube-api-access-8wbnw\") pod \"cinder-scheduler-0\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") " pod="openstack/cinder-scheduler-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.255573 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-config-data\") pod \"cinder-scheduler-0\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") " pod="openstack/cinder-scheduler-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.286824 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") " pod="openstack/cinder-scheduler-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.323749 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-gtnnh\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.323872 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-gtnnh\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 
14:21:44.324001 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-config\") pod \"dnsmasq-dns-5c9776ccc5-gtnnh\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.324342 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-gtnnh\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.324714 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktnr4\" (UniqueName: \"kubernetes.io/projected/99ea68d3-f555-4779-90d0-d1f136ddadd2-kube-api-access-ktnr4\") pod \"dnsmasq-dns-5c9776ccc5-gtnnh\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.324814 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-gtnnh\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.327656 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-gtnnh\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.328127 4898 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-gtnnh\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.329226 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-gtnnh\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.330090 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-gtnnh\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.330470 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-config\") pod \"dnsmasq-dns-5c9776ccc5-gtnnh\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.363772 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktnr4\" (UniqueName: \"kubernetes.io/projected/99ea68d3-f555-4779-90d0-d1f136ddadd2-kube-api-access-ktnr4\") pod \"dnsmasq-dns-5c9776ccc5-gtnnh\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.415137 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.417016 4898 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.422284 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.445795 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.500068 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.552551 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.552690 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-config-data-custom\") pod \"cinder-api-0\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.552801 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-config-data\") pod \"cinder-api-0\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.552835 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-scripts\") pod 
\"cinder-api-0\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.552883 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwbrx\" (UniqueName: \"kubernetes.io/projected/cb7c4601-9945-444b-8a00-a671ce18bb1e-kube-api-access-zwbrx\") pod \"cinder-api-0\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.552966 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb7c4601-9945-444b-8a00-a671ce18bb1e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.552992 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb7c4601-9945-444b-8a00-a671ce18bb1e-logs\") pod \"cinder-api-0\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.583434 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.656174 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwbrx\" (UniqueName: \"kubernetes.io/projected/cb7c4601-9945-444b-8a00-a671ce18bb1e-kube-api-access-zwbrx\") pod \"cinder-api-0\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.656291 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb7c4601-9945-444b-8a00-a671ce18bb1e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.656325 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb7c4601-9945-444b-8a00-a671ce18bb1e-logs\") pod \"cinder-api-0\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.656416 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.656482 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-config-data-custom\") pod \"cinder-api-0\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.656587 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-config-data\") pod \"cinder-api-0\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.656611 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-scripts\") pod \"cinder-api-0\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.657918 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb7c4601-9945-444b-8a00-a671ce18bb1e-logs\") pod \"cinder-api-0\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.658255 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb7c4601-9945-444b-8a00-a671ce18bb1e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.667295 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.667363 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-scripts\") pod \"cinder-api-0\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.668870 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-config-data\") pod \"cinder-api-0\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.670572 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-config-data-custom\") pod \"cinder-api-0\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.711708 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwbrx\" (UniqueName: \"kubernetes.io/projected/cb7c4601-9945-444b-8a00-a671ce18bb1e-kube-api-access-zwbrx\") pod \"cinder-api-0\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " pod="openstack/cinder-api-0" Mar 13 14:21:44 crc kubenswrapper[4898]: I0313 14:21:44.799447 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.040217 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-d74d977fd-v5m5s" podUID="7dcea9de-db8a-42dd-958c-59df43a49ff3" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.211:9311/healthcheck\": read tcp 10.217.0.2:52740->10.217.0.211:9311: read: connection reset by peer" Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.040418 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-d74d977fd-v5m5s" podUID="7dcea9de-db8a-42dd-958c-59df43a49ff3" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.211:9311/healthcheck\": read tcp 10.217.0.2:52756->10.217.0.211:9311: read: connection reset by peer" Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.251427 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-gtnnh"] Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.439169 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 14:21:45 crc kubenswrapper[4898]: W0313 14:21:45.546749 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0be2003_4a0d_4740_9b84_ab16bb27d5bb.slice/crio-519cfe6e0c33250225df4155f05016fa2ba7c8a1bb229003e79f90a22978c180 WatchSource:0}: Error finding container 519cfe6e0c33250225df4155f05016fa2ba7c8a1bb229003e79f90a22978c180: Status 404 returned error can't find the container with id 519cfe6e0c33250225df4155f05016fa2ba7c8a1bb229003e79f90a22978c180 Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.868158 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.891085 4898 generic.go:334] "Generic (PLEG): container finished" podID="7dcea9de-db8a-42dd-958c-59df43a49ff3" containerID="e2a10450307b6355906b72fd4b0a882c5720f8e92bcad91beb1384ffe656972d" exitCode=0 Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.891186 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d74d977fd-v5m5s" Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.891212 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d74d977fd-v5m5s" event={"ID":"7dcea9de-db8a-42dd-958c-59df43a49ff3","Type":"ContainerDied","Data":"e2a10450307b6355906b72fd4b0a882c5720f8e92bcad91beb1384ffe656972d"} Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.892208 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d74d977fd-v5m5s" event={"ID":"7dcea9de-db8a-42dd-958c-59df43a49ff3","Type":"ContainerDied","Data":"704f9e3ccfefb3b8a00bd2333c1274e405dc0d43a2c31a48365c27cab56dfc29"} Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.892234 4898 scope.go:117] "RemoveContainer" containerID="e2a10450307b6355906b72fd4b0a882c5720f8e92bcad91beb1384ffe656972d" Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.905865 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.907541 4898 generic.go:334] "Generic (PLEG): container finished" podID="99ea68d3-f555-4779-90d0-d1f136ddadd2" containerID="6c42ee9c0a17acfdf5d9f3b6de5ee36bb640854b185b6dd5e7f1e7441cc93008" exitCode=0 Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.907629 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" 
event={"ID":"99ea68d3-f555-4779-90d0-d1f136ddadd2","Type":"ContainerDied","Data":"6c42ee9c0a17acfdf5d9f3b6de5ee36bb640854b185b6dd5e7f1e7441cc93008"} Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.907655 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" event={"ID":"99ea68d3-f555-4779-90d0-d1f136ddadd2","Type":"ContainerStarted","Data":"2f6cf6b2237006a47af92c80edb293fb5e39aa92cbe683d435727b4ad4952d2e"} Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.911640 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d0be2003-4a0d-4740-9b84-ab16bb27d5bb","Type":"ContainerStarted","Data":"519cfe6e0c33250225df4155f05016fa2ba7c8a1bb229003e79f90a22978c180"} Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.928856 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7dcea9de-db8a-42dd-958c-59df43a49ff3-logs\") pod \"7dcea9de-db8a-42dd-958c-59df43a49ff3\" (UID: \"7dcea9de-db8a-42dd-958c-59df43a49ff3\") " Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.929213 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7dcea9de-db8a-42dd-958c-59df43a49ff3-config-data-custom\") pod \"7dcea9de-db8a-42dd-958c-59df43a49ff3\" (UID: \"7dcea9de-db8a-42dd-958c-59df43a49ff3\") " Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.929268 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dcea9de-db8a-42dd-958c-59df43a49ff3-config-data\") pod \"7dcea9de-db8a-42dd-958c-59df43a49ff3\" (UID: \"7dcea9de-db8a-42dd-958c-59df43a49ff3\") " Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.929432 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt6km\" (UniqueName: 
\"kubernetes.io/projected/7dcea9de-db8a-42dd-958c-59df43a49ff3-kube-api-access-vt6km\") pod \"7dcea9de-db8a-42dd-958c-59df43a49ff3\" (UID: \"7dcea9de-db8a-42dd-958c-59df43a49ff3\") " Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.929559 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dcea9de-db8a-42dd-958c-59df43a49ff3-combined-ca-bundle\") pod \"7dcea9de-db8a-42dd-958c-59df43a49ff3\" (UID: \"7dcea9de-db8a-42dd-958c-59df43a49ff3\") " Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.929804 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dcea9de-db8a-42dd-958c-59df43a49ff3-logs" (OuterVolumeSpecName: "logs") pod "7dcea9de-db8a-42dd-958c-59df43a49ff3" (UID: "7dcea9de-db8a-42dd-958c-59df43a49ff3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.931527 4898 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7dcea9de-db8a-42dd-958c-59df43a49ff3-logs\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.950714 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dcea9de-db8a-42dd-958c-59df43a49ff3-kube-api-access-vt6km" (OuterVolumeSpecName: "kube-api-access-vt6km") pod "7dcea9de-db8a-42dd-958c-59df43a49ff3" (UID: "7dcea9de-db8a-42dd-958c-59df43a49ff3"). InnerVolumeSpecName "kube-api-access-vt6km". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.966013 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dcea9de-db8a-42dd-958c-59df43a49ff3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7dcea9de-db8a-42dd-958c-59df43a49ff3" (UID: "7dcea9de-db8a-42dd-958c-59df43a49ff3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:45 crc kubenswrapper[4898]: I0313 14:21:45.969776 4898 scope.go:117] "RemoveContainer" containerID="8d6bd5023b5a1087811735b70ac1c3323bdc9e802f224e7ede20161093a84221" Mar 13 14:21:46 crc kubenswrapper[4898]: I0313 14:21:46.033669 4898 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7dcea9de-db8a-42dd-958c-59df43a49ff3-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:46 crc kubenswrapper[4898]: I0313 14:21:46.033699 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt6km\" (UniqueName: \"kubernetes.io/projected/7dcea9de-db8a-42dd-958c-59df43a49ff3-kube-api-access-vt6km\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:46 crc kubenswrapper[4898]: I0313 14:21:46.039770 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dcea9de-db8a-42dd-958c-59df43a49ff3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7dcea9de-db8a-42dd-958c-59df43a49ff3" (UID: "7dcea9de-db8a-42dd-958c-59df43a49ff3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:46 crc kubenswrapper[4898]: I0313 14:21:46.049161 4898 scope.go:117] "RemoveContainer" containerID="e2a10450307b6355906b72fd4b0a882c5720f8e92bcad91beb1384ffe656972d" Mar 13 14:21:46 crc kubenswrapper[4898]: E0313 14:21:46.055173 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2a10450307b6355906b72fd4b0a882c5720f8e92bcad91beb1384ffe656972d\": container with ID starting with e2a10450307b6355906b72fd4b0a882c5720f8e92bcad91beb1384ffe656972d not found: ID does not exist" containerID="e2a10450307b6355906b72fd4b0a882c5720f8e92bcad91beb1384ffe656972d" Mar 13 14:21:46 crc kubenswrapper[4898]: I0313 14:21:46.055240 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2a10450307b6355906b72fd4b0a882c5720f8e92bcad91beb1384ffe656972d"} err="failed to get container status \"e2a10450307b6355906b72fd4b0a882c5720f8e92bcad91beb1384ffe656972d\": rpc error: code = NotFound desc = could not find container \"e2a10450307b6355906b72fd4b0a882c5720f8e92bcad91beb1384ffe656972d\": container with ID starting with e2a10450307b6355906b72fd4b0a882c5720f8e92bcad91beb1384ffe656972d not found: ID does not exist" Mar 13 14:21:46 crc kubenswrapper[4898]: I0313 14:21:46.055270 4898 scope.go:117] "RemoveContainer" containerID="8d6bd5023b5a1087811735b70ac1c3323bdc9e802f224e7ede20161093a84221" Mar 13 14:21:46 crc kubenswrapper[4898]: E0313 14:21:46.060184 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d6bd5023b5a1087811735b70ac1c3323bdc9e802f224e7ede20161093a84221\": container with ID starting with 8d6bd5023b5a1087811735b70ac1c3323bdc9e802f224e7ede20161093a84221 not found: ID does not exist" containerID="8d6bd5023b5a1087811735b70ac1c3323bdc9e802f224e7ede20161093a84221" Mar 13 14:21:46 crc kubenswrapper[4898]: I0313 14:21:46.060234 
4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d6bd5023b5a1087811735b70ac1c3323bdc9e802f224e7ede20161093a84221"} err="failed to get container status \"8d6bd5023b5a1087811735b70ac1c3323bdc9e802f224e7ede20161093a84221\": rpc error: code = NotFound desc = could not find container \"8d6bd5023b5a1087811735b70ac1c3323bdc9e802f224e7ede20161093a84221\": container with ID starting with 8d6bd5023b5a1087811735b70ac1c3323bdc9e802f224e7ede20161093a84221 not found: ID does not exist" Mar 13 14:21:46 crc kubenswrapper[4898]: I0313 14:21:46.070299 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dcea9de-db8a-42dd-958c-59df43a49ff3-config-data" (OuterVolumeSpecName: "config-data") pod "7dcea9de-db8a-42dd-958c-59df43a49ff3" (UID: "7dcea9de-db8a-42dd-958c-59df43a49ff3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:46 crc kubenswrapper[4898]: I0313 14:21:46.143374 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dcea9de-db8a-42dd-958c-59df43a49ff3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:46 crc kubenswrapper[4898]: I0313 14:21:46.143433 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dcea9de-db8a-42dd-958c-59df43a49ff3-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:46 crc kubenswrapper[4898]: I0313 14:21:46.312576 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-d74d977fd-v5m5s"] Mar 13 14:21:46 crc kubenswrapper[4898]: I0313 14:21:46.328182 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-d74d977fd-v5m5s"] Mar 13 14:21:46 crc kubenswrapper[4898]: I0313 14:21:46.968199 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 13 14:21:46 crc 
kubenswrapper[4898]: I0313 14:21:46.968774 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" event={"ID":"99ea68d3-f555-4779-90d0-d1f136ddadd2","Type":"ContainerStarted","Data":"4a5c75faeafd5fd73d57b281c113bb58d89f329bfae70b866f45093f1de113f3"} Mar 13 14:21:46 crc kubenswrapper[4898]: I0313 14:21:46.969776 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:46 crc kubenswrapper[4898]: I0313 14:21:46.979079 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cb7c4601-9945-444b-8a00-a671ce18bb1e","Type":"ContainerStarted","Data":"b5239aec37b0d4b7e60e804b0a75fb330f40a9a58da7e482e13786d0363b1b93"} Mar 13 14:21:47 crc kubenswrapper[4898]: I0313 14:21:47.756102 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dcea9de-db8a-42dd-958c-59df43a49ff3" path="/var/lib/kubelet/pods/7dcea9de-db8a-42dd-958c-59df43a49ff3/volumes" Mar 13 14:21:47 crc kubenswrapper[4898]: I0313 14:21:47.831891 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5bf5d8b7d4-4gwxr" Mar 13 14:21:47 crc kubenswrapper[4898]: I0313 14:21:47.858757 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" podStartSLOduration=4.858738161 podStartE2EDuration="4.858738161s" podCreationTimestamp="2026-03-13 14:21:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:21:47.008384696 +0000 UTC m=+1542.009972935" watchObservedRunningTime="2026-03-13 14:21:47.858738161 +0000 UTC m=+1542.860326400" Mar 13 14:21:47 crc kubenswrapper[4898]: I0313 14:21:47.960826 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5bf5d8b7d4-4gwxr" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 
14:21:48.055256 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cb7c4601-9945-444b-8a00-a671ce18bb1e","Type":"ContainerStarted","Data":"d8870881d2d90e176629bb217b8c041bc075048a36f864f12b6235e672810e1c"} Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.256653 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-647f998784-xvcjw"] Mar 13 14:21:48 crc kubenswrapper[4898]: E0313 14:21:48.257167 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dcea9de-db8a-42dd-958c-59df43a49ff3" containerName="barbican-api" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.257181 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dcea9de-db8a-42dd-958c-59df43a49ff3" containerName="barbican-api" Mar 13 14:21:48 crc kubenswrapper[4898]: E0313 14:21:48.257206 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dcea9de-db8a-42dd-958c-59df43a49ff3" containerName="barbican-api-log" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.257212 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dcea9de-db8a-42dd-958c-59df43a49ff3" containerName="barbican-api-log" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.257415 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dcea9de-db8a-42dd-958c-59df43a49ff3" containerName="barbican-api" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.257441 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dcea9de-db8a-42dd-958c-59df43a49ff3" containerName="barbican-api-log" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.259581 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.281562 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-647f998784-xvcjw"] Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.453469 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa7825b5-b19b-44bb-8d23-bb121e669780-scripts\") pod \"placement-647f998784-xvcjw\" (UID: \"fa7825b5-b19b-44bb-8d23-bb121e669780\") " pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.453766 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa7825b5-b19b-44bb-8d23-bb121e669780-logs\") pod \"placement-647f998784-xvcjw\" (UID: \"fa7825b5-b19b-44bb-8d23-bb121e669780\") " pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.453814 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms92j\" (UniqueName: \"kubernetes.io/projected/fa7825b5-b19b-44bb-8d23-bb121e669780-kube-api-access-ms92j\") pod \"placement-647f998784-xvcjw\" (UID: \"fa7825b5-b19b-44bb-8d23-bb121e669780\") " pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.453966 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa7825b5-b19b-44bb-8d23-bb121e669780-internal-tls-certs\") pod \"placement-647f998784-xvcjw\" (UID: \"fa7825b5-b19b-44bb-8d23-bb121e669780\") " pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.454246 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/fa7825b5-b19b-44bb-8d23-bb121e669780-config-data\") pod \"placement-647f998784-xvcjw\" (UID: \"fa7825b5-b19b-44bb-8d23-bb121e669780\") " pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.454333 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa7825b5-b19b-44bb-8d23-bb121e669780-combined-ca-bundle\") pod \"placement-647f998784-xvcjw\" (UID: \"fa7825b5-b19b-44bb-8d23-bb121e669780\") " pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.454411 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa7825b5-b19b-44bb-8d23-bb121e669780-public-tls-certs\") pod \"placement-647f998784-xvcjw\" (UID: \"fa7825b5-b19b-44bb-8d23-bb121e669780\") " pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.557268 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa7825b5-b19b-44bb-8d23-bb121e669780-public-tls-certs\") pod \"placement-647f998784-xvcjw\" (UID: \"fa7825b5-b19b-44bb-8d23-bb121e669780\") " pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.558304 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa7825b5-b19b-44bb-8d23-bb121e669780-scripts\") pod \"placement-647f998784-xvcjw\" (UID: \"fa7825b5-b19b-44bb-8d23-bb121e669780\") " pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.558375 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/fa7825b5-b19b-44bb-8d23-bb121e669780-logs\") pod \"placement-647f998784-xvcjw\" (UID: \"fa7825b5-b19b-44bb-8d23-bb121e669780\") " pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.558820 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa7825b5-b19b-44bb-8d23-bb121e669780-logs\") pod \"placement-647f998784-xvcjw\" (UID: \"fa7825b5-b19b-44bb-8d23-bb121e669780\") " pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.558942 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms92j\" (UniqueName: \"kubernetes.io/projected/fa7825b5-b19b-44bb-8d23-bb121e669780-kube-api-access-ms92j\") pod \"placement-647f998784-xvcjw\" (UID: \"fa7825b5-b19b-44bb-8d23-bb121e669780\") " pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.559119 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa7825b5-b19b-44bb-8d23-bb121e669780-internal-tls-certs\") pod \"placement-647f998784-xvcjw\" (UID: \"fa7825b5-b19b-44bb-8d23-bb121e669780\") " pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.559167 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa7825b5-b19b-44bb-8d23-bb121e669780-config-data\") pod \"placement-647f998784-xvcjw\" (UID: \"fa7825b5-b19b-44bb-8d23-bb121e669780\") " pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.559182 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa7825b5-b19b-44bb-8d23-bb121e669780-combined-ca-bundle\") pod 
\"placement-647f998784-xvcjw\" (UID: \"fa7825b5-b19b-44bb-8d23-bb121e669780\") " pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.564418 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa7825b5-b19b-44bb-8d23-bb121e669780-scripts\") pod \"placement-647f998784-xvcjw\" (UID: \"fa7825b5-b19b-44bb-8d23-bb121e669780\") " pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.564799 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa7825b5-b19b-44bb-8d23-bb121e669780-public-tls-certs\") pod \"placement-647f998784-xvcjw\" (UID: \"fa7825b5-b19b-44bb-8d23-bb121e669780\") " pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.567115 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa7825b5-b19b-44bb-8d23-bb121e669780-config-data\") pod \"placement-647f998784-xvcjw\" (UID: \"fa7825b5-b19b-44bb-8d23-bb121e669780\") " pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.571593 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa7825b5-b19b-44bb-8d23-bb121e669780-internal-tls-certs\") pod \"placement-647f998784-xvcjw\" (UID: \"fa7825b5-b19b-44bb-8d23-bb121e669780\") " pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.571672 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa7825b5-b19b-44bb-8d23-bb121e669780-combined-ca-bundle\") pod \"placement-647f998784-xvcjw\" (UID: \"fa7825b5-b19b-44bb-8d23-bb121e669780\") " pod="openstack/placement-647f998784-xvcjw" Mar 
13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.590001 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms92j\" (UniqueName: \"kubernetes.io/projected/fa7825b5-b19b-44bb-8d23-bb121e669780-kube-api-access-ms92j\") pod \"placement-647f998784-xvcjw\" (UID: \"fa7825b5-b19b-44bb-8d23-bb121e669780\") " pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:48 crc kubenswrapper[4898]: I0313 14:21:48.635148 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:49 crc kubenswrapper[4898]: I0313 14:21:49.085435 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cb7c4601-9945-444b-8a00-a671ce18bb1e","Type":"ContainerStarted","Data":"6da96ed31599ca832cee856f0c94ff97183486a2078689f1689d4f88f2dd2323"} Mar 13 14:21:49 crc kubenswrapper[4898]: I0313 14:21:49.085712 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="cb7c4601-9945-444b-8a00-a671ce18bb1e" containerName="cinder-api-log" containerID="cri-o://d8870881d2d90e176629bb217b8c041bc075048a36f864f12b6235e672810e1c" gracePeriod=30 Mar 13 14:21:49 crc kubenswrapper[4898]: I0313 14:21:49.086195 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="cb7c4601-9945-444b-8a00-a671ce18bb1e" containerName="cinder-api" containerID="cri-o://6da96ed31599ca832cee856f0c94ff97183486a2078689f1689d4f88f2dd2323" gracePeriod=30 Mar 13 14:21:49 crc kubenswrapper[4898]: I0313 14:21:49.086257 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 13 14:21:49 crc kubenswrapper[4898]: I0313 14:21:49.098790 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"d0be2003-4a0d-4740-9b84-ab16bb27d5bb","Type":"ContainerStarted","Data":"bafd55f18270e2838b59b970f12a6aafedd22b09a08b2d2305599aac961e6911"} Mar 13 14:21:49 crc kubenswrapper[4898]: I0313 14:21:49.098821 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d0be2003-4a0d-4740-9b84-ab16bb27d5bb","Type":"ContainerStarted","Data":"7f741df9b30455c96ea278c501c4d63ab1af4bf960776e0726cfee685819c6bd"} Mar 13 14:21:49 crc kubenswrapper[4898]: I0313 14:21:49.119697 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.119668624 podStartE2EDuration="5.119668624s" podCreationTimestamp="2026-03-13 14:21:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:21:49.113110913 +0000 UTC m=+1544.114699172" watchObservedRunningTime="2026-03-13 14:21:49.119668624 +0000 UTC m=+1544.121256863" Mar 13 14:21:49 crc kubenswrapper[4898]: I0313 14:21:49.139585 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:21:49 crc kubenswrapper[4898]: I0313 14:21:49.139651 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:21:49 crc kubenswrapper[4898]: I0313 14:21:49.163085 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.080350573 podStartE2EDuration="6.16305711s" 
podCreationTimestamp="2026-03-13 14:21:43 +0000 UTC" firstStartedPulling="2026-03-13 14:21:45.56477082 +0000 UTC m=+1540.566359059" lastFinishedPulling="2026-03-13 14:21:46.647477357 +0000 UTC m=+1541.649065596" observedRunningTime="2026-03-13 14:21:49.140479204 +0000 UTC m=+1544.142067463" watchObservedRunningTime="2026-03-13 14:21:49.16305711 +0000 UTC m=+1544.164645359" Mar 13 14:21:49 crc kubenswrapper[4898]: I0313 14:21:49.319729 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-647f998784-xvcjw"] Mar 13 14:21:49 crc kubenswrapper[4898]: I0313 14:21:49.584557 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.002370 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.110048 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-combined-ca-bundle\") pod \"cb7c4601-9945-444b-8a00-a671ce18bb1e\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.110092 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-config-data-custom\") pod \"cb7c4601-9945-444b-8a00-a671ce18bb1e\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.110219 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-scripts\") pod \"cb7c4601-9945-444b-8a00-a671ce18bb1e\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 
14:21:50.110279 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-config-data\") pod \"cb7c4601-9945-444b-8a00-a671ce18bb1e\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.110514 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb7c4601-9945-444b-8a00-a671ce18bb1e-logs\") pod \"cb7c4601-9945-444b-8a00-a671ce18bb1e\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.110539 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb7c4601-9945-444b-8a00-a671ce18bb1e-etc-machine-id\") pod \"cb7c4601-9945-444b-8a00-a671ce18bb1e\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.110556 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwbrx\" (UniqueName: \"kubernetes.io/projected/cb7c4601-9945-444b-8a00-a671ce18bb1e-kube-api-access-zwbrx\") pod \"cb7c4601-9945-444b-8a00-a671ce18bb1e\" (UID: \"cb7c4601-9945-444b-8a00-a671ce18bb1e\") " Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.113057 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb7c4601-9945-444b-8a00-a671ce18bb1e-logs" (OuterVolumeSpecName: "logs") pod "cb7c4601-9945-444b-8a00-a671ce18bb1e" (UID: "cb7c4601-9945-444b-8a00-a671ce18bb1e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.113616 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb7c4601-9945-444b-8a00-a671ce18bb1e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "cb7c4601-9945-444b-8a00-a671ce18bb1e" (UID: "cb7c4601-9945-444b-8a00-a671ce18bb1e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.118103 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb7c4601-9945-444b-8a00-a671ce18bb1e-kube-api-access-zwbrx" (OuterVolumeSpecName: "kube-api-access-zwbrx") pod "cb7c4601-9945-444b-8a00-a671ce18bb1e" (UID: "cb7c4601-9945-444b-8a00-a671ce18bb1e"). InnerVolumeSpecName "kube-api-access-zwbrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.118585 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "cb7c4601-9945-444b-8a00-a671ce18bb1e" (UID: "cb7c4601-9945-444b-8a00-a671ce18bb1e"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.118870 4898 generic.go:334] "Generic (PLEG): container finished" podID="cb7c4601-9945-444b-8a00-a671ce18bb1e" containerID="6da96ed31599ca832cee856f0c94ff97183486a2078689f1689d4f88f2dd2323" exitCode=0 Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.119000 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cb7c4601-9945-444b-8a00-a671ce18bb1e","Type":"ContainerDied","Data":"6da96ed31599ca832cee856f0c94ff97183486a2078689f1689d4f88f2dd2323"} Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.119061 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.119082 4898 scope.go:117] "RemoveContainer" containerID="6da96ed31599ca832cee856f0c94ff97183486a2078689f1689d4f88f2dd2323" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.119021 4898 generic.go:334] "Generic (PLEG): container finished" podID="cb7c4601-9945-444b-8a00-a671ce18bb1e" containerID="d8870881d2d90e176629bb217b8c041bc075048a36f864f12b6235e672810e1c" exitCode=143 Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.119066 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cb7c4601-9945-444b-8a00-a671ce18bb1e","Type":"ContainerDied","Data":"d8870881d2d90e176629bb217b8c041bc075048a36f864f12b6235e672810e1c"} Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.119738 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cb7c4601-9945-444b-8a00-a671ce18bb1e","Type":"ContainerDied","Data":"b5239aec37b0d4b7e60e804b0a75fb330f40a9a58da7e482e13786d0363b1b93"} Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.124154 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-647f998784-xvcjw" 
event={"ID":"fa7825b5-b19b-44bb-8d23-bb121e669780","Type":"ContainerStarted","Data":"e7d8a6f5bcff44a554794a7336db9d2235c8a1237b68565c7974ac14bf58a726"} Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.124390 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-647f998784-xvcjw" event={"ID":"fa7825b5-b19b-44bb-8d23-bb121e669780","Type":"ContainerStarted","Data":"7c2e07ab88f2b898d60cd6d882f9de9a202887ed9045a7d1964343ba89d7b9b6"} Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.127224 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-scripts" (OuterVolumeSpecName: "scripts") pod "cb7c4601-9945-444b-8a00-a671ce18bb1e" (UID: "cb7c4601-9945-444b-8a00-a671ce18bb1e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.144858 4898 scope.go:117] "RemoveContainer" containerID="d8870881d2d90e176629bb217b8c041bc075048a36f864f12b6235e672810e1c" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.164614 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb7c4601-9945-444b-8a00-a671ce18bb1e" (UID: "cb7c4601-9945-444b-8a00-a671ce18bb1e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.165708 4898 scope.go:117] "RemoveContainer" containerID="6da96ed31599ca832cee856f0c94ff97183486a2078689f1689d4f88f2dd2323" Mar 13 14:21:50 crc kubenswrapper[4898]: E0313 14:21:50.166227 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6da96ed31599ca832cee856f0c94ff97183486a2078689f1689d4f88f2dd2323\": container with ID starting with 6da96ed31599ca832cee856f0c94ff97183486a2078689f1689d4f88f2dd2323 not found: ID does not exist" containerID="6da96ed31599ca832cee856f0c94ff97183486a2078689f1689d4f88f2dd2323" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.166285 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6da96ed31599ca832cee856f0c94ff97183486a2078689f1689d4f88f2dd2323"} err="failed to get container status \"6da96ed31599ca832cee856f0c94ff97183486a2078689f1689d4f88f2dd2323\": rpc error: code = NotFound desc = could not find container \"6da96ed31599ca832cee856f0c94ff97183486a2078689f1689d4f88f2dd2323\": container with ID starting with 6da96ed31599ca832cee856f0c94ff97183486a2078689f1689d4f88f2dd2323 not found: ID does not exist" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.166321 4898 scope.go:117] "RemoveContainer" containerID="d8870881d2d90e176629bb217b8c041bc075048a36f864f12b6235e672810e1c" Mar 13 14:21:50 crc kubenswrapper[4898]: E0313 14:21:50.166650 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8870881d2d90e176629bb217b8c041bc075048a36f864f12b6235e672810e1c\": container with ID starting with d8870881d2d90e176629bb217b8c041bc075048a36f864f12b6235e672810e1c not found: ID does not exist" containerID="d8870881d2d90e176629bb217b8c041bc075048a36f864f12b6235e672810e1c" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.166675 
4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8870881d2d90e176629bb217b8c041bc075048a36f864f12b6235e672810e1c"} err="failed to get container status \"d8870881d2d90e176629bb217b8c041bc075048a36f864f12b6235e672810e1c\": rpc error: code = NotFound desc = could not find container \"d8870881d2d90e176629bb217b8c041bc075048a36f864f12b6235e672810e1c\": container with ID starting with d8870881d2d90e176629bb217b8c041bc075048a36f864f12b6235e672810e1c not found: ID does not exist" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.166690 4898 scope.go:117] "RemoveContainer" containerID="6da96ed31599ca832cee856f0c94ff97183486a2078689f1689d4f88f2dd2323" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.166913 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6da96ed31599ca832cee856f0c94ff97183486a2078689f1689d4f88f2dd2323"} err="failed to get container status \"6da96ed31599ca832cee856f0c94ff97183486a2078689f1689d4f88f2dd2323\": rpc error: code = NotFound desc = could not find container \"6da96ed31599ca832cee856f0c94ff97183486a2078689f1689d4f88f2dd2323\": container with ID starting with 6da96ed31599ca832cee856f0c94ff97183486a2078689f1689d4f88f2dd2323 not found: ID does not exist" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.166941 4898 scope.go:117] "RemoveContainer" containerID="d8870881d2d90e176629bb217b8c041bc075048a36f864f12b6235e672810e1c" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.167183 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8870881d2d90e176629bb217b8c041bc075048a36f864f12b6235e672810e1c"} err="failed to get container status \"d8870881d2d90e176629bb217b8c041bc075048a36f864f12b6235e672810e1c\": rpc error: code = NotFound desc = could not find container \"d8870881d2d90e176629bb217b8c041bc075048a36f864f12b6235e672810e1c\": container with ID starting with 
d8870881d2d90e176629bb217b8c041bc075048a36f864f12b6235e672810e1c not found: ID does not exist" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.203402 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-config-data" (OuterVolumeSpecName: "config-data") pod "cb7c4601-9945-444b-8a00-a671ce18bb1e" (UID: "cb7c4601-9945-444b-8a00-a671ce18bb1e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.214369 4898 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb7c4601-9945-444b-8a00-a671ce18bb1e-logs\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.214791 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwbrx\" (UniqueName: \"kubernetes.io/projected/cb7c4601-9945-444b-8a00-a671ce18bb1e-kube-api-access-zwbrx\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.214807 4898 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb7c4601-9945-444b-8a00-a671ce18bb1e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.214823 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.214836 4898 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.214848 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.214863 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb7c4601-9945-444b-8a00-a671ce18bb1e-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.467089 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.489609 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.506450 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 13 14:21:50 crc kubenswrapper[4898]: E0313 14:21:50.507027 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb7c4601-9945-444b-8a00-a671ce18bb1e" containerName="cinder-api-log" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.507047 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb7c4601-9945-444b-8a00-a671ce18bb1e" containerName="cinder-api-log" Mar 13 14:21:50 crc kubenswrapper[4898]: E0313 14:21:50.507069 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb7c4601-9945-444b-8a00-a671ce18bb1e" containerName="cinder-api" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.507076 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb7c4601-9945-444b-8a00-a671ce18bb1e" containerName="cinder-api" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.507324 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb7c4601-9945-444b-8a00-a671ce18bb1e" containerName="cinder-api-log" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.507360 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb7c4601-9945-444b-8a00-a671ce18bb1e" containerName="cinder-api" 
Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.508751 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.517397 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.532166 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.532367 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.532553 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.580271 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-87574c74-kqmjb" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.623449 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bda33d23-490a-4099-954b-c613ab5d5c73-config-data-custom\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.623515 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcrhs\" (UniqueName: \"kubernetes.io/projected/bda33d23-490a-4099-954b-c613ab5d5c73-kube-api-access-hcrhs\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.623563 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bda33d23-490a-4099-954b-c613ab5d5c73-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.623653 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bda33d23-490a-4099-954b-c613ab5d5c73-config-data\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.623676 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bda33d23-490a-4099-954b-c613ab5d5c73-logs\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.623726 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bda33d23-490a-4099-954b-c613ab5d5c73-scripts\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.623753 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bda33d23-490a-4099-954b-c613ab5d5c73-public-tls-certs\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.623802 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bda33d23-490a-4099-954b-c613ab5d5c73-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " 
pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.623879 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bda33d23-490a-4099-954b-c613ab5d5c73-etc-machine-id\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.632088 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-z7ldc" podUID="b38f3681-6f2f-437f-9694-810d43921aa2" containerName="registry-server" probeResult="failure" output=< Mar 13 14:21:50 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 14:21:50 crc kubenswrapper[4898]: > Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.725245 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bda33d23-490a-4099-954b-c613ab5d5c73-config-data\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.725303 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bda33d23-490a-4099-954b-c613ab5d5c73-logs\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.725355 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bda33d23-490a-4099-954b-c613ab5d5c73-scripts\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.725399 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bda33d23-490a-4099-954b-c613ab5d5c73-public-tls-certs\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.725453 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bda33d23-490a-4099-954b-c613ab5d5c73-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.725583 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bda33d23-490a-4099-954b-c613ab5d5c73-etc-machine-id\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.725646 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bda33d23-490a-4099-954b-c613ab5d5c73-config-data-custom\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.725673 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcrhs\" (UniqueName: \"kubernetes.io/projected/bda33d23-490a-4099-954b-c613ab5d5c73-kube-api-access-hcrhs\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.725710 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bda33d23-490a-4099-954b-c613ab5d5c73-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " 
pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.734509 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bda33d23-490a-4099-954b-c613ab5d5c73-etc-machine-id\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.735080 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bda33d23-490a-4099-954b-c613ab5d5c73-logs\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.751481 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bda33d23-490a-4099-954b-c613ab5d5c73-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.753830 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bda33d23-490a-4099-954b-c613ab5d5c73-config-data-custom\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.767505 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcrhs\" (UniqueName: \"kubernetes.io/projected/bda33d23-490a-4099-954b-c613ab5d5c73-kube-api-access-hcrhs\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.770001 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bda33d23-490a-4099-954b-c613ab5d5c73-config-data\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.774845 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bda33d23-490a-4099-954b-c613ab5d5c73-public-tls-certs\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.776478 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bda33d23-490a-4099-954b-c613ab5d5c73-scripts\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.782287 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bda33d23-490a-4099-954b-c613ab5d5c73-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"bda33d23-490a-4099-954b-c613ab5d5c73\") " pod="openstack/cinder-api-0" Mar 13 14:21:50 crc kubenswrapper[4898]: I0313 14:21:50.882096 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.172702 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-647f998784-xvcjw" event={"ID":"fa7825b5-b19b-44bb-8d23-bb121e669780","Type":"ContainerStarted","Data":"d5182af4e2015471f641e75a51f4b3c088c6d29f03d19e8012d182b598de84fd"} Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.231631 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-647f998784-xvcjw" podStartSLOduration=3.231612419 podStartE2EDuration="3.231612419s" podCreationTimestamp="2026-03-13 14:21:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:21:51.22277733 +0000 UTC m=+1546.224365579" watchObservedRunningTime="2026-03-13 14:21:51.231612419 +0000 UTC m=+1546.233200658" Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.648830 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.762377 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb7c4601-9945-444b-8a00-a671ce18bb1e" path="/var/lib/kubelet/pods/cb7c4601-9945-444b-8a00-a671ce18bb1e/volumes" Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.764090 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.768884 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.774369 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.774608 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.774855 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-zrf8c" Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.790806 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.854384 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2b5z\" (UniqueName: \"kubernetes.io/projected/124bd4ee-d9f0-408f-a46e-4d143e8ab02a-kube-api-access-m2b5z\") pod \"openstackclient\" (UID: \"124bd4ee-d9f0-408f-a46e-4d143e8ab02a\") " pod="openstack/openstackclient" Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.854751 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/124bd4ee-d9f0-408f-a46e-4d143e8ab02a-openstack-config\") pod \"openstackclient\" (UID: \"124bd4ee-d9f0-408f-a46e-4d143e8ab02a\") " pod="openstack/openstackclient" Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.855017 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/124bd4ee-d9f0-408f-a46e-4d143e8ab02a-openstack-config-secret\") pod \"openstackclient\" (UID: \"124bd4ee-d9f0-408f-a46e-4d143e8ab02a\") " pod="openstack/openstackclient" Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.855322 4898 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/124bd4ee-d9f0-408f-a46e-4d143e8ab02a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"124bd4ee-d9f0-408f-a46e-4d143e8ab02a\") " pod="openstack/openstackclient" Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.956939 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/124bd4ee-d9f0-408f-a46e-4d143e8ab02a-openstack-config\") pod \"openstackclient\" (UID: \"124bd4ee-d9f0-408f-a46e-4d143e8ab02a\") " pod="openstack/openstackclient" Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.957015 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/124bd4ee-d9f0-408f-a46e-4d143e8ab02a-openstack-config-secret\") pod \"openstackclient\" (UID: \"124bd4ee-d9f0-408f-a46e-4d143e8ab02a\") " pod="openstack/openstackclient" Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.957123 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/124bd4ee-d9f0-408f-a46e-4d143e8ab02a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"124bd4ee-d9f0-408f-a46e-4d143e8ab02a\") " pod="openstack/openstackclient" Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.957165 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2b5z\" (UniqueName: \"kubernetes.io/projected/124bd4ee-d9f0-408f-a46e-4d143e8ab02a-kube-api-access-m2b5z\") pod \"openstackclient\" (UID: \"124bd4ee-d9f0-408f-a46e-4d143e8ab02a\") " pod="openstack/openstackclient" Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.957926 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/124bd4ee-d9f0-408f-a46e-4d143e8ab02a-openstack-config\") pod \"openstackclient\" (UID: \"124bd4ee-d9f0-408f-a46e-4d143e8ab02a\") " pod="openstack/openstackclient" Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.963067 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/124bd4ee-d9f0-408f-a46e-4d143e8ab02a-openstack-config-secret\") pod \"openstackclient\" (UID: \"124bd4ee-d9f0-408f-a46e-4d143e8ab02a\") " pod="openstack/openstackclient" Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.973453 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/124bd4ee-d9f0-408f-a46e-4d143e8ab02a-combined-ca-bundle\") pod \"openstackclient\" (UID: \"124bd4ee-d9f0-408f-a46e-4d143e8ab02a\") " pod="openstack/openstackclient" Mar 13 14:21:51 crc kubenswrapper[4898]: I0313 14:21:51.974104 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2b5z\" (UniqueName: \"kubernetes.io/projected/124bd4ee-d9f0-408f-a46e-4d143e8ab02a-kube-api-access-m2b5z\") pod \"openstackclient\" (UID: \"124bd4ee-d9f0-408f-a46e-4d143e8ab02a\") " pod="openstack/openstackclient" Mar 13 14:21:52 crc kubenswrapper[4898]: I0313 14:21:52.105268 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 13 14:21:52 crc kubenswrapper[4898]: I0313 14:21:52.277334 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bda33d23-490a-4099-954b-c613ab5d5c73","Type":"ContainerStarted","Data":"262dcf4e15f8ce792ae3571e3d3a1ec83ec0fd3784f8077ccfb70ec4bfa3c297"} Mar 13 14:21:52 crc kubenswrapper[4898]: I0313 14:21:52.278375 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:52 crc kubenswrapper[4898]: I0313 14:21:52.278461 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:52 crc kubenswrapper[4898]: W0313 14:21:52.632327 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod124bd4ee_d9f0_408f_a46e_4d143e8ab02a.slice/crio-fdc74456375dc14a674a713f421a590760eba34f5944defa77d2619e4b04d312 WatchSource:0}: Error finding container fdc74456375dc14a674a713f421a590760eba34f5944defa77d2619e4b04d312: Status 404 returned error can't find the container with id fdc74456375dc14a674a713f421a590760eba34f5944defa77d2619e4b04d312 Mar 13 14:21:52 crc kubenswrapper[4898]: I0313 14:21:52.635557 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 13 14:21:53 crc kubenswrapper[4898]: I0313 14:21:53.293507 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bda33d23-490a-4099-954b-c613ab5d5c73","Type":"ContainerStarted","Data":"2c4982813ff025725e309ae1fb5c221c16834addb31306bfb34d1f38cc6f5b58"} Mar 13 14:21:53 crc kubenswrapper[4898]: I0313 14:21:53.300545 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"124bd4ee-d9f0-408f-a46e-4d143e8ab02a","Type":"ContainerStarted","Data":"fdc74456375dc14a674a713f421a590760eba34f5944defa77d2619e4b04d312"} 
Mar 13 14:21:54 crc kubenswrapper[4898]: I0313 14:21:54.308118 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bda33d23-490a-4099-954b-c613ab5d5c73","Type":"ContainerStarted","Data":"9df64f67ff0ac5343dcc1f672969f9ff840a1bd2e72ed128eee29135694d9522"} Mar 13 14:21:54 crc kubenswrapper[4898]: I0313 14:21:54.335350 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.33532748 podStartE2EDuration="4.33532748s" podCreationTimestamp="2026-03-13 14:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:21:54.328604905 +0000 UTC m=+1549.330193154" watchObservedRunningTime="2026-03-13 14:21:54.33532748 +0000 UTC m=+1549.336915709" Mar 13 14:21:54 crc kubenswrapper[4898]: I0313 14:21:54.450804 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:21:54 crc kubenswrapper[4898]: I0313 14:21:54.544201 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-8d6wc"] Mar 13 14:21:54 crc kubenswrapper[4898]: I0313 14:21:54.544435 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" podUID="45af301d-29c9-474d-be0d-4d91f6d0cb18" containerName="dnsmasq-dns" containerID="cri-o://6584acdbfa3b269b10be5eacbee652dc5b87853d5dd4647683e70850466d55d5" gracePeriod=10 Mar 13 14:21:54 crc kubenswrapper[4898]: I0313 14:21:54.872038 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 13 14:21:54 crc kubenswrapper[4898]: I0313 14:21:54.931006 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 14:21:55 crc kubenswrapper[4898]: I0313 14:21:55.322143 4898 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/cinder-scheduler-0" podUID="d0be2003-4a0d-4740-9b84-ab16bb27d5bb" containerName="cinder-scheduler" containerID="cri-o://7f741df9b30455c96ea278c501c4d63ab1af4bf960776e0726cfee685819c6bd" gracePeriod=30 Mar 13 14:21:55 crc kubenswrapper[4898]: I0313 14:21:55.322315 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="d0be2003-4a0d-4740-9b84-ab16bb27d5bb" containerName="probe" containerID="cri-o://bafd55f18270e2838b59b970f12a6aafedd22b09a08b2d2305599aac961e6911" gracePeriod=30 Mar 13 14:21:55 crc kubenswrapper[4898]: I0313 14:21:55.323359 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 13 14:21:55 crc kubenswrapper[4898]: I0313 14:21:55.580667 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.349210 4898 generic.go:334] "Generic (PLEG): container finished" podID="d0be2003-4a0d-4740-9b84-ab16bb27d5bb" containerID="bafd55f18270e2838b59b970f12a6aafedd22b09a08b2d2305599aac961e6911" exitCode=0 Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.350768 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d0be2003-4a0d-4740-9b84-ab16bb27d5bb","Type":"ContainerDied","Data":"bafd55f18270e2838b59b970f12a6aafedd22b09a08b2d2305599aac961e6911"} Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.354842 4898 generic.go:334] "Generic (PLEG): container finished" podID="45af301d-29c9-474d-be0d-4d91f6d0cb18" containerID="6584acdbfa3b269b10be5eacbee652dc5b87853d5dd4647683e70850466d55d5" exitCode=0 Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.356186 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" 
event={"ID":"45af301d-29c9-474d-be0d-4d91f6d0cb18","Type":"ContainerDied","Data":"6584acdbfa3b269b10be5eacbee652dc5b87853d5dd4647683e70850466d55d5"} Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.356211 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" event={"ID":"45af301d-29c9-474d-be0d-4d91f6d0cb18","Type":"ContainerDied","Data":"54b39114a6c35a297bd4da40e16643230d03c158b2784fe677b7a7dcda81e6ec"} Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.356220 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54b39114a6c35a297bd4da40e16643230d03c158b2784fe677b7a7dcda81e6ec" Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.402364 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.588230 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-dns-swift-storage-0\") pod \"45af301d-29c9-474d-be0d-4d91f6d0cb18\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.588714 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-dns-svc\") pod \"45af301d-29c9-474d-be0d-4d91f6d0cb18\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.588760 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-ovsdbserver-nb\") pod \"45af301d-29c9-474d-be0d-4d91f6d0cb18\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.588787 4898 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmplv\" (UniqueName: \"kubernetes.io/projected/45af301d-29c9-474d-be0d-4d91f6d0cb18-kube-api-access-hmplv\") pod \"45af301d-29c9-474d-be0d-4d91f6d0cb18\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.588843 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-ovsdbserver-sb\") pod \"45af301d-29c9-474d-be0d-4d91f6d0cb18\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.588860 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-config\") pod \"45af301d-29c9-474d-be0d-4d91f6d0cb18\" (UID: \"45af301d-29c9-474d-be0d-4d91f6d0cb18\") " Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.596315 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45af301d-29c9-474d-be0d-4d91f6d0cb18-kube-api-access-hmplv" (OuterVolumeSpecName: "kube-api-access-hmplv") pod "45af301d-29c9-474d-be0d-4d91f6d0cb18" (UID: "45af301d-29c9-474d-be0d-4d91f6d0cb18"). InnerVolumeSpecName "kube-api-access-hmplv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.632654 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-647f998784-xvcjw" Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.670980 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "45af301d-29c9-474d-be0d-4d91f6d0cb18" (UID: "45af301d-29c9-474d-be0d-4d91f6d0cb18"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.671528 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "45af301d-29c9-474d-be0d-4d91f6d0cb18" (UID: "45af301d-29c9-474d-be0d-4d91f6d0cb18"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.692347 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.692381 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmplv\" (UniqueName: \"kubernetes.io/projected/45af301d-29c9-474d-be0d-4d91f6d0cb18-kube-api-access-hmplv\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.692392 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.698416 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "45af301d-29c9-474d-be0d-4d91f6d0cb18" (UID: "45af301d-29c9-474d-be0d-4d91f6d0cb18"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.714817 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "45af301d-29c9-474d-be0d-4d91f6d0cb18" (UID: "45af301d-29c9-474d-be0d-4d91f6d0cb18"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.737832 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5bf5d8b7d4-4gwxr"] Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.743349 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5bf5d8b7d4-4gwxr" podUID="604a0205-6c18-4bff-929f-038524d62aeb" containerName="placement-log" containerID="cri-o://7d8e485964b16b478ba92c0abf89ef5c7fe78d1b940606e5a01f0ef264ccdb32" gracePeriod=30 Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.743475 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5bf5d8b7d4-4gwxr" podUID="604a0205-6c18-4bff-929f-038524d62aeb" containerName="placement-api" containerID="cri-o://974d31c9fc27c07800ab40a1409496611af9fce07ca6cd7d36cd6abbd352b819" gracePeriod=30 Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.753713 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-config" (OuterVolumeSpecName: "config") pod "45af301d-29c9-474d-be0d-4d91f6d0cb18" (UID: "45af301d-29c9-474d-be0d-4d91f6d0cb18"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.794669 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.794695 4898 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:56 crc kubenswrapper[4898]: I0313 14:21:56.794704 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45af301d-29c9-474d-be0d-4d91f6d0cb18-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 14:21:57 crc kubenswrapper[4898]: I0313 14:21:57.370434 4898 generic.go:334] "Generic (PLEG): container finished" podID="604a0205-6c18-4bff-929f-038524d62aeb" containerID="7d8e485964b16b478ba92c0abf89ef5c7fe78d1b940606e5a01f0ef264ccdb32" exitCode=143 Mar 13 14:21:57 crc kubenswrapper[4898]: I0313 14:21:57.370525 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-8d6wc" Mar 13 14:21:57 crc kubenswrapper[4898]: I0313 14:21:57.370809 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5bf5d8b7d4-4gwxr" event={"ID":"604a0205-6c18-4bff-929f-038524d62aeb","Type":"ContainerDied","Data":"7d8e485964b16b478ba92c0abf89ef5c7fe78d1b940606e5a01f0ef264ccdb32"} Mar 13 14:21:57 crc kubenswrapper[4898]: I0313 14:21:57.424444 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-8d6wc"] Mar 13 14:21:57 crc kubenswrapper[4898]: I0313 14:21:57.444801 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-8d6wc"] Mar 13 14:21:57 crc kubenswrapper[4898]: I0313 14:21:57.780625 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45af301d-29c9-474d-be0d-4d91f6d0cb18" path="/var/lib/kubelet/pods/45af301d-29c9-474d-be0d-4d91f6d0cb18/volumes" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.255710 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7f9cbdc5df-5tx5z"] Mar 13 14:21:59 crc kubenswrapper[4898]: E0313 14:21:59.256504 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45af301d-29c9-474d-be0d-4d91f6d0cb18" containerName="dnsmasq-dns" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.256519 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="45af301d-29c9-474d-be0d-4d91f6d0cb18" containerName="dnsmasq-dns" Mar 13 14:21:59 crc kubenswrapper[4898]: E0313 14:21:59.256565 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45af301d-29c9-474d-be0d-4d91f6d0cb18" containerName="init" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.256572 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="45af301d-29c9-474d-be0d-4d91f6d0cb18" containerName="init" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.256805 4898 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="45af301d-29c9-474d-be0d-4d91f6d0cb18" containerName="dnsmasq-dns" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.257943 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.261841 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.262497 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.262619 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.274640 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7f9cbdc5df-5tx5z"] Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.371131 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-6b86699784-tf822"] Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.372682 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-6b86699784-tf822" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.375412 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a57db04-0dc9-4d63-8d08-dd4309b19496-public-tls-certs\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.375480 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1a57db04-0dc9-4d63-8d08-dd4309b19496-etc-swift\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.375601 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88ab3ad2-782a-4c21-8104-1b80468dbca0-combined-ca-bundle\") pod \"heat-engine-6b86699784-tf822\" (UID: \"88ab3ad2-782a-4c21-8104-1b80468dbca0\") " pod="openstack/heat-engine-6b86699784-tf822" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.375659 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a57db04-0dc9-4d63-8d08-dd4309b19496-config-data\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.375706 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88ab3ad2-782a-4c21-8104-1b80468dbca0-config-data\") pod \"heat-engine-6b86699784-tf822\" (UID: 
\"88ab3ad2-782a-4c21-8104-1b80468dbca0\") " pod="openstack/heat-engine-6b86699784-tf822" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.375745 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a57db04-0dc9-4d63-8d08-dd4309b19496-run-httpd\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.375805 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a57db04-0dc9-4d63-8d08-dd4309b19496-combined-ca-bundle\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.375852 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a57db04-0dc9-4d63-8d08-dd4309b19496-log-httpd\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.375877 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a57db04-0dc9-4d63-8d08-dd4309b19496-internal-tls-certs\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.375954 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snc8w\" (UniqueName: 
\"kubernetes.io/projected/88ab3ad2-782a-4c21-8104-1b80468dbca0-kube-api-access-snc8w\") pod \"heat-engine-6b86699784-tf822\" (UID: \"88ab3ad2-782a-4c21-8104-1b80468dbca0\") " pod="openstack/heat-engine-6b86699784-tf822" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.376106 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrpt2\" (UniqueName: \"kubernetes.io/projected/1a57db04-0dc9-4d63-8d08-dd4309b19496-kube-api-access-rrpt2\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.376279 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88ab3ad2-782a-4c21-8104-1b80468dbca0-config-data-custom\") pod \"heat-engine-6b86699784-tf822\" (UID: \"88ab3ad2-782a-4c21-8104-1b80468dbca0\") " pod="openstack/heat-engine-6b86699784-tf822" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.378999 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-d5fsz" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.379263 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.379429 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.384576 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6b86699784-tf822"] Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.482136 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a57db04-0dc9-4d63-8d08-dd4309b19496-public-tls-certs\") pod 
\"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.482186 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1a57db04-0dc9-4d63-8d08-dd4309b19496-etc-swift\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.482216 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88ab3ad2-782a-4c21-8104-1b80468dbca0-combined-ca-bundle\") pod \"heat-engine-6b86699784-tf822\" (UID: \"88ab3ad2-782a-4c21-8104-1b80468dbca0\") " pod="openstack/heat-engine-6b86699784-tf822" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.482233 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a57db04-0dc9-4d63-8d08-dd4309b19496-config-data\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.482257 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88ab3ad2-782a-4c21-8104-1b80468dbca0-config-data\") pod \"heat-engine-6b86699784-tf822\" (UID: \"88ab3ad2-782a-4c21-8104-1b80468dbca0\") " pod="openstack/heat-engine-6b86699784-tf822" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.482279 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a57db04-0dc9-4d63-8d08-dd4309b19496-run-httpd\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " 
pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.482310 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a57db04-0dc9-4d63-8d08-dd4309b19496-combined-ca-bundle\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.482332 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a57db04-0dc9-4d63-8d08-dd4309b19496-log-httpd\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.482349 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a57db04-0dc9-4d63-8d08-dd4309b19496-internal-tls-certs\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.482378 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snc8w\" (UniqueName: \"kubernetes.io/projected/88ab3ad2-782a-4c21-8104-1b80468dbca0-kube-api-access-snc8w\") pod \"heat-engine-6b86699784-tf822\" (UID: \"88ab3ad2-782a-4c21-8104-1b80468dbca0\") " pod="openstack/heat-engine-6b86699784-tf822" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.482440 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrpt2\" (UniqueName: \"kubernetes.io/projected/1a57db04-0dc9-4d63-8d08-dd4309b19496-kube-api-access-rrpt2\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " 
pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.482498 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88ab3ad2-782a-4c21-8104-1b80468dbca0-config-data-custom\") pod \"heat-engine-6b86699784-tf822\" (UID: \"88ab3ad2-782a-4c21-8104-1b80468dbca0\") " pod="openstack/heat-engine-6b86699784-tf822" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.490194 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a57db04-0dc9-4d63-8d08-dd4309b19496-run-httpd\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.492075 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88ab3ad2-782a-4c21-8104-1b80468dbca0-config-data\") pod \"heat-engine-6b86699784-tf822\" (UID: \"88ab3ad2-782a-4c21-8104-1b80468dbca0\") " pod="openstack/heat-engine-6b86699784-tf822" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.492671 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88ab3ad2-782a-4c21-8104-1b80468dbca0-config-data-custom\") pod \"heat-engine-6b86699784-tf822\" (UID: \"88ab3ad2-782a-4c21-8104-1b80468dbca0\") " pod="openstack/heat-engine-6b86699784-tf822" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.492776 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a57db04-0dc9-4d63-8d08-dd4309b19496-public-tls-certs\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 
14:21:59.494412 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a57db04-0dc9-4d63-8d08-dd4309b19496-log-httpd\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.496954 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-xntfr"] Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.499953 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.516777 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88ab3ad2-782a-4c21-8104-1b80468dbca0-combined-ca-bundle\") pod \"heat-engine-6b86699784-tf822\" (UID: \"88ab3ad2-782a-4c21-8104-1b80468dbca0\") " pod="openstack/heat-engine-6b86699784-tf822" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.516864 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a57db04-0dc9-4d63-8d08-dd4309b19496-config-data\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.517057 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-xntfr"] Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.524007 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a57db04-0dc9-4d63-8d08-dd4309b19496-combined-ca-bundle\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc 
kubenswrapper[4898]: I0313 14:21:59.538295 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a57db04-0dc9-4d63-8d08-dd4309b19496-internal-tls-certs\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.539532 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snc8w\" (UniqueName: \"kubernetes.io/projected/88ab3ad2-782a-4c21-8104-1b80468dbca0-kube-api-access-snc8w\") pod \"heat-engine-6b86699784-tf822\" (UID: \"88ab3ad2-782a-4c21-8104-1b80468dbca0\") " pod="openstack/heat-engine-6b86699784-tf822" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.540467 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1a57db04-0dc9-4d63-8d08-dd4309b19496-etc-swift\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.542132 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrpt2\" (UniqueName: \"kubernetes.io/projected/1a57db04-0dc9-4d63-8d08-dd4309b19496-kube-api-access-rrpt2\") pod \"swift-proxy-7f9cbdc5df-5tx5z\" (UID: \"1a57db04-0dc9-4d63-8d08-dd4309b19496\") " pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.584384 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-config\") pod \"dnsmasq-dns-7756b9d78c-xntfr\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") " pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.584929 4898 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd2mk\" (UniqueName: \"kubernetes.io/projected/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-kube-api-access-zd2mk\") pod \"dnsmasq-dns-7756b9d78c-xntfr\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") " pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.584960 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-xntfr\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") " pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.584977 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-xntfr\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") " pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.585034 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-xntfr\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") " pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.585055 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-xntfr\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") " pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:21:59 
crc kubenswrapper[4898]: I0313 14:21:59.606186 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.634494 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-9b6c99f6d-7zgm5"] Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.638320 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-9b6c99f6d-7zgm5" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.645856 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.669091 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-9b6c99f6d-7zgm5"] Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.687854 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-xntfr\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") " pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.687939 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-xntfr\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") " pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.688037 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-xntfr\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") " pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 
14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.688067 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-xntfr\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") " pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.688138 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-config\") pod \"dnsmasq-dns-7756b9d78c-xntfr\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") " pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.689507 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd2mk\" (UniqueName: \"kubernetes.io/projected/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-kube-api-access-zd2mk\") pod \"dnsmasq-dns-7756b9d78c-xntfr\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") " pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.690505 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-xntfr\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") " pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.691021 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-xntfr\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") " pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.693854 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-xntfr\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") " pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.693946 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-config\") pod \"dnsmasq-dns-7756b9d78c-xntfr\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") " pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.697146 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-xntfr\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") " pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.700953 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-759d64ffd4-kzp67"] Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.702542 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-759d64ffd4-kzp67" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.704641 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.708241 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-6b86699784-tf822" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.734035 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd2mk\" (UniqueName: \"kubernetes.io/projected/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-kube-api-access-zd2mk\") pod \"dnsmasq-dns-7756b9d78c-xntfr\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") " pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.737655 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-759d64ffd4-kzp67"] Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.792472 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad3d61d7-d777-4115-92c7-e4e3125c5260-config-data-custom\") pod \"heat-api-9b6c99f6d-7zgm5\" (UID: \"ad3d61d7-d777-4115-92c7-e4e3125c5260\") " pod="openstack/heat-api-9b6c99f6d-7zgm5" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.792678 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad3d61d7-d777-4115-92c7-e4e3125c5260-combined-ca-bundle\") pod \"heat-api-9b6c99f6d-7zgm5\" (UID: \"ad3d61d7-d777-4115-92c7-e4e3125c5260\") " pod="openstack/heat-api-9b6c99f6d-7zgm5" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.792710 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad3d61d7-d777-4115-92c7-e4e3125c5260-config-data\") pod \"heat-api-9b6c99f6d-7zgm5\" (UID: \"ad3d61d7-d777-4115-92c7-e4e3125c5260\") " pod="openstack/heat-api-9b6c99f6d-7zgm5" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.792726 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-bjv6n\" (UniqueName: \"kubernetes.io/projected/ad3d61d7-d777-4115-92c7-e4e3125c5260-kube-api-access-bjv6n\") pod \"heat-api-9b6c99f6d-7zgm5\" (UID: \"ad3d61d7-d777-4115-92c7-e4e3125c5260\") " pod="openstack/heat-api-9b6c99f6d-7zgm5" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.894774 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-config-data\") pod \"heat-cfnapi-759d64ffd4-kzp67\" (UID: \"e53d1b61-e0c8-4c10-85bf-1c0f67009a24\") " pod="openstack/heat-cfnapi-759d64ffd4-kzp67" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.894917 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg2l6\" (UniqueName: \"kubernetes.io/projected/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-kube-api-access-cg2l6\") pod \"heat-cfnapi-759d64ffd4-kzp67\" (UID: \"e53d1b61-e0c8-4c10-85bf-1c0f67009a24\") " pod="openstack/heat-cfnapi-759d64ffd4-kzp67" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.894986 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-config-data-custom\") pod \"heat-cfnapi-759d64ffd4-kzp67\" (UID: \"e53d1b61-e0c8-4c10-85bf-1c0f67009a24\") " pod="openstack/heat-cfnapi-759d64ffd4-kzp67" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.895052 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad3d61d7-d777-4115-92c7-e4e3125c5260-combined-ca-bundle\") pod \"heat-api-9b6c99f6d-7zgm5\" (UID: \"ad3d61d7-d777-4115-92c7-e4e3125c5260\") " pod="openstack/heat-api-9b6c99f6d-7zgm5" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.895101 4898 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad3d61d7-d777-4115-92c7-e4e3125c5260-config-data\") pod \"heat-api-9b6c99f6d-7zgm5\" (UID: \"ad3d61d7-d777-4115-92c7-e4e3125c5260\") " pod="openstack/heat-api-9b6c99f6d-7zgm5" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.895126 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjv6n\" (UniqueName: \"kubernetes.io/projected/ad3d61d7-d777-4115-92c7-e4e3125c5260-kube-api-access-bjv6n\") pod \"heat-api-9b6c99f6d-7zgm5\" (UID: \"ad3d61d7-d777-4115-92c7-e4e3125c5260\") " pod="openstack/heat-api-9b6c99f6d-7zgm5" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.895310 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad3d61d7-d777-4115-92c7-e4e3125c5260-config-data-custom\") pod \"heat-api-9b6c99f6d-7zgm5\" (UID: \"ad3d61d7-d777-4115-92c7-e4e3125c5260\") " pod="openstack/heat-api-9b6c99f6d-7zgm5" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.895603 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-combined-ca-bundle\") pod \"heat-cfnapi-759d64ffd4-kzp67\" (UID: \"e53d1b61-e0c8-4c10-85bf-1c0f67009a24\") " pod="openstack/heat-cfnapi-759d64ffd4-kzp67" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.902633 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad3d61d7-d777-4115-92c7-e4e3125c5260-config-data\") pod \"heat-api-9b6c99f6d-7zgm5\" (UID: \"ad3d61d7-d777-4115-92c7-e4e3125c5260\") " pod="openstack/heat-api-9b6c99f6d-7zgm5" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.904038 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ad3d61d7-d777-4115-92c7-e4e3125c5260-combined-ca-bundle\") pod \"heat-api-9b6c99f6d-7zgm5\" (UID: \"ad3d61d7-d777-4115-92c7-e4e3125c5260\") " pod="openstack/heat-api-9b6c99f6d-7zgm5" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.911365 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad3d61d7-d777-4115-92c7-e4e3125c5260-config-data-custom\") pod \"heat-api-9b6c99f6d-7zgm5\" (UID: \"ad3d61d7-d777-4115-92c7-e4e3125c5260\") " pod="openstack/heat-api-9b6c99f6d-7zgm5" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.914369 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjv6n\" (UniqueName: \"kubernetes.io/projected/ad3d61d7-d777-4115-92c7-e4e3125c5260-kube-api-access-bjv6n\") pod \"heat-api-9b6c99f6d-7zgm5\" (UID: \"ad3d61d7-d777-4115-92c7-e4e3125c5260\") " pod="openstack/heat-api-9b6c99f6d-7zgm5" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.978539 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.997991 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg2l6\" (UniqueName: \"kubernetes.io/projected/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-kube-api-access-cg2l6\") pod \"heat-cfnapi-759d64ffd4-kzp67\" (UID: \"e53d1b61-e0c8-4c10-85bf-1c0f67009a24\") " pod="openstack/heat-cfnapi-759d64ffd4-kzp67" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.998106 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-config-data-custom\") pod \"heat-cfnapi-759d64ffd4-kzp67\" (UID: \"e53d1b61-e0c8-4c10-85bf-1c0f67009a24\") " pod="openstack/heat-cfnapi-759d64ffd4-kzp67" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.998891 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-9b6c99f6d-7zgm5" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.999210 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-combined-ca-bundle\") pod \"heat-cfnapi-759d64ffd4-kzp67\" (UID: \"e53d1b61-e0c8-4c10-85bf-1c0f67009a24\") " pod="openstack/heat-cfnapi-759d64ffd4-kzp67" Mar 13 14:21:59 crc kubenswrapper[4898]: I0313 14:21:59.999288 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-config-data\") pod \"heat-cfnapi-759d64ffd4-kzp67\" (UID: \"e53d1b61-e0c8-4c10-85bf-1c0f67009a24\") " pod="openstack/heat-cfnapi-759d64ffd4-kzp67" Mar 13 14:22:00 crc kubenswrapper[4898]: I0313 14:22:00.001582 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-config-data-custom\") pod \"heat-cfnapi-759d64ffd4-kzp67\" (UID: \"e53d1b61-e0c8-4c10-85bf-1c0f67009a24\") " pod="openstack/heat-cfnapi-759d64ffd4-kzp67" Mar 13 14:22:00 crc kubenswrapper[4898]: I0313 14:22:00.003945 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-combined-ca-bundle\") pod \"heat-cfnapi-759d64ffd4-kzp67\" (UID: \"e53d1b61-e0c8-4c10-85bf-1c0f67009a24\") " pod="openstack/heat-cfnapi-759d64ffd4-kzp67" Mar 13 14:22:00 crc kubenswrapper[4898]: I0313 14:22:00.004268 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-config-data\") pod \"heat-cfnapi-759d64ffd4-kzp67\" (UID: \"e53d1b61-e0c8-4c10-85bf-1c0f67009a24\") " pod="openstack/heat-cfnapi-759d64ffd4-kzp67" Mar 13 14:22:00 crc kubenswrapper[4898]: I0313 14:22:00.019848 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg2l6\" (UniqueName: \"kubernetes.io/projected/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-kube-api-access-cg2l6\") pod \"heat-cfnapi-759d64ffd4-kzp67\" (UID: \"e53d1b61-e0c8-4c10-85bf-1c0f67009a24\") " pod="openstack/heat-cfnapi-759d64ffd4-kzp67" Mar 13 14:22:00 crc kubenswrapper[4898]: I0313 14:22:00.076312 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-759d64ffd4-kzp67" Mar 13 14:22:00 crc kubenswrapper[4898]: I0313 14:22:00.148953 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556862-mpx4w"] Mar 13 14:22:00 crc kubenswrapper[4898]: I0313 14:22:00.151238 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556862-mpx4w" Mar 13 14:22:00 crc kubenswrapper[4898]: I0313 14:22:00.167512 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:22:00 crc kubenswrapper[4898]: I0313 14:22:00.168991 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556862-mpx4w"] Mar 13 14:22:00 crc kubenswrapper[4898]: I0313 14:22:00.169541 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:22:00 crc kubenswrapper[4898]: I0313 14:22:00.170052 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:22:00 crc kubenswrapper[4898]: I0313 14:22:00.205366 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tmtl\" (UniqueName: \"kubernetes.io/projected/3c85cb04-363e-45d6-a14b-79c249e8f469-kube-api-access-6tmtl\") pod \"auto-csr-approver-29556862-mpx4w\" (UID: \"3c85cb04-363e-45d6-a14b-79c249e8f469\") " pod="openshift-infra/auto-csr-approver-29556862-mpx4w" Mar 13 14:22:00 crc kubenswrapper[4898]: I0313 14:22:00.310866 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tmtl\" (UniqueName: \"kubernetes.io/projected/3c85cb04-363e-45d6-a14b-79c249e8f469-kube-api-access-6tmtl\") pod \"auto-csr-approver-29556862-mpx4w\" (UID: \"3c85cb04-363e-45d6-a14b-79c249e8f469\") " pod="openshift-infra/auto-csr-approver-29556862-mpx4w" Mar 13 14:22:00 crc kubenswrapper[4898]: I0313 14:22:00.330651 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tmtl\" (UniqueName: \"kubernetes.io/projected/3c85cb04-363e-45d6-a14b-79c249e8f469-kube-api-access-6tmtl\") pod \"auto-csr-approver-29556862-mpx4w\" (UID: \"3c85cb04-363e-45d6-a14b-79c249e8f469\") " 
pod="openshift-infra/auto-csr-approver-29556862-mpx4w" Mar 13 14:22:00 crc kubenswrapper[4898]: I0313 14:22:00.461369 4898 generic.go:334] "Generic (PLEG): container finished" podID="604a0205-6c18-4bff-929f-038524d62aeb" containerID="974d31c9fc27c07800ab40a1409496611af9fce07ca6cd7d36cd6abbd352b819" exitCode=0 Mar 13 14:22:00 crc kubenswrapper[4898]: I0313 14:22:00.461441 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5bf5d8b7d4-4gwxr" event={"ID":"604a0205-6c18-4bff-929f-038524d62aeb","Type":"ContainerDied","Data":"974d31c9fc27c07800ab40a1409496611af9fce07ca6cd7d36cd6abbd352b819"} Mar 13 14:22:00 crc kubenswrapper[4898]: I0313 14:22:00.508224 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556862-mpx4w" Mar 13 14:22:00 crc kubenswrapper[4898]: I0313 14:22:00.682277 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-z7ldc" podUID="b38f3681-6f2f-437f-9694-810d43921aa2" containerName="registry-server" probeResult="failure" output=< Mar 13 14:22:00 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 14:22:00 crc kubenswrapper[4898]: > Mar 13 14:22:02 crc kubenswrapper[4898]: I0313 14:22:02.486603 4898 generic.go:334] "Generic (PLEG): container finished" podID="d0be2003-4a0d-4740-9b84-ab16bb27d5bb" containerID="7f741df9b30455c96ea278c501c4d63ab1af4bf960776e0726cfee685819c6bd" exitCode=0 Mar 13 14:22:02 crc kubenswrapper[4898]: I0313 14:22:02.486677 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d0be2003-4a0d-4740-9b84-ab16bb27d5bb","Type":"ContainerDied","Data":"7f741df9b30455c96ea278c501c4d63ab1af4bf960776e0726cfee685819c6bd"} Mar 13 14:22:04 crc kubenswrapper[4898]: I0313 14:22:04.893121 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-api-0" podUID="bda33d23-490a-4099-954b-c613ab5d5c73" 
containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.219:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 14:22:05 crc kubenswrapper[4898]: I0313 14:22:05.888087 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="bda33d23-490a-4099-954b-c613ab5d5c73" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.219:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.422422 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-5b6c75676b-jx6kl"] Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.423986 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-5b6c75676b-jx6kl" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.438478 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-5978dd6d84-pnknr"] Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.440254 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5978dd6d84-pnknr" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.451227 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5b6c75676b-jx6kl"] Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.467351 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5978dd6d84-pnknr"] Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.482551 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-76545f46cd-qk7nm"] Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.487941 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-76545f46cd-qk7nm" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.502721 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-76545f46cd-qk7nm"] Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.525618 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f42d66e-f331-4c05-a4fb-d6208b4493fb-config-data-custom\") pod \"heat-api-5978dd6d84-pnknr\" (UID: \"6f42d66e-f331-4c05-a4fb-d6208b4493fb\") " pod="openstack/heat-api-5978dd6d84-pnknr" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.526184 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f42d66e-f331-4c05-a4fb-d6208b4493fb-config-data\") pod \"heat-api-5978dd6d84-pnknr\" (UID: \"6f42d66e-f331-4c05-a4fb-d6208b4493fb\") " pod="openstack/heat-api-5978dd6d84-pnknr" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.526311 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f42d66e-f331-4c05-a4fb-d6208b4493fb-combined-ca-bundle\") pod \"heat-api-5978dd6d84-pnknr\" (UID: \"6f42d66e-f331-4c05-a4fb-d6208b4493fb\") " pod="openstack/heat-api-5978dd6d84-pnknr" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.526499 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad94280e-6f02-4129-9cdc-c35499f5d5e4-config-data-custom\") pod \"heat-engine-5b6c75676b-jx6kl\" (UID: \"ad94280e-6f02-4129-9cdc-c35499f5d5e4\") " pod="openstack/heat-engine-5b6c75676b-jx6kl" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.526646 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad94280e-6f02-4129-9cdc-c35499f5d5e4-config-data\") pod \"heat-engine-5b6c75676b-jx6kl\" (UID: \"ad94280e-6f02-4129-9cdc-c35499f5d5e4\") " pod="openstack/heat-engine-5b6c75676b-jx6kl" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.526924 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kskc6\" (UniqueName: \"kubernetes.io/projected/ad94280e-6f02-4129-9cdc-c35499f5d5e4-kube-api-access-kskc6\") pod \"heat-engine-5b6c75676b-jx6kl\" (UID: \"ad94280e-6f02-4129-9cdc-c35499f5d5e4\") " pod="openstack/heat-engine-5b6c75676b-jx6kl" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.527125 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbntq\" (UniqueName: \"kubernetes.io/projected/6f42d66e-f331-4c05-a4fb-d6208b4493fb-kube-api-access-jbntq\") pod \"heat-api-5978dd6d84-pnknr\" (UID: \"6f42d66e-f331-4c05-a4fb-d6208b4493fb\") " pod="openstack/heat-api-5978dd6d84-pnknr" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.527320 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad94280e-6f02-4129-9cdc-c35499f5d5e4-combined-ca-bundle\") pod \"heat-engine-5b6c75676b-jx6kl\" (UID: \"ad94280e-6f02-4129-9cdc-c35499f5d5e4\") " pod="openstack/heat-engine-5b6c75676b-jx6kl" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.632524 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p272j\" (UniqueName: \"kubernetes.io/projected/8c6a61ba-babd-4bc2-922a-99b00c2af057-kube-api-access-p272j\") pod \"heat-cfnapi-76545f46cd-qk7nm\" (UID: \"8c6a61ba-babd-4bc2-922a-99b00c2af057\") " pod="openstack/heat-cfnapi-76545f46cd-qk7nm" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.632577 4898 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c6a61ba-babd-4bc2-922a-99b00c2af057-config-data\") pod \"heat-cfnapi-76545f46cd-qk7nm\" (UID: \"8c6a61ba-babd-4bc2-922a-99b00c2af057\") " pod="openstack/heat-cfnapi-76545f46cd-qk7nm" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.632601 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6a61ba-babd-4bc2-922a-99b00c2af057-combined-ca-bundle\") pod \"heat-cfnapi-76545f46cd-qk7nm\" (UID: \"8c6a61ba-babd-4bc2-922a-99b00c2af057\") " pod="openstack/heat-cfnapi-76545f46cd-qk7nm" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.632740 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f42d66e-f331-4c05-a4fb-d6208b4493fb-config-data-custom\") pod \"heat-api-5978dd6d84-pnknr\" (UID: \"6f42d66e-f331-4c05-a4fb-d6208b4493fb\") " pod="openstack/heat-api-5978dd6d84-pnknr" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.632861 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f42d66e-f331-4c05-a4fb-d6208b4493fb-config-data\") pod \"heat-api-5978dd6d84-pnknr\" (UID: \"6f42d66e-f331-4c05-a4fb-d6208b4493fb\") " pod="openstack/heat-api-5978dd6d84-pnknr" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.632889 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f42d66e-f331-4c05-a4fb-d6208b4493fb-combined-ca-bundle\") pod \"heat-api-5978dd6d84-pnknr\" (UID: \"6f42d66e-f331-4c05-a4fb-d6208b4493fb\") " pod="openstack/heat-api-5978dd6d84-pnknr" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.633022 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad94280e-6f02-4129-9cdc-c35499f5d5e4-config-data-custom\") pod \"heat-engine-5b6c75676b-jx6kl\" (UID: \"ad94280e-6f02-4129-9cdc-c35499f5d5e4\") " pod="openstack/heat-engine-5b6c75676b-jx6kl" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.633054 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad94280e-6f02-4129-9cdc-c35499f5d5e4-config-data\") pod \"heat-engine-5b6c75676b-jx6kl\" (UID: \"ad94280e-6f02-4129-9cdc-c35499f5d5e4\") " pod="openstack/heat-engine-5b6c75676b-jx6kl" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.633392 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kskc6\" (UniqueName: \"kubernetes.io/projected/ad94280e-6f02-4129-9cdc-c35499f5d5e4-kube-api-access-kskc6\") pod \"heat-engine-5b6c75676b-jx6kl\" (UID: \"ad94280e-6f02-4129-9cdc-c35499f5d5e4\") " pod="openstack/heat-engine-5b6c75676b-jx6kl" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.633464 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c6a61ba-babd-4bc2-922a-99b00c2af057-config-data-custom\") pod \"heat-cfnapi-76545f46cd-qk7nm\" (UID: \"8c6a61ba-babd-4bc2-922a-99b00c2af057\") " pod="openstack/heat-cfnapi-76545f46cd-qk7nm" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.633557 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad94280e-6f02-4129-9cdc-c35499f5d5e4-combined-ca-bundle\") pod \"heat-engine-5b6c75676b-jx6kl\" (UID: \"ad94280e-6f02-4129-9cdc-c35499f5d5e4\") " pod="openstack/heat-engine-5b6c75676b-jx6kl" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.633578 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jbntq\" (UniqueName: \"kubernetes.io/projected/6f42d66e-f331-4c05-a4fb-d6208b4493fb-kube-api-access-jbntq\") pod \"heat-api-5978dd6d84-pnknr\" (UID: \"6f42d66e-f331-4c05-a4fb-d6208b4493fb\") " pod="openstack/heat-api-5978dd6d84-pnknr" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.644503 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f42d66e-f331-4c05-a4fb-d6208b4493fb-config-data\") pod \"heat-api-5978dd6d84-pnknr\" (UID: \"6f42d66e-f331-4c05-a4fb-d6208b4493fb\") " pod="openstack/heat-api-5978dd6d84-pnknr" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.646598 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f42d66e-f331-4c05-a4fb-d6208b4493fb-config-data-custom\") pod \"heat-api-5978dd6d84-pnknr\" (UID: \"6f42d66e-f331-4c05-a4fb-d6208b4493fb\") " pod="openstack/heat-api-5978dd6d84-pnknr" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.647178 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad94280e-6f02-4129-9cdc-c35499f5d5e4-combined-ca-bundle\") pod \"heat-engine-5b6c75676b-jx6kl\" (UID: \"ad94280e-6f02-4129-9cdc-c35499f5d5e4\") " pod="openstack/heat-engine-5b6c75676b-jx6kl" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.655266 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad94280e-6f02-4129-9cdc-c35499f5d5e4-config-data-custom\") pod \"heat-engine-5b6c75676b-jx6kl\" (UID: \"ad94280e-6f02-4129-9cdc-c35499f5d5e4\") " pod="openstack/heat-engine-5b6c75676b-jx6kl" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.656017 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6f42d66e-f331-4c05-a4fb-d6208b4493fb-combined-ca-bundle\") pod \"heat-api-5978dd6d84-pnknr\" (UID: \"6f42d66e-f331-4c05-a4fb-d6208b4493fb\") " pod="openstack/heat-api-5978dd6d84-pnknr" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.658579 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbntq\" (UniqueName: \"kubernetes.io/projected/6f42d66e-f331-4c05-a4fb-d6208b4493fb-kube-api-access-jbntq\") pod \"heat-api-5978dd6d84-pnknr\" (UID: \"6f42d66e-f331-4c05-a4fb-d6208b4493fb\") " pod="openstack/heat-api-5978dd6d84-pnknr" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.663722 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kskc6\" (UniqueName: \"kubernetes.io/projected/ad94280e-6f02-4129-9cdc-c35499f5d5e4-kube-api-access-kskc6\") pod \"heat-engine-5b6c75676b-jx6kl\" (UID: \"ad94280e-6f02-4129-9cdc-c35499f5d5e4\") " pod="openstack/heat-engine-5b6c75676b-jx6kl" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.665878 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad94280e-6f02-4129-9cdc-c35499f5d5e4-config-data\") pod \"heat-engine-5b6c75676b-jx6kl\" (UID: \"ad94280e-6f02-4129-9cdc-c35499f5d5e4\") " pod="openstack/heat-engine-5b6c75676b-jx6kl" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.735148 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6a61ba-babd-4bc2-922a-99b00c2af057-combined-ca-bundle\") pod \"heat-cfnapi-76545f46cd-qk7nm\" (UID: \"8c6a61ba-babd-4bc2-922a-99b00c2af057\") " pod="openstack/heat-cfnapi-76545f46cd-qk7nm" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.735378 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/8c6a61ba-babd-4bc2-922a-99b00c2af057-config-data-custom\") pod \"heat-cfnapi-76545f46cd-qk7nm\" (UID: \"8c6a61ba-babd-4bc2-922a-99b00c2af057\") " pod="openstack/heat-cfnapi-76545f46cd-qk7nm" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.735438 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p272j\" (UniqueName: \"kubernetes.io/projected/8c6a61ba-babd-4bc2-922a-99b00c2af057-kube-api-access-p272j\") pod \"heat-cfnapi-76545f46cd-qk7nm\" (UID: \"8c6a61ba-babd-4bc2-922a-99b00c2af057\") " pod="openstack/heat-cfnapi-76545f46cd-qk7nm" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.735463 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c6a61ba-babd-4bc2-922a-99b00c2af057-config-data\") pod \"heat-cfnapi-76545f46cd-qk7nm\" (UID: \"8c6a61ba-babd-4bc2-922a-99b00c2af057\") " pod="openstack/heat-cfnapi-76545f46cd-qk7nm" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.741192 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c6a61ba-babd-4bc2-922a-99b00c2af057-config-data\") pod \"heat-cfnapi-76545f46cd-qk7nm\" (UID: \"8c6a61ba-babd-4bc2-922a-99b00c2af057\") " pod="openstack/heat-cfnapi-76545f46cd-qk7nm" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.749888 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6a61ba-babd-4bc2-922a-99b00c2af057-combined-ca-bundle\") pod \"heat-cfnapi-76545f46cd-qk7nm\" (UID: \"8c6a61ba-babd-4bc2-922a-99b00c2af057\") " pod="openstack/heat-cfnapi-76545f46cd-qk7nm" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.752642 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/8c6a61ba-babd-4bc2-922a-99b00c2af057-config-data-custom\") pod \"heat-cfnapi-76545f46cd-qk7nm\" (UID: \"8c6a61ba-babd-4bc2-922a-99b00c2af057\") " pod="openstack/heat-cfnapi-76545f46cd-qk7nm" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.760573 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p272j\" (UniqueName: \"kubernetes.io/projected/8c6a61ba-babd-4bc2-922a-99b00c2af057-kube-api-access-p272j\") pod \"heat-cfnapi-76545f46cd-qk7nm\" (UID: \"8c6a61ba-babd-4bc2-922a-99b00c2af057\") " pod="openstack/heat-cfnapi-76545f46cd-qk7nm" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.769201 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.810954 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-5b6c75676b-jx6kl" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.849140 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5978dd6d84-pnknr" Mar 13 14:22:06 crc kubenswrapper[4898]: I0313 14:22:06.872254 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-76545f46cd-qk7nm" Mar 13 14:22:07 crc kubenswrapper[4898]: I0313 14:22:07.796493 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-776df44c77-g64lv" Mar 13 14:22:07 crc kubenswrapper[4898]: I0313 14:22:07.890465 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f97c64464-wmnph"] Mar 13 14:22:07 crc kubenswrapper[4898]: I0313 14:22:07.890763 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-f97c64464-wmnph" podUID="61f1f8bf-63eb-464c-9703-3d3db80ba0df" containerName="neutron-api" containerID="cri-o://6c47c58bce6d5a27ce8f0a9ec972dc3740b57217b054bd621f4319ad64cb9ded" gracePeriod=30 Mar 13 14:22:07 crc kubenswrapper[4898]: I0313 14:22:07.892337 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-f97c64464-wmnph" podUID="61f1f8bf-63eb-464c-9703-3d3db80ba0df" containerName="neutron-httpd" containerID="cri-o://fc322fcfa0d128d1cbcd5fd7cc8972df0d6b4b808a700c1468ceb21cb71602e4" gracePeriod=30 Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.112057 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5bf5d8b7d4-4gwxr" Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.306533 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.313756 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-config-data\") pod \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") " Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.313828 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-internal-tls-certs\") pod \"604a0205-6c18-4bff-929f-038524d62aeb\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.313864 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/604a0205-6c18-4bff-929f-038524d62aeb-logs\") pod \"604a0205-6c18-4bff-929f-038524d62aeb\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.313923 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-config-data-custom\") pod \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") " Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.313945 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-public-tls-certs\") pod \"604a0205-6c18-4bff-929f-038524d62aeb\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.313989 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lcxt\" (UniqueName: 
\"kubernetes.io/projected/604a0205-6c18-4bff-929f-038524d62aeb-kube-api-access-8lcxt\") pod \"604a0205-6c18-4bff-929f-038524d62aeb\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.314071 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-scripts\") pod \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") " Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.314100 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-scripts\") pod \"604a0205-6c18-4bff-929f-038524d62aeb\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.314161 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-etc-machine-id\") pod \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") " Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.314186 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wbnw\" (UniqueName: \"kubernetes.io/projected/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-kube-api-access-8wbnw\") pod \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") " Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.314249 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-config-data\") pod \"604a0205-6c18-4bff-929f-038524d62aeb\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.314278 4898 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-combined-ca-bundle\") pod \"604a0205-6c18-4bff-929f-038524d62aeb\" (UID: \"604a0205-6c18-4bff-929f-038524d62aeb\") " Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.314307 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-combined-ca-bundle\") pod \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\" (UID: \"d0be2003-4a0d-4740-9b84-ab16bb27d5bb\") " Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.326922 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/604a0205-6c18-4bff-929f-038524d62aeb-logs" (OuterVolumeSpecName: "logs") pod "604a0205-6c18-4bff-929f-038524d62aeb" (UID: "604a0205-6c18-4bff-929f-038524d62aeb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.327277 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d0be2003-4a0d-4740-9b84-ab16bb27d5bb" (UID: "d0be2003-4a0d-4740-9b84-ab16bb27d5bb"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.348923 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/604a0205-6c18-4bff-929f-038524d62aeb-kube-api-access-8lcxt" (OuterVolumeSpecName: "kube-api-access-8lcxt") pod "604a0205-6c18-4bff-929f-038524d62aeb" (UID: "604a0205-6c18-4bff-929f-038524d62aeb"). InnerVolumeSpecName "kube-api-access-8lcxt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.349424 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-scripts" (OuterVolumeSpecName: "scripts") pod "604a0205-6c18-4bff-929f-038524d62aeb" (UID: "604a0205-6c18-4bff-929f-038524d62aeb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.349843 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d0be2003-4a0d-4740-9b84-ab16bb27d5bb" (UID: "d0be2003-4a0d-4740-9b84-ab16bb27d5bb"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.362164 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-scripts" (OuterVolumeSpecName: "scripts") pod "d0be2003-4a0d-4740-9b84-ab16bb27d5bb" (UID: "d0be2003-4a0d-4740-9b84-ab16bb27d5bb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.392028 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-kube-api-access-8wbnw" (OuterVolumeSpecName: "kube-api-access-8wbnw") pod "d0be2003-4a0d-4740-9b84-ab16bb27d5bb" (UID: "d0be2003-4a0d-4740-9b84-ab16bb27d5bb"). InnerVolumeSpecName "kube-api-access-8wbnw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.417067 4898 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.417108 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wbnw\" (UniqueName: \"kubernetes.io/projected/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-kube-api-access-8wbnw\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.417124 4898 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/604a0205-6c18-4bff-929f-038524d62aeb-logs\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.417136 4898 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.417147 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lcxt\" (UniqueName: \"kubernetes.io/projected/604a0205-6c18-4bff-929f-038524d62aeb-kube-api-access-8lcxt\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.417156 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.417166 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.586304 4898 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0be2003-4a0d-4740-9b84-ab16bb27d5bb" (UID: "d0be2003-4a0d-4740-9b84-ab16bb27d5bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.629059 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.632785 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d0be2003-4a0d-4740-9b84-ab16bb27d5bb","Type":"ContainerDied","Data":"519cfe6e0c33250225df4155f05016fa2ba7c8a1bb229003e79f90a22978c180"} Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.632849 4898 scope.go:117] "RemoveContainer" containerID="bafd55f18270e2838b59b970f12a6aafedd22b09a08b2d2305599aac961e6911" Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.633042 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.638283 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5bf5d8b7d4-4gwxr" event={"ID":"604a0205-6c18-4bff-929f-038524d62aeb","Type":"ContainerDied","Data":"762f4a73911e1feb65bcc4bfa54920b33c4f644904f274acd49cf5dac053c911"} Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.638405 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5bf5d8b7d4-4gwxr" Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.640341 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.640765 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="86c6c495-884b-4c92-949f-0159eb17e6a5" containerName="ceilometer-central-agent" containerID="cri-o://b62d7bd0ca3497c43d915b6212935946bc82ac1a3defe8c89eeb3779d6ce9770" gracePeriod=30 Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.641094 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="86c6c495-884b-4c92-949f-0159eb17e6a5" containerName="proxy-httpd" containerID="cri-o://a55576c9a44e83505bf8757afc0e1e19424b4717e80f08c508a794c81f2cfdb0" gracePeriod=30 Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.641155 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="86c6c495-884b-4c92-949f-0159eb17e6a5" containerName="sg-core" containerID="cri-o://ba94a825cfb36ee16c3e15907274f9276083ba448d310d471374f19c54cc116c" gracePeriod=30 Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.641201 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="86c6c495-884b-4c92-949f-0159eb17e6a5" containerName="ceilometer-notification-agent" containerID="cri-o://cea1936f2758016544cbefa24e4ca686c3e33acfdaf019898c501a90320d0242" gracePeriod=30 Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.657655 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"124bd4ee-d9f0-408f-a46e-4d143e8ab02a","Type":"ContainerStarted","Data":"2280ba58973731a979fad5257ad117a923cdfbcb04a055a7d907232e58c71a1c"} Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.664621 4898 
generic.go:334] "Generic (PLEG): container finished" podID="61f1f8bf-63eb-464c-9703-3d3db80ba0df" containerID="fc322fcfa0d128d1cbcd5fd7cc8972df0d6b4b808a700c1468ceb21cb71602e4" exitCode=0 Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.664875 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f97c64464-wmnph" event={"ID":"61f1f8bf-63eb-464c-9703-3d3db80ba0df","Type":"ContainerDied","Data":"fc322fcfa0d128d1cbcd5fd7cc8972df0d6b4b808a700c1468ceb21cb71602e4"} Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.665988 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "604a0205-6c18-4bff-929f-038524d62aeb" (UID: "604a0205-6c18-4bff-929f-038524d62aeb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.694093 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "604a0205-6c18-4bff-929f-038524d62aeb" (UID: "604a0205-6c18-4bff-929f-038524d62aeb"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.696068 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-config-data" (OuterVolumeSpecName: "config-data") pod "604a0205-6c18-4bff-929f-038524d62aeb" (UID: "604a0205-6c18-4bff-929f-038524d62aeb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.733346 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.733392 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.733404 4898 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.735385 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-9b6c99f6d-7zgm5"] Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.767185 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-config-data" (OuterVolumeSpecName: "config-data") pod "d0be2003-4a0d-4740-9b84-ab16bb27d5bb" (UID: "d0be2003-4a0d-4740-9b84-ab16bb27d5bb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.813868 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "604a0205-6c18-4bff-929f-038524d62aeb" (UID: "604a0205-6c18-4bff-929f-038524d62aeb"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.837039 4898 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/604a0205-6c18-4bff-929f-038524d62aeb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.837078 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0be2003-4a0d-4740-9b84-ab16bb27d5bb-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:08 crc kubenswrapper[4898]: I0313 14:22:08.958160 4898 scope.go:117] "RemoveContainer" containerID="7f741df9b30455c96ea278c501c4d63ab1af4bf960776e0726cfee685819c6bd" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.018965 4898 scope.go:117] "RemoveContainer" containerID="974d31c9fc27c07800ab40a1409496611af9fce07ca6cd7d36cd6abbd352b819" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.039264 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5bf5d8b7d4-4gwxr"] Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.065859 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-5bf5d8b7d4-4gwxr"] Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.068677 4898 scope.go:117] "RemoveContainer" containerID="7d8e485964b16b478ba92c0abf89ef5c7fe78d1b940606e5a01f0ef264ccdb32" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.070116 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.084166 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.100952 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.122968 4898 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 14:22:09 crc kubenswrapper[4898]: E0313 14:22:09.123530 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="604a0205-6c18-4bff-929f-038524d62aeb" containerName="placement-log" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.123548 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="604a0205-6c18-4bff-929f-038524d62aeb" containerName="placement-log" Mar 13 14:22:09 crc kubenswrapper[4898]: E0313 14:22:09.123569 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0be2003-4a0d-4740-9b84-ab16bb27d5bb" containerName="probe" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.123576 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0be2003-4a0d-4740-9b84-ab16bb27d5bb" containerName="probe" Mar 13 14:22:09 crc kubenswrapper[4898]: E0313 14:22:09.123615 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0be2003-4a0d-4740-9b84-ab16bb27d5bb" containerName="cinder-scheduler" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.123623 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0be2003-4a0d-4740-9b84-ab16bb27d5bb" containerName="cinder-scheduler" Mar 13 14:22:09 crc kubenswrapper[4898]: E0313 14:22:09.123634 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="604a0205-6c18-4bff-929f-038524d62aeb" containerName="placement-api" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.123639 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="604a0205-6c18-4bff-929f-038524d62aeb" containerName="placement-api" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.123839 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0be2003-4a0d-4740-9b84-ab16bb27d5bb" containerName="probe" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.123856 4898 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="604a0205-6c18-4bff-929f-038524d62aeb" containerName="placement-api" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.123869 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="604a0205-6c18-4bff-929f-038524d62aeb" containerName="placement-log" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.123887 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0be2003-4a0d-4740-9b84-ab16bb27d5bb" containerName="cinder-scheduler" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.132778 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.137663 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.142361 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.329366 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6b86699784-tf822"] Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.336312 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9a7064c-4ed5-4948-9e7e-7d40794e371e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a9a7064c-4ed5-4948-9e7e-7d40794e371e\") " pod="openstack/cinder-scheduler-0" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.348681 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9a7064c-4ed5-4948-9e7e-7d40794e371e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a9a7064c-4ed5-4948-9e7e-7d40794e371e\") " pod="openstack/cinder-scheduler-0" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.352483 4898 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9a7064c-4ed5-4948-9e7e-7d40794e371e-config-data\") pod \"cinder-scheduler-0\" (UID: \"a9a7064c-4ed5-4948-9e7e-7d40794e371e\") " pod="openstack/cinder-scheduler-0" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.352952 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j59jd\" (UniqueName: \"kubernetes.io/projected/a9a7064c-4ed5-4948-9e7e-7d40794e371e-kube-api-access-j59jd\") pod \"cinder-scheduler-0\" (UID: \"a9a7064c-4ed5-4948-9e7e-7d40794e371e\") " pod="openstack/cinder-scheduler-0" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.353223 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a9a7064c-4ed5-4948-9e7e-7d40794e371e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a9a7064c-4ed5-4948-9e7e-7d40794e371e\") " pod="openstack/cinder-scheduler-0" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.353481 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9a7064c-4ed5-4948-9e7e-7d40794e371e-scripts\") pod \"cinder-scheduler-0\" (UID: \"a9a7064c-4ed5-4948-9e7e-7d40794e371e\") " pod="openstack/cinder-scheduler-0" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.338179 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5978dd6d84-pnknr"] Mar 13 14:22:09 crc kubenswrapper[4898]: W0313 14:22:09.349078 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88ab3ad2_782a_4c21_8104_1b80468dbca0.slice/crio-92091403be154312d0e01cc88ae4975c8ee84b62f142756e9f0eee0701f6969b WatchSource:0}: Error finding container 
92091403be154312d0e01cc88ae4975c8ee84b62f142756e9f0eee0701f6969b: Status 404 returned error can't find the container with id 92091403be154312d0e01cc88ae4975c8ee84b62f142756e9f0eee0701f6969b Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.388057 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-759d64ffd4-kzp67"] Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.460274 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j59jd\" (UniqueName: \"kubernetes.io/projected/a9a7064c-4ed5-4948-9e7e-7d40794e371e-kube-api-access-j59jd\") pod \"cinder-scheduler-0\" (UID: \"a9a7064c-4ed5-4948-9e7e-7d40794e371e\") " pod="openstack/cinder-scheduler-0" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.461285 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a9a7064c-4ed5-4948-9e7e-7d40794e371e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a9a7064c-4ed5-4948-9e7e-7d40794e371e\") " pod="openstack/cinder-scheduler-0" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.461438 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9a7064c-4ed5-4948-9e7e-7d40794e371e-scripts\") pod \"cinder-scheduler-0\" (UID: \"a9a7064c-4ed5-4948-9e7e-7d40794e371e\") " pod="openstack/cinder-scheduler-0" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.461645 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9a7064c-4ed5-4948-9e7e-7d40794e371e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a9a7064c-4ed5-4948-9e7e-7d40794e371e\") " pod="openstack/cinder-scheduler-0" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.461744 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/a9a7064c-4ed5-4948-9e7e-7d40794e371e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a9a7064c-4ed5-4948-9e7e-7d40794e371e\") " pod="openstack/cinder-scheduler-0" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.461858 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9a7064c-4ed5-4948-9e7e-7d40794e371e-config-data\") pod \"cinder-scheduler-0\" (UID: \"a9a7064c-4ed5-4948-9e7e-7d40794e371e\") " pod="openstack/cinder-scheduler-0" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.467018 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a9a7064c-4ed5-4948-9e7e-7d40794e371e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a9a7064c-4ed5-4948-9e7e-7d40794e371e\") " pod="openstack/cinder-scheduler-0" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.470437 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a9a7064c-4ed5-4948-9e7e-7d40794e371e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a9a7064c-4ed5-4948-9e7e-7d40794e371e\") " pod="openstack/cinder-scheduler-0" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.487338 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9a7064c-4ed5-4948-9e7e-7d40794e371e-scripts\") pod \"cinder-scheduler-0\" (UID: \"a9a7064c-4ed5-4948-9e7e-7d40794e371e\") " pod="openstack/cinder-scheduler-0" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.488486 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9a7064c-4ed5-4948-9e7e-7d40794e371e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a9a7064c-4ed5-4948-9e7e-7d40794e371e\") " 
pod="openstack/cinder-scheduler-0" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.489475 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9a7064c-4ed5-4948-9e7e-7d40794e371e-config-data\") pod \"cinder-scheduler-0\" (UID: \"a9a7064c-4ed5-4948-9e7e-7d40794e371e\") " pod="openstack/cinder-scheduler-0" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.512581 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j59jd\" (UniqueName: \"kubernetes.io/projected/a9a7064c-4ed5-4948-9e7e-7d40794e371e-kube-api-access-j59jd\") pod \"cinder-scheduler-0\" (UID: \"a9a7064c-4ed5-4948-9e7e-7d40794e371e\") " pod="openstack/cinder-scheduler-0" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.579723 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7f9cbdc5df-5tx5z"] Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.607481 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.815433 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="604a0205-6c18-4bff-929f-038524d62aeb" path="/var/lib/kubelet/pods/604a0205-6c18-4bff-929f-038524d62aeb/volumes" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.822244 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0be2003-4a0d-4740-9b84-ab16bb27d5bb" path="/var/lib/kubelet/pods/d0be2003-4a0d-4740-9b84-ab16bb27d5bb/volumes" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.834792 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-9b6c99f6d-7zgm5"] Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.837556 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" event={"ID":"1a57db04-0dc9-4d63-8d08-dd4309b19496","Type":"ContainerStarted","Data":"2ecb041f0f57627a4628f79981fbe3e37f771ddc6fb2d41c9c87e318fbbafd91"} Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.859450 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-759d64ffd4-kzp67"] Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.868520 4898 generic.go:334] "Generic (PLEG): container finished" podID="86c6c495-884b-4c92-949f-0159eb17e6a5" containerID="a55576c9a44e83505bf8757afc0e1e19424b4717e80f08c508a794c81f2cfdb0" exitCode=0 Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.868559 4898 generic.go:334] "Generic (PLEG): container finished" podID="86c6c495-884b-4c92-949f-0159eb17e6a5" containerID="ba94a825cfb36ee16c3e15907274f9276083ba448d310d471374f19c54cc116c" exitCode=2 Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.868569 4898 generic.go:334] "Generic (PLEG): container finished" podID="86c6c495-884b-4c92-949f-0159eb17e6a5" containerID="b62d7bd0ca3497c43d915b6212935946bc82ac1a3defe8c89eeb3779d6ce9770" exitCode=0 Mar 13 14:22:09 crc 
kubenswrapper[4898]: I0313 14:22:09.868657 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86c6c495-884b-4c92-949f-0159eb17e6a5","Type":"ContainerDied","Data":"a55576c9a44e83505bf8757afc0e1e19424b4717e80f08c508a794c81f2cfdb0"} Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.868692 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86c6c495-884b-4c92-949f-0159eb17e6a5","Type":"ContainerDied","Data":"ba94a825cfb36ee16c3e15907274f9276083ba448d310d471374f19c54cc116c"} Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.868704 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86c6c495-884b-4c92-949f-0159eb17e6a5","Type":"ContainerDied","Data":"b62d7bd0ca3497c43d915b6212935946bc82ac1a3defe8c89eeb3779d6ce9770"} Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.872735 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-5f97b49ff6-67dbr"] Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.880191 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.890470 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.891842 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.898200 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5978dd6d84-pnknr" event={"ID":"6f42d66e-f331-4c05-a4fb-d6208b4493fb","Type":"ContainerStarted","Data":"83eeb629576ceac1e4c3211f16c303bcac0592684b9a1724a45b87fae4f69938"} Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.914218 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-759d64ffd4-kzp67" event={"ID":"e53d1b61-e0c8-4c10-85bf-1c0f67009a24","Type":"ContainerStarted","Data":"4e540b110f512e47246a7e1e8d332b0a3a04ba375a44ae46bbb913a91b51ab5a"} Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.915733 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5f97b49ff6-67dbr"] Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.933226 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-76b5758c54-vpp67"] Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.934838 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.939486 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-9b6c99f6d-7zgm5" event={"ID":"ad3d61d7-d777-4115-92c7-e4e3125c5260","Type":"ContainerStarted","Data":"9c7395afa41324a0f82874b3c28b6ce2289ed61f6812ec39a8b7176eb1dd6a99"} Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.940388 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.946225 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6b86699784-tf822" event={"ID":"88ab3ad2-782a-4c21-8104-1b80468dbca0","Type":"ContainerStarted","Data":"92091403be154312d0e01cc88ae4975c8ee84b62f142756e9f0eee0701f6969b"} Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.947931 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.984180 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-internal-tls-certs\") pod \"heat-api-5f97b49ff6-67dbr\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.984224 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-public-tls-certs\") pod \"heat-api-5f97b49ff6-67dbr\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.984310 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-config-data\") pod \"heat-api-5f97b49ff6-67dbr\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.984396 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-combined-ca-bundle\") pod \"heat-api-5f97b49ff6-67dbr\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.984444 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz9q2\" (UniqueName: \"kubernetes.io/projected/0a9180e2-91e9-4063-83a5-5b4ba75ca011-kube-api-access-bz9q2\") pod \"heat-api-5f97b49ff6-67dbr\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.984941 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-config-data-custom\") pod \"heat-api-5f97b49ff6-67dbr\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:09 crc kubenswrapper[4898]: I0313 14:22:09.990816 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-76b5758c54-vpp67"] Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.023486 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556862-mpx4w"] Mar 13 14:22:10 crc kubenswrapper[4898]: W0313 14:22:10.051205 4898 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c85cb04_363e_45d6_a14b_79c249e8f469.slice/crio-f45ac4cfc3eacf3c82e4d4e57524fced35ccc73e61bc0b3e80601e75041e8b40 WatchSource:0}: Error finding container f45ac4cfc3eacf3c82e4d4e57524fced35ccc73e61bc0b3e80601e75041e8b40: Status 404 returned error can't find the container with id f45ac4cfc3eacf3c82e4d4e57524fced35ccc73e61bc0b3e80601e75041e8b40 Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.070569 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5b6c75676b-jx6kl"] Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.087856 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz9q2\" (UniqueName: \"kubernetes.io/projected/0a9180e2-91e9-4063-83a5-5b4ba75ca011-kube-api-access-bz9q2\") pod \"heat-api-5f97b49ff6-67dbr\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.092594 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-internal-tls-certs\") pod \"heat-cfnapi-76b5758c54-vpp67\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.092808 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-config-data-custom\") pod \"heat-cfnapi-76b5758c54-vpp67\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.093008 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-config-data\") pod \"heat-cfnapi-76b5758c54-vpp67\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.095547 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkpj4\" (UniqueName: \"kubernetes.io/projected/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-kube-api-access-tkpj4\") pod \"heat-cfnapi-76b5758c54-vpp67\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.095782 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-config-data-custom\") pod \"heat-api-5f97b49ff6-67dbr\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.095962 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-internal-tls-certs\") pod \"heat-api-5f97b49ff6-67dbr\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.095989 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-public-tls-certs\") pod \"heat-api-5f97b49ff6-67dbr\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.096046 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-combined-ca-bundle\") pod \"heat-cfnapi-76b5758c54-vpp67\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.096120 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-public-tls-certs\") pod \"heat-cfnapi-76b5758c54-vpp67\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.098097 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-config-data\") pod \"heat-api-5f97b49ff6-67dbr\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.098320 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-combined-ca-bundle\") pod \"heat-api-5f97b49ff6-67dbr\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.102463 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-public-tls-certs\") pod \"heat-api-5f97b49ff6-67dbr\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.118098 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz9q2\" (UniqueName: 
\"kubernetes.io/projected/0a9180e2-91e9-4063-83a5-5b4ba75ca011-kube-api-access-bz9q2\") pod \"heat-api-5f97b49ff6-67dbr\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.137404 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=4.039866858 podStartE2EDuration="19.137371682s" podCreationTimestamp="2026-03-13 14:21:51 +0000 UTC" firstStartedPulling="2026-03-13 14:21:52.635438901 +0000 UTC m=+1547.637027140" lastFinishedPulling="2026-03-13 14:22:07.732943725 +0000 UTC m=+1562.734531964" observedRunningTime="2026-03-13 14:22:09.986322241 +0000 UTC m=+1564.987910480" watchObservedRunningTime="2026-03-13 14:22:10.137371682 +0000 UTC m=+1565.138959911" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.178550 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-internal-tls-certs\") pod \"heat-api-5f97b49ff6-67dbr\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.181753 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-xntfr"] Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.187485 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-combined-ca-bundle\") pod \"heat-api-5f97b49ff6-67dbr\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.187647 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-config-data\") pod 
\"heat-api-5f97b49ff6-67dbr\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.187770 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-config-data-custom\") pod \"heat-api-5f97b49ff6-67dbr\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.204857 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-76545f46cd-qk7nm"] Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.206470 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-internal-tls-certs\") pod \"heat-cfnapi-76b5758c54-vpp67\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.206598 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-config-data-custom\") pod \"heat-cfnapi-76b5758c54-vpp67\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.206706 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-config-data\") pod \"heat-cfnapi-76b5758c54-vpp67\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.206831 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-tkpj4\" (UniqueName: \"kubernetes.io/projected/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-kube-api-access-tkpj4\") pod \"heat-cfnapi-76b5758c54-vpp67\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.207014 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-combined-ca-bundle\") pod \"heat-cfnapi-76b5758c54-vpp67\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.207115 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-public-tls-certs\") pod \"heat-cfnapi-76b5758c54-vpp67\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.219465 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-config-data-custom\") pod \"heat-cfnapi-76b5758c54-vpp67\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.225421 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-combined-ca-bundle\") pod \"heat-cfnapi-76b5758c54-vpp67\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.225638 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-config-data\") pod \"heat-cfnapi-76b5758c54-vpp67\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.226070 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-public-tls-certs\") pod \"heat-cfnapi-76b5758c54-vpp67\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.233270 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkpj4\" (UniqueName: \"kubernetes.io/projected/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-kube-api-access-tkpj4\") pod \"heat-cfnapi-76b5758c54-vpp67\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.245067 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-internal-tls-certs\") pod \"heat-cfnapi-76b5758c54-vpp67\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.250772 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.275585 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.640396 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.641443 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-z7ldc" podUID="b38f3681-6f2f-437f-9694-810d43921aa2" containerName="registry-server" probeResult="failure" output=< Mar 13 14:22:10 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 14:22:10 crc kubenswrapper[4898]: > Mar 13 14:22:10 crc kubenswrapper[4898]: W0313 14:22:10.657118 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9a7064c_4ed5_4948_9e7e_7d40794e371e.slice/crio-fae467477196fd229044597b43b86230463815cebabe0d5642c9fe93f9d884e6 WatchSource:0}: Error finding container fae467477196fd229044597b43b86230463815cebabe0d5642c9fe93f9d884e6: Status 404 returned error can't find the container with id fae467477196fd229044597b43b86230463815cebabe0d5642c9fe93f9d884e6 Mar 13 14:22:10 crc kubenswrapper[4898]: I0313 14:22:10.973121 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-76545f46cd-qk7nm" event={"ID":"8c6a61ba-babd-4bc2-922a-99b00c2af057","Type":"ContainerStarted","Data":"703728a9002cd85f33faecdfc398f12cdcb1c38f0ab174f12467daf84a0e062a"} Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:10.981672 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5b6c75676b-jx6kl" event={"ID":"ad94280e-6f02-4129-9cdc-c35499f5d5e4","Type":"ContainerStarted","Data":"75f3f93f088797c8af38650d54ae73681b337d83c946b53b20ea72e33b4509c0"} Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:10.981749 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5b6c75676b-jx6kl" 
event={"ID":"ad94280e-6f02-4129-9cdc-c35499f5d5e4","Type":"ContainerStarted","Data":"49d0d8e35c38306e9d9d2a68f113990c44c74cb3b5a7d200ee672f1ee07d5629"} Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:10.984036 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-5b6c75676b-jx6kl" Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:10.991657 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" event={"ID":"1a57db04-0dc9-4d63-8d08-dd4309b19496","Type":"ContainerStarted","Data":"6cc3f5f6bfa1471670ea1bda49b78865e22324a8edcd3e45cb92e533fe2a84f1"} Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.015508 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-5b6c75676b-jx6kl" podStartSLOduration=5.015469417 podStartE2EDuration="5.015469417s" podCreationTimestamp="2026-03-13 14:22:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:22:11.00249073 +0000 UTC m=+1566.004078989" watchObservedRunningTime="2026-03-13 14:22:11.015469417 +0000 UTC m=+1566.017057656" Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.024739 4898 generic.go:334] "Generic (PLEG): container finished" podID="86c6c495-884b-4c92-949f-0159eb17e6a5" containerID="cea1936f2758016544cbefa24e4ca686c3e33acfdaf019898c501a90320d0242" exitCode=0 Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.024814 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86c6c495-884b-4c92-949f-0159eb17e6a5","Type":"ContainerDied","Data":"cea1936f2758016544cbefa24e4ca686c3e33acfdaf019898c501a90320d0242"} Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.024851 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"86c6c495-884b-4c92-949f-0159eb17e6a5","Type":"ContainerDied","Data":"2b43c112fe6b642ffc81d63835d5208293191491d23e2c20e5ef660540956b7c"} Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.024866 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b43c112fe6b642ffc81d63835d5208293191491d23e2c20e5ef660540956b7c" Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.031389 4898 generic.go:334] "Generic (PLEG): container finished" podID="08964a7d-6cae-4d8d-8dc7-8828bb55c6b6" containerID="1580c7298d78acec59a4cdc1605e81e7b25fff7c075f4562e2ffb1fcb708ba61" exitCode=0 Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.031528 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" event={"ID":"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6","Type":"ContainerDied","Data":"1580c7298d78acec59a4cdc1605e81e7b25fff7c075f4562e2ffb1fcb708ba61"} Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.031558 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" event={"ID":"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6","Type":"ContainerStarted","Data":"c39986f2a0c08da0dd84aef4031cbde75ea3fecd89eca753618403b386b96b49"} Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.045932 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556862-mpx4w" event={"ID":"3c85cb04-363e-45d6-a14b-79c249e8f469","Type":"ContainerStarted","Data":"f45ac4cfc3eacf3c82e4d4e57524fced35ccc73e61bc0b3e80601e75041e8b40"} Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.060351 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6b86699784-tf822" event={"ID":"88ab3ad2-782a-4c21-8104-1b80468dbca0","Type":"ContainerStarted","Data":"99ddc8edc7229de6f1b448d188e98b54d68182339989edb36f76125f198ea2d9"} Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.060998 4898 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/heat-engine-6b86699784-tf822" Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.068793 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a9a7064c-4ed5-4948-9e7e-7d40794e371e","Type":"ContainerStarted","Data":"fae467477196fd229044597b43b86230463815cebabe0d5642c9fe93f9d884e6"} Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.105717 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.120890 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-6b86699784-tf822" podStartSLOduration=12.120863463 podStartE2EDuration="12.120863463s" podCreationTimestamp="2026-03-13 14:21:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:22:11.073636997 +0000 UTC m=+1566.075225246" watchObservedRunningTime="2026-03-13 14:22:11.120863463 +0000 UTC m=+1566.122451702" Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.237292 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-combined-ca-bundle\") pod \"86c6c495-884b-4c92-949f-0159eb17e6a5\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.237464 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-scripts\") pod \"86c6c495-884b-4c92-949f-0159eb17e6a5\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.237494 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-sg-core-conf-yaml\") pod \"86c6c495-884b-4c92-949f-0159eb17e6a5\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.237555 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86c6c495-884b-4c92-949f-0159eb17e6a5-log-httpd\") pod \"86c6c495-884b-4c92-949f-0159eb17e6a5\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.237581 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-config-data\") pod \"86c6c495-884b-4c92-949f-0159eb17e6a5\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.237624 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tvsh\" (UniqueName: \"kubernetes.io/projected/86c6c495-884b-4c92-949f-0159eb17e6a5-kube-api-access-8tvsh\") pod \"86c6c495-884b-4c92-949f-0159eb17e6a5\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.237664 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86c6c495-884b-4c92-949f-0159eb17e6a5-run-httpd\") pod \"86c6c495-884b-4c92-949f-0159eb17e6a5\" (UID: \"86c6c495-884b-4c92-949f-0159eb17e6a5\") " Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.240107 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86c6c495-884b-4c92-949f-0159eb17e6a5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "86c6c495-884b-4c92-949f-0159eb17e6a5" (UID: "86c6c495-884b-4c92-949f-0159eb17e6a5"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.240479 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86c6c495-884b-4c92-949f-0159eb17e6a5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "86c6c495-884b-4c92-949f-0159eb17e6a5" (UID: "86c6c495-884b-4c92-949f-0159eb17e6a5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.242118 4898 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86c6c495-884b-4c92-949f-0159eb17e6a5-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.242598 4898 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86c6c495-884b-4c92-949f-0159eb17e6a5-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.252023 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86c6c495-884b-4c92-949f-0159eb17e6a5-kube-api-access-8tvsh" (OuterVolumeSpecName: "kube-api-access-8tvsh") pod "86c6c495-884b-4c92-949f-0159eb17e6a5" (UID: "86c6c495-884b-4c92-949f-0159eb17e6a5"). InnerVolumeSpecName "kube-api-access-8tvsh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.256140 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-scripts" (OuterVolumeSpecName: "scripts") pod "86c6c495-884b-4c92-949f-0159eb17e6a5" (UID: "86c6c495-884b-4c92-949f-0159eb17e6a5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.264340 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-76b5758c54-vpp67"] Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.307954 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5f97b49ff6-67dbr"] Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.324860 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "86c6c495-884b-4c92-949f-0159eb17e6a5" (UID: "86c6c495-884b-4c92-949f-0159eb17e6a5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.348923 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.348955 4898 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.348964 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tvsh\" (UniqueName: \"kubernetes.io/projected/86c6c495-884b-4c92-949f-0159eb17e6a5-kube-api-access-8tvsh\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.442477 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86c6c495-884b-4c92-949f-0159eb17e6a5" (UID: "86c6c495-884b-4c92-949f-0159eb17e6a5"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.460671 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.507022 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-config-data" (OuterVolumeSpecName: "config-data") pod "86c6c495-884b-4c92-949f-0159eb17e6a5" (UID: "86c6c495-884b-4c92-949f-0159eb17e6a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:11 crc kubenswrapper[4898]: I0313 14:22:11.564321 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86c6c495-884b-4c92-949f-0159eb17e6a5-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.082758 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-76b5758c54-vpp67" event={"ID":"bd18ec2e-1196-4e66-a1c5-9e3daefd7171","Type":"ContainerStarted","Data":"910b4f0096d337a99b11342bf407ba83fa521765a628b5f816cab829a8a6b279"} Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.086229 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a9a7064c-4ed5-4948-9e7e-7d40794e371e","Type":"ContainerStarted","Data":"6b88f7ff52506e35b8cbb7e231ccb1285ae276c1a9d69c02b14ca5282f74157d"} Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.087990 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" event={"ID":"1a57db04-0dc9-4d63-8d08-dd4309b19496","Type":"ContainerStarted","Data":"89d1231d67e96e72d5e3c165afd5a70443483d484b8a2a3b799a394c1438df35"} Mar 13 
14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.088180 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.093028 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5f97b49ff6-67dbr" event={"ID":"0a9180e2-91e9-4063-83a5-5b4ba75ca011","Type":"ContainerStarted","Data":"f8532eef577636e4f0bae3c5d04fb7af834ed690d3b4263f9713e1e00c75cbcb"} Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.097858 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" event={"ID":"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6","Type":"ContainerStarted","Data":"12cbc4c8fa51ad14fc2097d92198ad03764ae0a7349dafd16f0ac9e66b1aa21a"} Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.098635 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.104547 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556862-mpx4w" event={"ID":"3c85cb04-363e-45d6-a14b-79c249e8f469","Type":"ContainerStarted","Data":"0d9a86d7e906b1015484fb8d8fc360af30fd3aeeed5cfca12314a74be70b110a"} Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.105173 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.133217 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" podStartSLOduration=13.133173743 podStartE2EDuration="13.133173743s" podCreationTimestamp="2026-03-13 14:21:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:22:12.116685205 +0000 UTC m=+1567.118273454" watchObservedRunningTime="2026-03-13 14:22:12.133173743 +0000 UTC m=+1567.134761982" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.134966 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556862-mpx4w" podStartSLOduration=11.158422718 podStartE2EDuration="12.134954869s" podCreationTimestamp="2026-03-13 14:22:00 +0000 UTC" firstStartedPulling="2026-03-13 14:22:10.046693458 +0000 UTC m=+1565.048281697" lastFinishedPulling="2026-03-13 14:22:11.023225609 +0000 UTC m=+1566.024813848" observedRunningTime="2026-03-13 14:22:12.132411583 +0000 UTC m=+1567.133999822" watchObservedRunningTime="2026-03-13 14:22:12.134954869 +0000 UTC m=+1567.136543108" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.167589 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" podStartSLOduration=13.167568006 podStartE2EDuration="13.167568006s" podCreationTimestamp="2026-03-13 14:21:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:22:12.153409488 +0000 UTC m=+1567.154997747" watchObservedRunningTime="2026-03-13 14:22:12.167568006 +0000 UTC m=+1567.169156255" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.212322 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:22:12 
crc kubenswrapper[4898]: I0313 14:22:12.248112 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.260737 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:22:12 crc kubenswrapper[4898]: E0313 14:22:12.261295 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86c6c495-884b-4c92-949f-0159eb17e6a5" containerName="ceilometer-central-agent" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.261316 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="86c6c495-884b-4c92-949f-0159eb17e6a5" containerName="ceilometer-central-agent" Mar 13 14:22:12 crc kubenswrapper[4898]: E0313 14:22:12.261329 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86c6c495-884b-4c92-949f-0159eb17e6a5" containerName="proxy-httpd" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.261336 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="86c6c495-884b-4c92-949f-0159eb17e6a5" containerName="proxy-httpd" Mar 13 14:22:12 crc kubenswrapper[4898]: E0313 14:22:12.261351 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86c6c495-884b-4c92-949f-0159eb17e6a5" containerName="ceilometer-notification-agent" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.261357 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="86c6c495-884b-4c92-949f-0159eb17e6a5" containerName="ceilometer-notification-agent" Mar 13 14:22:12 crc kubenswrapper[4898]: E0313 14:22:12.261385 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86c6c495-884b-4c92-949f-0159eb17e6a5" containerName="sg-core" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.261390 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="86c6c495-884b-4c92-949f-0159eb17e6a5" containerName="sg-core" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.261608 4898 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="86c6c495-884b-4c92-949f-0159eb17e6a5" containerName="ceilometer-notification-agent" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.261628 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="86c6c495-884b-4c92-949f-0159eb17e6a5" containerName="sg-core" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.261639 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="86c6c495-884b-4c92-949f-0159eb17e6a5" containerName="proxy-httpd" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.261652 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="86c6c495-884b-4c92-949f-0159eb17e6a5" containerName="ceilometer-central-agent" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.263600 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.266798 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.268569 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.275250 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.307486 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.307548 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjr2t\" (UniqueName: 
\"kubernetes.io/projected/e251995e-609a-4f0e-83f3-7f856e58a598-kube-api-access-hjr2t\") pod \"ceilometer-0\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.307598 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-scripts\") pod \"ceilometer-0\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.307671 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-config-data\") pod \"ceilometer-0\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.307760 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.307787 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e251995e-609a-4f0e-83f3-7f856e58a598-run-httpd\") pod \"ceilometer-0\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.307832 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e251995e-609a-4f0e-83f3-7f856e58a598-log-httpd\") pod \"ceilometer-0\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " 
pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.410674 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.410745 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e251995e-609a-4f0e-83f3-7f856e58a598-run-httpd\") pod \"ceilometer-0\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.410818 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e251995e-609a-4f0e-83f3-7f856e58a598-log-httpd\") pod \"ceilometer-0\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.410958 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.410996 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjr2t\" (UniqueName: \"kubernetes.io/projected/e251995e-609a-4f0e-83f3-7f856e58a598-kube-api-access-hjr2t\") pod \"ceilometer-0\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.411120 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-scripts\") pod \"ceilometer-0\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.411257 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-config-data\") pod \"ceilometer-0\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.411699 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e251995e-609a-4f0e-83f3-7f856e58a598-log-httpd\") pod \"ceilometer-0\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.412178 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e251995e-609a-4f0e-83f3-7f856e58a598-run-httpd\") pod \"ceilometer-0\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.417811 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.422976 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-config-data\") pod \"ceilometer-0\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.426526 4898 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-scripts\") pod \"ceilometer-0\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.431542 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjr2t\" (UniqueName: \"kubernetes.io/projected/e251995e-609a-4f0e-83f3-7f856e58a598-kube-api-access-hjr2t\") pod \"ceilometer-0\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.439353 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " pod="openstack/ceilometer-0" Mar 13 14:22:12 crc kubenswrapper[4898]: I0313 14:22:12.637543 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:22:13 crc kubenswrapper[4898]: I0313 14:22:13.131727 4898 generic.go:334] "Generic (PLEG): container finished" podID="3c85cb04-363e-45d6-a14b-79c249e8f469" containerID="0d9a86d7e906b1015484fb8d8fc360af30fd3aeeed5cfca12314a74be70b110a" exitCode=0 Mar 13 14:22:13 crc kubenswrapper[4898]: I0313 14:22:13.132072 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556862-mpx4w" event={"ID":"3c85cb04-363e-45d6-a14b-79c249e8f469","Type":"ContainerDied","Data":"0d9a86d7e906b1015484fb8d8fc360af30fd3aeeed5cfca12314a74be70b110a"} Mar 13 14:22:13 crc kubenswrapper[4898]: I0313 14:22:13.132447 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:22:13 crc kubenswrapper[4898]: I0313 14:22:13.753432 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86c6c495-884b-4c92-949f-0159eb17e6a5" path="/var/lib/kubelet/pods/86c6c495-884b-4c92-949f-0159eb17e6a5/volumes" Mar 13 14:22:14 crc kubenswrapper[4898]: I0313 14:22:14.711066 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556862-mpx4w" Mar 13 14:22:14 crc kubenswrapper[4898]: I0313 14:22:14.806623 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tmtl\" (UniqueName: \"kubernetes.io/projected/3c85cb04-363e-45d6-a14b-79c249e8f469-kube-api-access-6tmtl\") pod \"3c85cb04-363e-45d6-a14b-79c249e8f469\" (UID: \"3c85cb04-363e-45d6-a14b-79c249e8f469\") " Mar 13 14:22:14 crc kubenswrapper[4898]: I0313 14:22:14.815069 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c85cb04-363e-45d6-a14b-79c249e8f469-kube-api-access-6tmtl" (OuterVolumeSpecName: "kube-api-access-6tmtl") pod "3c85cb04-363e-45d6-a14b-79c249e8f469" (UID: "3c85cb04-363e-45d6-a14b-79c249e8f469"). 
InnerVolumeSpecName "kube-api-access-6tmtl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:22:14 crc kubenswrapper[4898]: I0313 14:22:14.909629 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tmtl\" (UniqueName: \"kubernetes.io/projected/3c85cb04-363e-45d6-a14b-79c249e8f469-kube-api-access-6tmtl\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.125967 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f97c64464-wmnph" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.181278 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5978dd6d84-pnknr" event={"ID":"6f42d66e-f331-4c05-a4fb-d6208b4493fb","Type":"ContainerStarted","Data":"8f11edb4f03060d32d47258f0f26105368d5a72b3961df0611efa132599db56c"} Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.181382 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-5978dd6d84-pnknr" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.216380 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-httpd-config\") pod \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\" (UID: \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\") " Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.218962 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-759d64ffd4-kzp67" event={"ID":"e53d1b61-e0c8-4c10-85bf-1c0f67009a24","Type":"ContainerStarted","Data":"c6cbd243a4ab0ae3ee88d2b14b07e0b9c8fda594949edb73fc92248b8f25ddf9"} Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.219113 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-759d64ffd4-kzp67" podUID="e53d1b61-e0c8-4c10-85bf-1c0f67009a24" containerName="heat-cfnapi" 
containerID="cri-o://c6cbd243a4ab0ae3ee88d2b14b07e0b9c8fda594949edb73fc92248b8f25ddf9" gracePeriod=60 Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.219288 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-759d64ffd4-kzp67" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.216510 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-ovndb-tls-certs\") pod \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\" (UID: \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\") " Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.236857 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-combined-ca-bundle\") pod \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\" (UID: \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\") " Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.242115 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rskd\" (UniqueName: \"kubernetes.io/projected/61f1f8bf-63eb-464c-9703-3d3db80ba0df-kube-api-access-6rskd\") pod \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\" (UID: \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\") " Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.242180 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-config\") pod \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\" (UID: \"61f1f8bf-63eb-464c-9703-3d3db80ba0df\") " Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.262442 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-9b6c99f6d-7zgm5" podUID="ad3d61d7-d777-4115-92c7-e4e3125c5260" containerName="heat-api" 
containerID="cri-o://954199ed87793fe013823cc99558f408a84d0a4c4745073ecc940b8eedc022cd" gracePeriod=60 Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.262613 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-9b6c99f6d-7zgm5" event={"ID":"ad3d61d7-d777-4115-92c7-e4e3125c5260","Type":"ContainerStarted","Data":"954199ed87793fe013823cc99558f408a84d0a4c4745073ecc940b8eedc022cd"} Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.262673 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-9b6c99f6d-7zgm5" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.280135 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556862-mpx4w" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.281164 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556862-mpx4w" event={"ID":"3c85cb04-363e-45d6-a14b-79c249e8f469","Type":"ContainerDied","Data":"f45ac4cfc3eacf3c82e4d4e57524fced35ccc73e61bc0b3e80601e75041e8b40"} Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.281207 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f45ac4cfc3eacf3c82e4d4e57524fced35ccc73e61bc0b3e80601e75041e8b40" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.283620 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-5978dd6d84-pnknr" podStartSLOduration=4.185198646 podStartE2EDuration="9.283604597s" podCreationTimestamp="2026-03-13 14:22:06 +0000 UTC" firstStartedPulling="2026-03-13 14:22:09.395585667 +0000 UTC m=+1564.397173906" lastFinishedPulling="2026-03-13 14:22:14.493991618 +0000 UTC m=+1569.495579857" observedRunningTime="2026-03-13 14:22:15.217288115 +0000 UTC m=+1570.218876354" watchObservedRunningTime="2026-03-13 14:22:15.283604597 +0000 UTC m=+1570.285192826" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 
14:22:15.294468 4898 generic.go:334] "Generic (PLEG): container finished" podID="61f1f8bf-63eb-464c-9703-3d3db80ba0df" containerID="6c47c58bce6d5a27ce8f0a9ec972dc3740b57217b054bd621f4319ad64cb9ded" exitCode=0 Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.294528 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f97c64464-wmnph" event={"ID":"61f1f8bf-63eb-464c-9703-3d3db80ba0df","Type":"ContainerDied","Data":"6c47c58bce6d5a27ce8f0a9ec972dc3740b57217b054bd621f4319ad64cb9ded"} Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.294564 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f97c64464-wmnph" event={"ID":"61f1f8bf-63eb-464c-9703-3d3db80ba0df","Type":"ContainerDied","Data":"0ed18fce368036b02906d995e0994f8f4a7d0e035afb43b24689faf9dc556daf"} Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.294588 4898 scope.go:117] "RemoveContainer" containerID="fc322fcfa0d128d1cbcd5fd7cc8972df0d6b4b808a700c1468ceb21cb71602e4" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.294803 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f97c64464-wmnph" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.296206 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61f1f8bf-63eb-464c-9703-3d3db80ba0df-kube-api-access-6rskd" (OuterVolumeSpecName: "kube-api-access-6rskd") pod "61f1f8bf-63eb-464c-9703-3d3db80ba0df" (UID: "61f1f8bf-63eb-464c-9703-3d3db80ba0df"). InnerVolumeSpecName "kube-api-access-6rskd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.313536 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "61f1f8bf-63eb-464c-9703-3d3db80ba0df" (UID: "61f1f8bf-63eb-464c-9703-3d3db80ba0df"). 
InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.346319 4898 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.346349 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rskd\" (UniqueName: \"kubernetes.io/projected/61f1f8bf-63eb-464c-9703-3d3db80ba0df-kube-api-access-6rskd\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.357795 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.419772 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556856-z8p4g"] Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.450808 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556856-z8p4g"] Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.470994 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-759d64ffd4-kzp67" podStartSLOduration=11.430071402 podStartE2EDuration="16.470976701s" podCreationTimestamp="2026-03-13 14:21:59 +0000 UTC" firstStartedPulling="2026-03-13 14:22:09.418209974 +0000 UTC m=+1564.419798213" lastFinishedPulling="2026-03-13 14:22:14.459115273 +0000 UTC m=+1569.460703512" observedRunningTime="2026-03-13 14:22:15.268616537 +0000 UTC m=+1570.270204776" watchObservedRunningTime="2026-03-13 14:22:15.470976701 +0000 UTC m=+1570.472564940" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.491192 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-9b6c99f6d-7zgm5" podStartSLOduration=10.818390143 
podStartE2EDuration="16.491169075s" podCreationTimestamp="2026-03-13 14:21:59 +0000 UTC" firstStartedPulling="2026-03-13 14:22:08.786529196 +0000 UTC m=+1563.788117435" lastFinishedPulling="2026-03-13 14:22:14.459308118 +0000 UTC m=+1569.460896367" observedRunningTime="2026-03-13 14:22:15.288497374 +0000 UTC m=+1570.290085613" watchObservedRunningTime="2026-03-13 14:22:15.491169075 +0000 UTC m=+1570.492757314" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.638216 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "61f1f8bf-63eb-464c-9703-3d3db80ba0df" (UID: "61f1f8bf-63eb-464c-9703-3d3db80ba0df"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.657007 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.723650 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-config" (OuterVolumeSpecName: "config") pod "61f1f8bf-63eb-464c-9703-3d3db80ba0df" (UID: "61f1f8bf-63eb-464c-9703-3d3db80ba0df"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.760235 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.775535 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b81468c-e1ac-4515-837d-993e3c5108c9" path="/var/lib/kubelet/pods/5b81468c-e1ac-4515-837d-993e3c5108c9/volumes" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.933770 4898 scope.go:117] "RemoveContainer" containerID="6c47c58bce6d5a27ce8f0a9ec972dc3740b57217b054bd621f4319ad64cb9ded" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.941680 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "61f1f8bf-63eb-464c-9703-3d3db80ba0df" (UID: "61f1f8bf-63eb-464c-9703-3d3db80ba0df"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:15 crc kubenswrapper[4898]: I0313 14:22:15.967742 4898 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/61f1f8bf-63eb-464c-9703-3d3db80ba0df-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.022810 4898 scope.go:117] "RemoveContainer" containerID="fc322fcfa0d128d1cbcd5fd7cc8972df0d6b4b808a700c1468ceb21cb71602e4" Mar 13 14:22:16 crc kubenswrapper[4898]: E0313 14:22:16.023276 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc322fcfa0d128d1cbcd5fd7cc8972df0d6b4b808a700c1468ceb21cb71602e4\": container with ID starting with fc322fcfa0d128d1cbcd5fd7cc8972df0d6b4b808a700c1468ceb21cb71602e4 not found: ID does not exist" containerID="fc322fcfa0d128d1cbcd5fd7cc8972df0d6b4b808a700c1468ceb21cb71602e4" Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.023319 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc322fcfa0d128d1cbcd5fd7cc8972df0d6b4b808a700c1468ceb21cb71602e4"} err="failed to get container status \"fc322fcfa0d128d1cbcd5fd7cc8972df0d6b4b808a700c1468ceb21cb71602e4\": rpc error: code = NotFound desc = could not find container \"fc322fcfa0d128d1cbcd5fd7cc8972df0d6b4b808a700c1468ceb21cb71602e4\": container with ID starting with fc322fcfa0d128d1cbcd5fd7cc8972df0d6b4b808a700c1468ceb21cb71602e4 not found: ID does not exist" Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.023339 4898 scope.go:117] "RemoveContainer" containerID="6c47c58bce6d5a27ce8f0a9ec972dc3740b57217b054bd621f4319ad64cb9ded" Mar 13 14:22:16 crc kubenswrapper[4898]: E0313 14:22:16.023532 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c47c58bce6d5a27ce8f0a9ec972dc3740b57217b054bd621f4319ad64cb9ded\": 
container with ID starting with 6c47c58bce6d5a27ce8f0a9ec972dc3740b57217b054bd621f4319ad64cb9ded not found: ID does not exist" containerID="6c47c58bce6d5a27ce8f0a9ec972dc3740b57217b054bd621f4319ad64cb9ded" Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.023556 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c47c58bce6d5a27ce8f0a9ec972dc3740b57217b054bd621f4319ad64cb9ded"} err="failed to get container status \"6c47c58bce6d5a27ce8f0a9ec972dc3740b57217b054bd621f4319ad64cb9ded\": rpc error: code = NotFound desc = could not find container \"6c47c58bce6d5a27ce8f0a9ec972dc3740b57217b054bd621f4319ad64cb9ded\": container with ID starting with 6c47c58bce6d5a27ce8f0a9ec972dc3740b57217b054bd621f4319ad64cb9ded not found: ID does not exist" Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.309482 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5f97b49ff6-67dbr" event={"ID":"0a9180e2-91e9-4063-83a5-5b4ba75ca011","Type":"ContainerStarted","Data":"357d95fc5d8e0d7678f3ea6e54b764f618b159ca6eaa129aa3499f515c52ea42"} Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.309863 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.311599 4898 generic.go:334] "Generic (PLEG): container finished" podID="6f42d66e-f331-4c05-a4fb-d6208b4493fb" containerID="8f11edb4f03060d32d47258f0f26105368d5a72b3961df0611efa132599db56c" exitCode=1 Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.311831 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5978dd6d84-pnknr" event={"ID":"6f42d66e-f331-4c05-a4fb-d6208b4493fb","Type":"ContainerDied","Data":"8f11edb4f03060d32d47258f0f26105368d5a72b3961df0611efa132599db56c"} Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.312354 4898 scope.go:117] "RemoveContainer" 
containerID="8f11edb4f03060d32d47258f0f26105368d5a72b3961df0611efa132599db56c" Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.315767 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-76b5758c54-vpp67" event={"ID":"bd18ec2e-1196-4e66-a1c5-9e3daefd7171","Type":"ContainerStarted","Data":"458d14b015605aca08fe7cf01f1621cdf8583aae425994a9a4ecae32ee37064e"} Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.315889 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.323442 4898 generic.go:334] "Generic (PLEG): container finished" podID="8c6a61ba-babd-4bc2-922a-99b00c2af057" containerID="ef85f89a78107532e3b7ef83d63a9d6a9985b54d2525db305b02b87cede351c6" exitCode=1 Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.323496 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-76545f46cd-qk7nm" event={"ID":"8c6a61ba-babd-4bc2-922a-99b00c2af057","Type":"ContainerDied","Data":"ef85f89a78107532e3b7ef83d63a9d6a9985b54d2525db305b02b87cede351c6"} Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.324309 4898 scope.go:117] "RemoveContainer" containerID="ef85f89a78107532e3b7ef83d63a9d6a9985b54d2525db305b02b87cede351c6" Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.328488 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e251995e-609a-4f0e-83f3-7f856e58a598","Type":"ContainerStarted","Data":"eb628c158bccb3144900d410778ea134ed3ea0eddc185afc79f0a4381f9e188c"} Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.407542 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-5f97b49ff6-67dbr" podStartSLOduration=4.169061834 podStartE2EDuration="7.407520643s" podCreationTimestamp="2026-03-13 14:22:09 +0000 UTC" firstStartedPulling="2026-03-13 14:22:11.289599424 +0000 UTC m=+1566.291187663" 
lastFinishedPulling="2026-03-13 14:22:14.528058233 +0000 UTC m=+1569.529646472" observedRunningTime="2026-03-13 14:22:16.345602236 +0000 UTC m=+1571.347190475" watchObservedRunningTime="2026-03-13 14:22:16.407520643 +0000 UTC m=+1571.409108882" Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.443379 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-76b5758c54-vpp67" podStartSLOduration=4.208154039 podStartE2EDuration="7.443145928s" podCreationTimestamp="2026-03-13 14:22:09 +0000 UTC" firstStartedPulling="2026-03-13 14:22:11.315062165 +0000 UTC m=+1566.316650404" lastFinishedPulling="2026-03-13 14:22:14.550054054 +0000 UTC m=+1569.551642293" observedRunningTime="2026-03-13 14:22:16.396313442 +0000 UTC m=+1571.397901691" watchObservedRunningTime="2026-03-13 14:22:16.443145928 +0000 UTC m=+1571.444734167" Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.475433 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f97c64464-wmnph"] Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.486557 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-f97c64464-wmnph"] Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.854727 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-5978dd6d84-pnknr" Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.875998 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-76545f46cd-qk7nm" Mar 13 14:22:16 crc kubenswrapper[4898]: I0313 14:22:16.876050 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-76545f46cd-qk7nm" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.341495 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"a9a7064c-4ed5-4948-9e7e-7d40794e371e","Type":"ContainerStarted","Data":"715462d9bf65ff0e13b9e6ca02779b8f60f62b7fdf58a8171a568c3f23c25ccb"} Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.344892 4898 generic.go:334] "Generic (PLEG): container finished" podID="8c6a61ba-babd-4bc2-922a-99b00c2af057" containerID="ee395bb97804b9c181c98691f0ac177fe6c008f2b77b6a0f5cb7a636bfe0789a" exitCode=1 Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.345069 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-76545f46cd-qk7nm" event={"ID":"8c6a61ba-babd-4bc2-922a-99b00c2af057","Type":"ContainerDied","Data":"ee395bb97804b9c181c98691f0ac177fe6c008f2b77b6a0f5cb7a636bfe0789a"} Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.345107 4898 scope.go:117] "RemoveContainer" containerID="ef85f89a78107532e3b7ef83d63a9d6a9985b54d2525db305b02b87cede351c6" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.345998 4898 scope.go:117] "RemoveContainer" containerID="ee395bb97804b9c181c98691f0ac177fe6c008f2b77b6a0f5cb7a636bfe0789a" Mar 13 14:22:17 crc kubenswrapper[4898]: E0313 14:22:17.346251 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-76545f46cd-qk7nm_openstack(8c6a61ba-babd-4bc2-922a-99b00c2af057)\"" pod="openstack/heat-cfnapi-76545f46cd-qk7nm" podUID="8c6a61ba-babd-4bc2-922a-99b00c2af057" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.350417 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e251995e-609a-4f0e-83f3-7f856e58a598","Type":"ContainerStarted","Data":"8aff6294a407bae6a2eb1e2dc4f0f935834538560ddb004e22ea984aea78200b"} Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.357987 4898 generic.go:334] "Generic (PLEG): container finished" podID="6f42d66e-f331-4c05-a4fb-d6208b4493fb" 
containerID="d08fae9298b9b14b84a2c4d5726d6db0f8d8453e2834240ede2ff7d43ec4ebb3" exitCode=1 Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.359321 4898 scope.go:117] "RemoveContainer" containerID="d08fae9298b9b14b84a2c4d5726d6db0f8d8453e2834240ede2ff7d43ec4ebb3" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.359510 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5978dd6d84-pnknr" event={"ID":"6f42d66e-f331-4c05-a4fb-d6208b4493fb","Type":"ContainerDied","Data":"d08fae9298b9b14b84a2c4d5726d6db0f8d8453e2834240ede2ff7d43ec4ebb3"} Mar 13 14:22:17 crc kubenswrapper[4898]: E0313 14:22:17.359632 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-5978dd6d84-pnknr_openstack(6f42d66e-f331-4c05-a4fb-d6208b4493fb)\"" pod="openstack/heat-api-5978dd6d84-pnknr" podUID="6f42d66e-f331-4c05-a4fb-d6208b4493fb" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.380934 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=8.3808735 podStartE2EDuration="8.3808735s" podCreationTimestamp="2026-03-13 14:22:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:22:17.37548816 +0000 UTC m=+1572.377076429" watchObservedRunningTime="2026-03-13 14:22:17.3808735 +0000 UTC m=+1572.382461759" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.482270 4898 scope.go:117] "RemoveContainer" containerID="8f11edb4f03060d32d47258f0f26105368d5a72b3961df0611efa132599db56c" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.737213 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-sqtw8"] Mar 13 14:22:17 crc kubenswrapper[4898]: E0313 14:22:17.737712 4898 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="61f1f8bf-63eb-464c-9703-3d3db80ba0df" containerName="neutron-httpd" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.737729 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="61f1f8bf-63eb-464c-9703-3d3db80ba0df" containerName="neutron-httpd" Mar 13 14:22:17 crc kubenswrapper[4898]: E0313 14:22:17.737770 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c85cb04-363e-45d6-a14b-79c249e8f469" containerName="oc" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.737776 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c85cb04-363e-45d6-a14b-79c249e8f469" containerName="oc" Mar 13 14:22:17 crc kubenswrapper[4898]: E0313 14:22:17.737785 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61f1f8bf-63eb-464c-9703-3d3db80ba0df" containerName="neutron-api" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.737790 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="61f1f8bf-63eb-464c-9703-3d3db80ba0df" containerName="neutron-api" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.738021 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="61f1f8bf-63eb-464c-9703-3d3db80ba0df" containerName="neutron-httpd" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.738046 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c85cb04-363e-45d6-a14b-79c249e8f469" containerName="oc" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.738059 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="61f1f8bf-63eb-464c-9703-3d3db80ba0df" containerName="neutron-api" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.738822 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-sqtw8" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.757018 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61f1f8bf-63eb-464c-9703-3d3db80ba0df" path="/var/lib/kubelet/pods/61f1f8bf-63eb-464c-9703-3d3db80ba0df/volumes" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.760769 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-sqtw8"] Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.815941 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-qlgbl"] Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.817546 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qlgbl" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.842980 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-qlgbl"] Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.901258 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-978d-account-create-update-tkc22"] Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.902871 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-978d-account-create-update-tkc22" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.912289 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-978d-account-create-update-tkc22"] Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.912679 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.924189 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/068b0856-126d-487c-9c1d-50299bf90d3a-operator-scripts\") pod \"nova-api-db-create-sqtw8\" (UID: \"068b0856-126d-487c-9c1d-50299bf90d3a\") " pod="openstack/nova-api-db-create-sqtw8" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.924264 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw6kd\" (UniqueName: \"kubernetes.io/projected/068b0856-126d-487c-9c1d-50299bf90d3a-kube-api-access-nw6kd\") pod \"nova-api-db-create-sqtw8\" (UID: \"068b0856-126d-487c-9c1d-50299bf90d3a\") " pod="openstack/nova-api-db-create-sqtw8" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.924355 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44f1f531-99d1-4b97-bd08-6bf94a7afd92-operator-scripts\") pod \"nova-cell0-db-create-qlgbl\" (UID: \"44f1f531-99d1-4b97-bd08-6bf94a7afd92\") " pod="openstack/nova-cell0-db-create-qlgbl" Mar 13 14:22:17 crc kubenswrapper[4898]: I0313 14:22:17.924376 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4bb2\" (UniqueName: \"kubernetes.io/projected/44f1f531-99d1-4b97-bd08-6bf94a7afd92-kube-api-access-m4bb2\") pod \"nova-cell0-db-create-qlgbl\" (UID: 
\"44f1f531-99d1-4b97-bd08-6bf94a7afd92\") " pod="openstack/nova-cell0-db-create-qlgbl" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.003375 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-zhx84"] Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.004854 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-zhx84" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.012732 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-zhx84"] Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.027004 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw6kd\" (UniqueName: \"kubernetes.io/projected/068b0856-126d-487c-9c1d-50299bf90d3a-kube-api-access-nw6kd\") pod \"nova-api-db-create-sqtw8\" (UID: \"068b0856-126d-487c-9c1d-50299bf90d3a\") " pod="openstack/nova-api-db-create-sqtw8" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.027107 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44f1f531-99d1-4b97-bd08-6bf94a7afd92-operator-scripts\") pod \"nova-cell0-db-create-qlgbl\" (UID: \"44f1f531-99d1-4b97-bd08-6bf94a7afd92\") " pod="openstack/nova-cell0-db-create-qlgbl" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.027127 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4bb2\" (UniqueName: \"kubernetes.io/projected/44f1f531-99d1-4b97-bd08-6bf94a7afd92-kube-api-access-m4bb2\") pod \"nova-cell0-db-create-qlgbl\" (UID: \"44f1f531-99d1-4b97-bd08-6bf94a7afd92\") " pod="openstack/nova-cell0-db-create-qlgbl" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.027196 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/29dbeb8a-611d-4513-a063-06d8f865ea93-operator-scripts\") pod \"nova-api-978d-account-create-update-tkc22\" (UID: \"29dbeb8a-611d-4513-a063-06d8f865ea93\") " pod="openstack/nova-api-978d-account-create-update-tkc22" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.027263 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkf6x\" (UniqueName: \"kubernetes.io/projected/29dbeb8a-611d-4513-a063-06d8f865ea93-kube-api-access-bkf6x\") pod \"nova-api-978d-account-create-update-tkc22\" (UID: \"29dbeb8a-611d-4513-a063-06d8f865ea93\") " pod="openstack/nova-api-978d-account-create-update-tkc22" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.027326 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/068b0856-126d-487c-9c1d-50299bf90d3a-operator-scripts\") pod \"nova-api-db-create-sqtw8\" (UID: \"068b0856-126d-487c-9c1d-50299bf90d3a\") " pod="openstack/nova-api-db-create-sqtw8" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.028034 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/068b0856-126d-487c-9c1d-50299bf90d3a-operator-scripts\") pod \"nova-api-db-create-sqtw8\" (UID: \"068b0856-126d-487c-9c1d-50299bf90d3a\") " pod="openstack/nova-api-db-create-sqtw8" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.029488 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44f1f531-99d1-4b97-bd08-6bf94a7afd92-operator-scripts\") pod \"nova-cell0-db-create-qlgbl\" (UID: \"44f1f531-99d1-4b97-bd08-6bf94a7afd92\") " pod="openstack/nova-cell0-db-create-qlgbl" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.050075 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4bb2\" 
(UniqueName: \"kubernetes.io/projected/44f1f531-99d1-4b97-bd08-6bf94a7afd92-kube-api-access-m4bb2\") pod \"nova-cell0-db-create-qlgbl\" (UID: \"44f1f531-99d1-4b97-bd08-6bf94a7afd92\") " pod="openstack/nova-cell0-db-create-qlgbl" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.052349 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw6kd\" (UniqueName: \"kubernetes.io/projected/068b0856-126d-487c-9c1d-50299bf90d3a-kube-api-access-nw6kd\") pod \"nova-api-db-create-sqtw8\" (UID: \"068b0856-126d-487c-9c1d-50299bf90d3a\") " pod="openstack/nova-api-db-create-sqtw8" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.063648 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-sqtw8" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.123252 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-b9c7-account-create-update-l6h97"] Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.125939 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-b9c7-account-create-update-l6h97" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.128013 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.131107 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1abedb18-bf27-42d9-b809-f7226b603a0d-operator-scripts\") pod \"nova-cell1-db-create-zhx84\" (UID: \"1abedb18-bf27-42d9-b809-f7226b603a0d\") " pod="openstack/nova-cell1-db-create-zhx84" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.131241 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8s4j\" (UniqueName: \"kubernetes.io/projected/1abedb18-bf27-42d9-b809-f7226b603a0d-kube-api-access-s8s4j\") pod \"nova-cell1-db-create-zhx84\" (UID: \"1abedb18-bf27-42d9-b809-f7226b603a0d\") " pod="openstack/nova-cell1-db-create-zhx84" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.131340 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29dbeb8a-611d-4513-a063-06d8f865ea93-operator-scripts\") pod \"nova-api-978d-account-create-update-tkc22\" (UID: \"29dbeb8a-611d-4513-a063-06d8f865ea93\") " pod="openstack/nova-api-978d-account-create-update-tkc22" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.131409 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkf6x\" (UniqueName: \"kubernetes.io/projected/29dbeb8a-611d-4513-a063-06d8f865ea93-kube-api-access-bkf6x\") pod \"nova-api-978d-account-create-update-tkc22\" (UID: \"29dbeb8a-611d-4513-a063-06d8f865ea93\") " pod="openstack/nova-api-978d-account-create-update-tkc22" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.132515 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29dbeb8a-611d-4513-a063-06d8f865ea93-operator-scripts\") pod \"nova-api-978d-account-create-update-tkc22\" (UID: \"29dbeb8a-611d-4513-a063-06d8f865ea93\") " pod="openstack/nova-api-978d-account-create-update-tkc22" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.141731 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-b9c7-account-create-update-l6h97"] Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.166689 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qlgbl" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.181442 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkf6x\" (UniqueName: \"kubernetes.io/projected/29dbeb8a-611d-4513-a063-06d8f865ea93-kube-api-access-bkf6x\") pod \"nova-api-978d-account-create-update-tkc22\" (UID: \"29dbeb8a-611d-4513-a063-06d8f865ea93\") " pod="openstack/nova-api-978d-account-create-update-tkc22" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.229696 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-978d-account-create-update-tkc22" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.233414 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzzw4\" (UniqueName: \"kubernetes.io/projected/e516311e-fb5c-4901-aaf7-67793ffb5fa2-kube-api-access-mzzw4\") pod \"nova-cell0-b9c7-account-create-update-l6h97\" (UID: \"e516311e-fb5c-4901-aaf7-67793ffb5fa2\") " pod="openstack/nova-cell0-b9c7-account-create-update-l6h97" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.233465 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8s4j\" (UniqueName: \"kubernetes.io/projected/1abedb18-bf27-42d9-b809-f7226b603a0d-kube-api-access-s8s4j\") pod \"nova-cell1-db-create-zhx84\" (UID: \"1abedb18-bf27-42d9-b809-f7226b603a0d\") " pod="openstack/nova-cell1-db-create-zhx84" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.233540 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e516311e-fb5c-4901-aaf7-67793ffb5fa2-operator-scripts\") pod \"nova-cell0-b9c7-account-create-update-l6h97\" (UID: \"e516311e-fb5c-4901-aaf7-67793ffb5fa2\") " pod="openstack/nova-cell0-b9c7-account-create-update-l6h97" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.233622 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1abedb18-bf27-42d9-b809-f7226b603a0d-operator-scripts\") pod \"nova-cell1-db-create-zhx84\" (UID: \"1abedb18-bf27-42d9-b809-f7226b603a0d\") " pod="openstack/nova-cell1-db-create-zhx84" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.238946 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/1abedb18-bf27-42d9-b809-f7226b603a0d-operator-scripts\") pod \"nova-cell1-db-create-zhx84\" (UID: \"1abedb18-bf27-42d9-b809-f7226b603a0d\") " pod="openstack/nova-cell1-db-create-zhx84" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.253946 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8s4j\" (UniqueName: \"kubernetes.io/projected/1abedb18-bf27-42d9-b809-f7226b603a0d-kube-api-access-s8s4j\") pod \"nova-cell1-db-create-zhx84\" (UID: \"1abedb18-bf27-42d9-b809-f7226b603a0d\") " pod="openstack/nova-cell1-db-create-zhx84" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.320258 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-96c4-account-create-update-zsh5t"] Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.321747 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-96c4-account-create-update-zsh5t" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.332219 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.337469 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzzw4\" (UniqueName: \"kubernetes.io/projected/e516311e-fb5c-4901-aaf7-67793ffb5fa2-kube-api-access-mzzw4\") pod \"nova-cell0-b9c7-account-create-update-l6h97\" (UID: \"e516311e-fb5c-4901-aaf7-67793ffb5fa2\") " pod="openstack/nova-cell0-b9c7-account-create-update-l6h97" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.337621 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e516311e-fb5c-4901-aaf7-67793ffb5fa2-operator-scripts\") pod \"nova-cell0-b9c7-account-create-update-l6h97\" (UID: \"e516311e-fb5c-4901-aaf7-67793ffb5fa2\") " pod="openstack/nova-cell0-b9c7-account-create-update-l6h97" 
Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.360698 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e516311e-fb5c-4901-aaf7-67793ffb5fa2-operator-scripts\") pod \"nova-cell0-b9c7-account-create-update-l6h97\" (UID: \"e516311e-fb5c-4901-aaf7-67793ffb5fa2\") " pod="openstack/nova-cell0-b9c7-account-create-update-l6h97" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.372521 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzzw4\" (UniqueName: \"kubernetes.io/projected/e516311e-fb5c-4901-aaf7-67793ffb5fa2-kube-api-access-mzzw4\") pod \"nova-cell0-b9c7-account-create-update-l6h97\" (UID: \"e516311e-fb5c-4901-aaf7-67793ffb5fa2\") " pod="openstack/nova-cell0-b9c7-account-create-update-l6h97" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.386513 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-96c4-account-create-update-zsh5t"] Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.394591 4898 scope.go:117] "RemoveContainer" containerID="ee395bb97804b9c181c98691f0ac177fe6c008f2b77b6a0f5cb7a636bfe0789a" Mar 13 14:22:18 crc kubenswrapper[4898]: E0313 14:22:18.394948 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-76545f46cd-qk7nm_openstack(8c6a61ba-babd-4bc2-922a-99b00c2af057)\"" pod="openstack/heat-cfnapi-76545f46cd-qk7nm" podUID="8c6a61ba-babd-4bc2-922a-99b00c2af057" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.401520 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e251995e-609a-4f0e-83f3-7f856e58a598","Type":"ContainerStarted","Data":"d8d1f76b83bf115de46d0b110c2cf03b3ebde454f4675c453b288cf8a3d3f58a"} Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.415630 4898 
scope.go:117] "RemoveContainer" containerID="d08fae9298b9b14b84a2c4d5726d6db0f8d8453e2834240ede2ff7d43ec4ebb3" Mar 13 14:22:18 crc kubenswrapper[4898]: E0313 14:22:18.415834 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-5978dd6d84-pnknr_openstack(6f42d66e-f331-4c05-a4fb-d6208b4493fb)\"" pod="openstack/heat-api-5978dd6d84-pnknr" podUID="6f42d66e-f331-4c05-a4fb-d6208b4493fb" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.441298 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/485200a5-cd75-45ac-b93a-b003158132c4-operator-scripts\") pod \"nova-cell1-96c4-account-create-update-zsh5t\" (UID: \"485200a5-cd75-45ac-b93a-b003158132c4\") " pod="openstack/nova-cell1-96c4-account-create-update-zsh5t" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.441436 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w42dl\" (UniqueName: \"kubernetes.io/projected/485200a5-cd75-45ac-b93a-b003158132c4-kube-api-access-w42dl\") pod \"nova-cell1-96c4-account-create-update-zsh5t\" (UID: \"485200a5-cd75-45ac-b93a-b003158132c4\") " pod="openstack/nova-cell1-96c4-account-create-update-zsh5t" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.508068 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-zhx84" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.518293 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-b9c7-account-create-update-l6h97" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.543493 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/485200a5-cd75-45ac-b93a-b003158132c4-operator-scripts\") pod \"nova-cell1-96c4-account-create-update-zsh5t\" (UID: \"485200a5-cd75-45ac-b93a-b003158132c4\") " pod="openstack/nova-cell1-96c4-account-create-update-zsh5t" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.543625 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w42dl\" (UniqueName: \"kubernetes.io/projected/485200a5-cd75-45ac-b93a-b003158132c4-kube-api-access-w42dl\") pod \"nova-cell1-96c4-account-create-update-zsh5t\" (UID: \"485200a5-cd75-45ac-b93a-b003158132c4\") " pod="openstack/nova-cell1-96c4-account-create-update-zsh5t" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.548154 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/485200a5-cd75-45ac-b93a-b003158132c4-operator-scripts\") pod \"nova-cell1-96c4-account-create-update-zsh5t\" (UID: \"485200a5-cd75-45ac-b93a-b003158132c4\") " pod="openstack/nova-cell1-96c4-account-create-update-zsh5t" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.567997 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w42dl\" (UniqueName: \"kubernetes.io/projected/485200a5-cd75-45ac-b93a-b003158132c4-kube-api-access-w42dl\") pod \"nova-cell1-96c4-account-create-update-zsh5t\" (UID: \"485200a5-cd75-45ac-b93a-b003158132c4\") " pod="openstack/nova-cell1-96c4-account-create-update-zsh5t" Mar 13 14:22:18 crc kubenswrapper[4898]: I0313 14:22:18.832933 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-96c4-account-create-update-zsh5t" Mar 13 14:22:19 crc kubenswrapper[4898]: I0313 14:22:19.134000 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:22:19 crc kubenswrapper[4898]: I0313 14:22:19.134054 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:22:19 crc kubenswrapper[4898]: I0313 14:22:19.134096 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 14:22:19 crc kubenswrapper[4898]: I0313 14:22:19.134936 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc"} pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 14:22:19 crc kubenswrapper[4898]: I0313 14:22:19.134982 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" containerID="cri-o://31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" gracePeriod=600 Mar 13 14:22:19 crc kubenswrapper[4898]: E0313 14:22:19.276945 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:22:19 crc kubenswrapper[4898]: I0313 14:22:19.429656 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e251995e-609a-4f0e-83f3-7f856e58a598","Type":"ContainerStarted","Data":"003d77a3bae8b5d30451a8b2b210d256e8f645cab42759f96a2cca43d38b49a5"} Mar 13 14:22:19 crc kubenswrapper[4898]: I0313 14:22:19.445958 4898 generic.go:334] "Generic (PLEG): container finished" podID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" exitCode=0 Mar 13 14:22:19 crc kubenswrapper[4898]: I0313 14:22:19.446056 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerDied","Data":"31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc"} Mar 13 14:22:19 crc kubenswrapper[4898]: I0313 14:22:19.446102 4898 scope.go:117] "RemoveContainer" containerID="37bdbe6f1a65f1530746827b4e6d1dd1ce95edb9a913051fc8fca9a782787e56" Mar 13 14:22:19 crc kubenswrapper[4898]: I0313 14:22:19.447240 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" Mar 13 14:22:19 crc kubenswrapper[4898]: E0313 14:22:19.447723 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:22:19 crc kubenswrapper[4898]: I0313 14:22:19.509500 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-sqtw8"] Mar 13 14:22:19 crc kubenswrapper[4898]: W0313 14:22:19.577482 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod068b0856_126d_487c_9c1d_50299bf90d3a.slice/crio-52a2cbda59b7d4003f96766e7ac8026a8135f15213a313db58524fcdf356ad84 WatchSource:0}: Error finding container 52a2cbda59b7d4003f96766e7ac8026a8135f15213a313db58524fcdf356ad84: Status 404 returned error can't find the container with id 52a2cbda59b7d4003f96766e7ac8026a8135f15213a313db58524fcdf356ad84 Mar 13 14:22:19 crc kubenswrapper[4898]: I0313 14:22:19.579561 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-qlgbl"] Mar 13 14:22:19 crc kubenswrapper[4898]: I0313 14:22:19.599196 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-978d-account-create-update-tkc22"] Mar 13 14:22:19 crc kubenswrapper[4898]: I0313 14:22:19.608250 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 13 14:22:19 crc kubenswrapper[4898]: I0313 14:22:19.621497 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-zhx84"] Mar 13 14:22:19 crc kubenswrapper[4898]: I0313 14:22:19.625781 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:22:19 crc kubenswrapper[4898]: I0313 14:22:19.649347 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7f9cbdc5df-5tx5z" Mar 13 14:22:19 crc kubenswrapper[4898]: I0313 14:22:19.671970 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell0-b9c7-account-create-update-l6h97"] Mar 13 14:22:19 crc kubenswrapper[4898]: I0313 14:22:19.804289 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-96c4-account-create-update-zsh5t"] Mar 13 14:22:19 crc kubenswrapper[4898]: I0313 14:22:19.984022 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:22:20 crc kubenswrapper[4898]: I0313 14:22:20.093773 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-gtnnh"] Mar 13 14:22:20 crc kubenswrapper[4898]: I0313 14:22:20.094977 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" podUID="99ea68d3-f555-4779-90d0-d1f136ddadd2" containerName="dnsmasq-dns" containerID="cri-o://4a5c75faeafd5fd73d57b281c113bb58d89f329bfae70b866f45093f1de113f3" gracePeriod=10 Mar 13 14:22:20 crc kubenswrapper[4898]: I0313 14:22:20.211874 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 13 14:22:20 crc kubenswrapper[4898]: I0313 14:22:20.482145 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-sqtw8" event={"ID":"068b0856-126d-487c-9c1d-50299bf90d3a","Type":"ContainerStarted","Data":"52a2cbda59b7d4003f96766e7ac8026a8135f15213a313db58524fcdf356ad84"} Mar 13 14:22:20 crc kubenswrapper[4898]: I0313 14:22:20.543100 4898 generic.go:334] "Generic (PLEG): container finished" podID="99ea68d3-f555-4779-90d0-d1f136ddadd2" containerID="4a5c75faeafd5fd73d57b281c113bb58d89f329bfae70b866f45093f1de113f3" exitCode=0 Mar 13 14:22:20 crc kubenswrapper[4898]: I0313 14:22:20.543438 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" event={"ID":"99ea68d3-f555-4779-90d0-d1f136ddadd2","Type":"ContainerDied","Data":"4a5c75faeafd5fd73d57b281c113bb58d89f329bfae70b866f45093f1de113f3"} 
Mar 13 14:22:20 crc kubenswrapper[4898]: I0313 14:22:20.550683 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qlgbl" event={"ID":"44f1f531-99d1-4b97-bd08-6bf94a7afd92","Type":"ContainerStarted","Data":"c3f324e654c4694796399f736ba82201f21f4c72e8656d1a41457de581a31d5e"} Mar 13 14:22:20 crc kubenswrapper[4898]: I0313 14:22:20.563970 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-96c4-account-create-update-zsh5t" event={"ID":"485200a5-cd75-45ac-b93a-b003158132c4","Type":"ContainerStarted","Data":"7e15455495730456e7a8bff672640fcee90a6ae705923706dc2aa854e8ab7ed5"} Mar 13 14:22:20 crc kubenswrapper[4898]: I0313 14:22:20.572651 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-zhx84" event={"ID":"1abedb18-bf27-42d9-b809-f7226b603a0d","Type":"ContainerStarted","Data":"50f9054e56407ba0a9fea287b30973d6add7ed954d891fc3616c3d5b3283065f"} Mar 13 14:22:20 crc kubenswrapper[4898]: I0313 14:22:20.584701 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-978d-account-create-update-tkc22" event={"ID":"29dbeb8a-611d-4513-a063-06d8f865ea93","Type":"ContainerStarted","Data":"26bae5e708003894bf328daf1be0e3e64680a6bf96fbf9b626133d828b1312af"} Mar 13 14:22:20 crc kubenswrapper[4898]: I0313 14:22:20.595492 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b9c7-account-create-update-l6h97" event={"ID":"e516311e-fb5c-4901-aaf7-67793ffb5fa2","Type":"ContainerStarted","Data":"f9ad8ed01a8a4a46b06ba41d288cebad2344ae1d2a57232e2dd62e53b5ce8da8"} Mar 13 14:22:20 crc kubenswrapper[4898]: I0313 14:22:20.621905 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-978d-account-create-update-tkc22" podStartSLOduration=3.621850455 podStartE2EDuration="3.621850455s" podCreationTimestamp="2026-03-13 14:22:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:22:20.61816493 +0000 UTC m=+1575.619753169" watchObservedRunningTime="2026-03-13 14:22:20.621850455 +0000 UTC m=+1575.623438704" Mar 13 14:22:20 crc kubenswrapper[4898]: I0313 14:22:20.657069 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-z7ldc" podUID="b38f3681-6f2f-437f-9694-810d43921aa2" containerName="registry-server" probeResult="failure" output=< Mar 13 14:22:20 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 14:22:20 crc kubenswrapper[4898]: > Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.232218 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.384231 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-ovsdbserver-nb\") pod \"99ea68d3-f555-4779-90d0-d1f136ddadd2\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.384329 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-dns-swift-storage-0\") pod \"99ea68d3-f555-4779-90d0-d1f136ddadd2\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.384356 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-ovsdbserver-sb\") pod \"99ea68d3-f555-4779-90d0-d1f136ddadd2\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.384427 4898 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-dns-svc\") pod \"99ea68d3-f555-4779-90d0-d1f136ddadd2\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.384477 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-config\") pod \"99ea68d3-f555-4779-90d0-d1f136ddadd2\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.384653 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktnr4\" (UniqueName: \"kubernetes.io/projected/99ea68d3-f555-4779-90d0-d1f136ddadd2-kube-api-access-ktnr4\") pod \"99ea68d3-f555-4779-90d0-d1f136ddadd2\" (UID: \"99ea68d3-f555-4779-90d0-d1f136ddadd2\") " Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.421751 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99ea68d3-f555-4779-90d0-d1f136ddadd2-kube-api-access-ktnr4" (OuterVolumeSpecName: "kube-api-access-ktnr4") pod "99ea68d3-f555-4779-90d0-d1f136ddadd2" (UID: "99ea68d3-f555-4779-90d0-d1f136ddadd2"). InnerVolumeSpecName "kube-api-access-ktnr4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.491691 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktnr4\" (UniqueName: \"kubernetes.io/projected/99ea68d3-f555-4779-90d0-d1f136ddadd2-kube-api-access-ktnr4\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.519303 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "99ea68d3-f555-4779-90d0-d1f136ddadd2" (UID: "99ea68d3-f555-4779-90d0-d1f136ddadd2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.521866 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "99ea68d3-f555-4779-90d0-d1f136ddadd2" (UID: "99ea68d3-f555-4779-90d0-d1f136ddadd2"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.529209 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "99ea68d3-f555-4779-90d0-d1f136ddadd2" (UID: "99ea68d3-f555-4779-90d0-d1f136ddadd2"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.575613 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-config" (OuterVolumeSpecName: "config") pod "99ea68d3-f555-4779-90d0-d1f136ddadd2" (UID: "99ea68d3-f555-4779-90d0-d1f136ddadd2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.594074 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.594113 4898 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.594124 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.594133 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.605930 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "99ea68d3-f555-4779-90d0-d1f136ddadd2" (UID: "99ea68d3-f555-4779-90d0-d1f136ddadd2"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.610760 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" event={"ID":"99ea68d3-f555-4779-90d0-d1f136ddadd2","Type":"ContainerDied","Data":"2f6cf6b2237006a47af92c80edb293fb5e39aa92cbe683d435727b4ad4952d2e"} Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.610829 4898 scope.go:117] "RemoveContainer" containerID="4a5c75faeafd5fd73d57b281c113bb58d89f329bfae70b866f45093f1de113f3" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.611009 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-gtnnh" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.632069 4898 generic.go:334] "Generic (PLEG): container finished" podID="44f1f531-99d1-4b97-bd08-6bf94a7afd92" containerID="8dc6d76d86edf4ca1fbe4589c8cd285e45e3e3a632208ca7817a0099916d678e" exitCode=0 Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.632141 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qlgbl" event={"ID":"44f1f531-99d1-4b97-bd08-6bf94a7afd92","Type":"ContainerDied","Data":"8dc6d76d86edf4ca1fbe4589c8cd285e45e3e3a632208ca7817a0099916d678e"} Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.642709 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-96c4-account-create-update-zsh5t" event={"ID":"485200a5-cd75-45ac-b93a-b003158132c4","Type":"ContainerStarted","Data":"388808695f9ab04c5365cf0b1e925d316f5fa781ee2d5e7df5edc4ae9b9090f3"} Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.673578 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-96c4-account-create-update-zsh5t" podStartSLOduration=3.673562187 podStartE2EDuration="3.673562187s" podCreationTimestamp="2026-03-13 14:22:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:22:21.66600129 +0000 UTC m=+1576.667589529" watchObservedRunningTime="2026-03-13 14:22:21.673562187 +0000 UTC m=+1576.675150426" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.676197 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-zhx84" event={"ID":"1abedb18-bf27-42d9-b809-f7226b603a0d","Type":"ContainerStarted","Data":"990c87abba6cb97ddcc9ff9fc6fc009eeddad1ca5f265916404c317707eccb6f"} Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.697229 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-zhx84" podStartSLOduration=4.69721144 podStartE2EDuration="4.69721144s" podCreationTimestamp="2026-03-13 14:22:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:22:21.6917927 +0000 UTC m=+1576.693380939" watchObservedRunningTime="2026-03-13 14:22:21.69721144 +0000 UTC m=+1576.698799679" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.698155 4898 generic.go:334] "Generic (PLEG): container finished" podID="29dbeb8a-611d-4513-a063-06d8f865ea93" containerID="7b608b8ed7e8f0c428a761cc552755850cceca5a0b694b7beffed50aa397bd7c" exitCode=0 Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.698228 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-978d-account-create-update-tkc22" event={"ID":"29dbeb8a-611d-4513-a063-06d8f865ea93","Type":"ContainerDied","Data":"7b608b8ed7e8f0c428a761cc552755850cceca5a0b694b7beffed50aa397bd7c"} Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.702343 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99ea68d3-f555-4779-90d0-d1f136ddadd2-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.719822 4898 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e251995e-609a-4f0e-83f3-7f856e58a598","Type":"ContainerStarted","Data":"f8d1b2554e005cc4ebc5dedc934e5b72d453f242bad5cf70707745dc2f5c1c07"} Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.719871 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.741663 4898 scope.go:117] "RemoveContainer" containerID="6c42ee9c0a17acfdf5d9f3b6de5ee36bb640854b185b6dd5e7f1e7441cc93008" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.742343 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b9c7-account-create-update-l6h97" event={"ID":"e516311e-fb5c-4901-aaf7-67793ffb5fa2","Type":"ContainerStarted","Data":"bb0c3bbe6081c4840c08e83355257eccaf898a1f20509cda6e941396919762d8"} Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.758410 4898 generic.go:334] "Generic (PLEG): container finished" podID="068b0856-126d-487c-9c1d-50299bf90d3a" containerID="bf99d1df7658057773b414b4c2a04b114eb5afc6efdb7e3669f974780f737076" exitCode=0 Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.764549 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-sqtw8" event={"ID":"068b0856-126d-487c-9c1d-50299bf90d3a","Type":"ContainerDied","Data":"bf99d1df7658057773b414b4c2a04b114eb5afc6efdb7e3669f974780f737076"} Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.785099 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-gtnnh"] Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.818594 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-gtnnh"] Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.847942 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.405013268 
podStartE2EDuration="9.847889222s" podCreationTimestamp="2026-03-13 14:22:12 +0000 UTC" firstStartedPulling="2026-03-13 14:22:15.335444402 +0000 UTC m=+1570.337032641" lastFinishedPulling="2026-03-13 14:22:20.778320356 +0000 UTC m=+1575.779908595" observedRunningTime="2026-03-13 14:22:21.759111107 +0000 UTC m=+1576.760699356" watchObservedRunningTime="2026-03-13 14:22:21.847889222 +0000 UTC m=+1576.849477461" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.851057 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-5978dd6d84-pnknr" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.851875 4898 scope.go:117] "RemoveContainer" containerID="d08fae9298b9b14b84a2c4d5726d6db0f8d8453e2834240ede2ff7d43ec4ebb3" Mar 13 14:22:21 crc kubenswrapper[4898]: E0313 14:22:21.852131 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-5978dd6d84-pnknr_openstack(6f42d66e-f331-4c05-a4fb-d6208b4493fb)\"" pod="openstack/heat-api-5978dd6d84-pnknr" podUID="6f42d66e-f331-4c05-a4fb-d6208b4493fb" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.852579 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-5978dd6d84-pnknr" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.867243 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-b9c7-account-create-update-l6h97" podStartSLOduration=3.8672173340000002 podStartE2EDuration="3.867217334s" podCreationTimestamp="2026-03-13 14:22:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:22:21.784147837 +0000 UTC m=+1576.785736076" watchObservedRunningTime="2026-03-13 14:22:21.867217334 +0000 UTC m=+1576.868805593" Mar 13 14:22:21 crc kubenswrapper[4898]: 
I0313 14:22:21.873164 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-76545f46cd-qk7nm" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.873264 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-76545f46cd-qk7nm" Mar 13 14:22:21 crc kubenswrapper[4898]: I0313 14:22:21.874229 4898 scope.go:117] "RemoveContainer" containerID="ee395bb97804b9c181c98691f0ac177fe6c008f2b77b6a0f5cb7a636bfe0789a" Mar 13 14:22:21 crc kubenswrapper[4898]: E0313 14:22:21.874568 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-76545f46cd-qk7nm_openstack(8c6a61ba-babd-4bc2-922a-99b00c2af057)\"" pod="openstack/heat-cfnapi-76545f46cd-qk7nm" podUID="8c6a61ba-babd-4bc2-922a-99b00c2af057" Mar 13 14:22:22 crc kubenswrapper[4898]: I0313 14:22:22.439875 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:22:22 crc kubenswrapper[4898]: I0313 14:22:22.626436 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:22:22 crc kubenswrapper[4898]: I0313 14:22:22.637731 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-9b6c99f6d-7zgm5" Mar 13 14:22:22 crc kubenswrapper[4898]: I0313 14:22:22.646650 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-759d64ffd4-kzp67" Mar 13 14:22:22 crc kubenswrapper[4898]: I0313 14:22:22.714890 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5978dd6d84-pnknr"] Mar 13 14:22:22 crc kubenswrapper[4898]: I0313 14:22:22.719209 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:22:22 crc kubenswrapper[4898]: 
I0313 14:22:22.780602 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-76545f46cd-qk7nm"] Mar 13 14:22:22 crc kubenswrapper[4898]: I0313 14:22:22.783406 4898 generic.go:334] "Generic (PLEG): container finished" podID="485200a5-cd75-45ac-b93a-b003158132c4" containerID="388808695f9ab04c5365cf0b1e925d316f5fa781ee2d5e7df5edc4ae9b9090f3" exitCode=0 Mar 13 14:22:22 crc kubenswrapper[4898]: I0313 14:22:22.783481 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-96c4-account-create-update-zsh5t" event={"ID":"485200a5-cd75-45ac-b93a-b003158132c4","Type":"ContainerDied","Data":"388808695f9ab04c5365cf0b1e925d316f5fa781ee2d5e7df5edc4ae9b9090f3"} Mar 13 14:22:22 crc kubenswrapper[4898]: I0313 14:22:22.788074 4898 generic.go:334] "Generic (PLEG): container finished" podID="1abedb18-bf27-42d9-b809-f7226b603a0d" containerID="990c87abba6cb97ddcc9ff9fc6fc009eeddad1ca5f265916404c317707eccb6f" exitCode=0 Mar 13 14:22:22 crc kubenswrapper[4898]: I0313 14:22:22.788158 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-zhx84" event={"ID":"1abedb18-bf27-42d9-b809-f7226b603a0d","Type":"ContainerDied","Data":"990c87abba6cb97ddcc9ff9fc6fc009eeddad1ca5f265916404c317707eccb6f"} Mar 13 14:22:22 crc kubenswrapper[4898]: I0313 14:22:22.799180 4898 generic.go:334] "Generic (PLEG): container finished" podID="e516311e-fb5c-4901-aaf7-67793ffb5fa2" containerID="bb0c3bbe6081c4840c08e83355257eccaf898a1f20509cda6e941396919762d8" exitCode=0 Mar 13 14:22:22 crc kubenswrapper[4898]: I0313 14:22:22.799230 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b9c7-account-create-update-l6h97" event={"ID":"e516311e-fb5c-4901-aaf7-67793ffb5fa2","Type":"ContainerDied","Data":"bb0c3bbe6081c4840c08e83355257eccaf898a1f20509cda6e941396919762d8"} Mar 13 14:22:22 crc kubenswrapper[4898]: I0313 14:22:22.800628 4898 scope.go:117] "RemoveContainer" 
containerID="ee395bb97804b9c181c98691f0ac177fe6c008f2b77b6a0f5cb7a636bfe0789a" Mar 13 14:22:22 crc kubenswrapper[4898]: E0313 14:22:22.800832 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-76545f46cd-qk7nm_openstack(8c6a61ba-babd-4bc2-922a-99b00c2af057)\"" pod="openstack/heat-cfnapi-76545f46cd-qk7nm" podUID="8c6a61ba-babd-4bc2-922a-99b00c2af057" Mar 13 14:22:22 crc kubenswrapper[4898]: I0313 14:22:22.804407 4898 scope.go:117] "RemoveContainer" containerID="d08fae9298b9b14b84a2c4d5726d6db0f8d8453e2834240ede2ff7d43ec4ebb3" Mar 13 14:22:22 crc kubenswrapper[4898]: E0313 14:22:22.804822 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-5978dd6d84-pnknr_openstack(6f42d66e-f331-4c05-a4fb-d6208b4493fb)\"" pod="openstack/heat-api-5978dd6d84-pnknr" podUID="6f42d66e-f331-4c05-a4fb-d6208b4493fb" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.360221 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qlgbl" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.457835 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44f1f531-99d1-4b97-bd08-6bf94a7afd92-operator-scripts\") pod \"44f1f531-99d1-4b97-bd08-6bf94a7afd92\" (UID: \"44f1f531-99d1-4b97-bd08-6bf94a7afd92\") " Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.463049 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44f1f531-99d1-4b97-bd08-6bf94a7afd92-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "44f1f531-99d1-4b97-bd08-6bf94a7afd92" (UID: "44f1f531-99d1-4b97-bd08-6bf94a7afd92"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.463285 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4bb2\" (UniqueName: \"kubernetes.io/projected/44f1f531-99d1-4b97-bd08-6bf94a7afd92-kube-api-access-m4bb2\") pod \"44f1f531-99d1-4b97-bd08-6bf94a7afd92\" (UID: \"44f1f531-99d1-4b97-bd08-6bf94a7afd92\") " Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.474548 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44f1f531-99d1-4b97-bd08-6bf94a7afd92-kube-api-access-m4bb2" (OuterVolumeSpecName: "kube-api-access-m4bb2") pod "44f1f531-99d1-4b97-bd08-6bf94a7afd92" (UID: "44f1f531-99d1-4b97-bd08-6bf94a7afd92"). InnerVolumeSpecName "kube-api-access-m4bb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.477221 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44f1f531-99d1-4b97-bd08-6bf94a7afd92-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.580882 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4bb2\" (UniqueName: \"kubernetes.io/projected/44f1f531-99d1-4b97-bd08-6bf94a7afd92-kube-api-access-m4bb2\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.607882 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-sqtw8" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.622524 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-978d-account-create-update-tkc22" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.682806 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw6kd\" (UniqueName: \"kubernetes.io/projected/068b0856-126d-487c-9c1d-50299bf90d3a-kube-api-access-nw6kd\") pod \"068b0856-126d-487c-9c1d-50299bf90d3a\" (UID: \"068b0856-126d-487c-9c1d-50299bf90d3a\") " Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.683076 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/068b0856-126d-487c-9c1d-50299bf90d3a-operator-scripts\") pod \"068b0856-126d-487c-9c1d-50299bf90d3a\" (UID: \"068b0856-126d-487c-9c1d-50299bf90d3a\") " Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.684295 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/068b0856-126d-487c-9c1d-50299bf90d3a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "068b0856-126d-487c-9c1d-50299bf90d3a" (UID: "068b0856-126d-487c-9c1d-50299bf90d3a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.697715 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/068b0856-126d-487c-9c1d-50299bf90d3a-kube-api-access-nw6kd" (OuterVolumeSpecName: "kube-api-access-nw6kd") pod "068b0856-126d-487c-9c1d-50299bf90d3a" (UID: "068b0856-126d-487c-9c1d-50299bf90d3a"). InnerVolumeSpecName "kube-api-access-nw6kd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.754555 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99ea68d3-f555-4779-90d0-d1f136ddadd2" path="/var/lib/kubelet/pods/99ea68d3-f555-4779-90d0-d1f136ddadd2/volumes" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.785238 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkf6x\" (UniqueName: \"kubernetes.io/projected/29dbeb8a-611d-4513-a063-06d8f865ea93-kube-api-access-bkf6x\") pod \"29dbeb8a-611d-4513-a063-06d8f865ea93\" (UID: \"29dbeb8a-611d-4513-a063-06d8f865ea93\") " Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.785574 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29dbeb8a-611d-4513-a063-06d8f865ea93-operator-scripts\") pod \"29dbeb8a-611d-4513-a063-06d8f865ea93\" (UID: \"29dbeb8a-611d-4513-a063-06d8f865ea93\") " Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.786158 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/068b0856-126d-487c-9c1d-50299bf90d3a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.786176 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw6kd\" (UniqueName: \"kubernetes.io/projected/068b0856-126d-487c-9c1d-50299bf90d3a-kube-api-access-nw6kd\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.786439 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29dbeb8a-611d-4513-a063-06d8f865ea93-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "29dbeb8a-611d-4513-a063-06d8f865ea93" (UID: "29dbeb8a-611d-4513-a063-06d8f865ea93"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.798758 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29dbeb8a-611d-4513-a063-06d8f865ea93-kube-api-access-bkf6x" (OuterVolumeSpecName: "kube-api-access-bkf6x") pod "29dbeb8a-611d-4513-a063-06d8f865ea93" (UID: "29dbeb8a-611d-4513-a063-06d8f865ea93"). InnerVolumeSpecName "kube-api-access-bkf6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.812447 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qlgbl" event={"ID":"44f1f531-99d1-4b97-bd08-6bf94a7afd92","Type":"ContainerDied","Data":"c3f324e654c4694796399f736ba82201f21f4c72e8656d1a41457de581a31d5e"} Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.812493 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3f324e654c4694796399f736ba82201f21f4c72e8656d1a41457de581a31d5e" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.812748 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qlgbl" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.814682 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-978d-account-create-update-tkc22" event={"ID":"29dbeb8a-611d-4513-a063-06d8f865ea93","Type":"ContainerDied","Data":"26bae5e708003894bf328daf1be0e3e64680a6bf96fbf9b626133d828b1312af"} Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.814713 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26bae5e708003894bf328daf1be0e3e64680a6bf96fbf9b626133d828b1312af" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.814771 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-978d-account-create-update-tkc22" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.816850 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-sqtw8" event={"ID":"068b0856-126d-487c-9c1d-50299bf90d3a","Type":"ContainerDied","Data":"52a2cbda59b7d4003f96766e7ac8026a8135f15213a313db58524fcdf356ad84"} Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.816935 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52a2cbda59b7d4003f96766e7ac8026a8135f15213a313db58524fcdf356ad84" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.817025 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-sqtw8" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.820028 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e251995e-609a-4f0e-83f3-7f856e58a598" containerName="ceilometer-central-agent" containerID="cri-o://8aff6294a407bae6a2eb1e2dc4f0f935834538560ddb004e22ea984aea78200b" gracePeriod=30 Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.820326 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e251995e-609a-4f0e-83f3-7f856e58a598" containerName="proxy-httpd" containerID="cri-o://f8d1b2554e005cc4ebc5dedc934e5b72d453f242bad5cf70707745dc2f5c1c07" gracePeriod=30 Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.820447 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e251995e-609a-4f0e-83f3-7f856e58a598" containerName="sg-core" containerID="cri-o://003d77a3bae8b5d30451a8b2b210d256e8f645cab42759f96a2cca43d38b49a5" gracePeriod=30 Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.820670 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="e251995e-609a-4f0e-83f3-7f856e58a598" containerName="ceilometer-notification-agent" containerID="cri-o://d8d1f76b83bf115de46d0b110c2cf03b3ebde454f4675c453b288cf8a3d3f58a" gracePeriod=30 Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.888160 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29dbeb8a-611d-4513-a063-06d8f865ea93-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:23 crc kubenswrapper[4898]: I0313 14:22:23.890374 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkf6x\" (UniqueName: \"kubernetes.io/projected/29dbeb8a-611d-4513-a063-06d8f865ea93-kube-api-access-bkf6x\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.159285 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b9c7-account-create-update-l6h97" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.303431 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzzw4\" (UniqueName: \"kubernetes.io/projected/e516311e-fb5c-4901-aaf7-67793ffb5fa2-kube-api-access-mzzw4\") pod \"e516311e-fb5c-4901-aaf7-67793ffb5fa2\" (UID: \"e516311e-fb5c-4901-aaf7-67793ffb5fa2\") " Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.303477 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e516311e-fb5c-4901-aaf7-67793ffb5fa2-operator-scripts\") pod \"e516311e-fb5c-4901-aaf7-67793ffb5fa2\" (UID: \"e516311e-fb5c-4901-aaf7-67793ffb5fa2\") " Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.304963 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e516311e-fb5c-4901-aaf7-67793ffb5fa2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e516311e-fb5c-4901-aaf7-67793ffb5fa2" (UID: 
"e516311e-fb5c-4901-aaf7-67793ffb5fa2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.311643 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e516311e-fb5c-4901-aaf7-67793ffb5fa2-kube-api-access-mzzw4" (OuterVolumeSpecName: "kube-api-access-mzzw4") pod "e516311e-fb5c-4901-aaf7-67793ffb5fa2" (UID: "e516311e-fb5c-4901-aaf7-67793ffb5fa2"). InnerVolumeSpecName "kube-api-access-mzzw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.407621 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzzw4\" (UniqueName: \"kubernetes.io/projected/e516311e-fb5c-4901-aaf7-67793ffb5fa2-kube-api-access-mzzw4\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.407653 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e516311e-fb5c-4901-aaf7-67793ffb5fa2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.690718 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-76545f46cd-qk7nm" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.699785 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5978dd6d84-pnknr" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.724007 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-zhx84" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.736055 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-96c4-account-create-update-zsh5t" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.826351 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p272j\" (UniqueName: \"kubernetes.io/projected/8c6a61ba-babd-4bc2-922a-99b00c2af057-kube-api-access-p272j\") pod \"8c6a61ba-babd-4bc2-922a-99b00c2af057\" (UID: \"8c6a61ba-babd-4bc2-922a-99b00c2af057\") " Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.826446 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w42dl\" (UniqueName: \"kubernetes.io/projected/485200a5-cd75-45ac-b93a-b003158132c4-kube-api-access-w42dl\") pod \"485200a5-cd75-45ac-b93a-b003158132c4\" (UID: \"485200a5-cd75-45ac-b93a-b003158132c4\") " Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.826472 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8s4j\" (UniqueName: \"kubernetes.io/projected/1abedb18-bf27-42d9-b809-f7226b603a0d-kube-api-access-s8s4j\") pod \"1abedb18-bf27-42d9-b809-f7226b603a0d\" (UID: \"1abedb18-bf27-42d9-b809-f7226b603a0d\") " Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.826568 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c6a61ba-babd-4bc2-922a-99b00c2af057-config-data-custom\") pod \"8c6a61ba-babd-4bc2-922a-99b00c2af057\" (UID: \"8c6a61ba-babd-4bc2-922a-99b00c2af057\") " Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.826721 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6a61ba-babd-4bc2-922a-99b00c2af057-combined-ca-bundle\") pod \"8c6a61ba-babd-4bc2-922a-99b00c2af057\" (UID: \"8c6a61ba-babd-4bc2-922a-99b00c2af057\") " Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.826817 4898 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f42d66e-f331-4c05-a4fb-d6208b4493fb-config-data\") pod \"6f42d66e-f331-4c05-a4fb-d6208b4493fb\" (UID: \"6f42d66e-f331-4c05-a4fb-d6208b4493fb\") " Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.826879 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/485200a5-cd75-45ac-b93a-b003158132c4-operator-scripts\") pod \"485200a5-cd75-45ac-b93a-b003158132c4\" (UID: \"485200a5-cd75-45ac-b93a-b003158132c4\") " Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.826948 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f42d66e-f331-4c05-a4fb-d6208b4493fb-config-data-custom\") pod \"6f42d66e-f331-4c05-a4fb-d6208b4493fb\" (UID: \"6f42d66e-f331-4c05-a4fb-d6208b4493fb\") " Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.827030 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbntq\" (UniqueName: \"kubernetes.io/projected/6f42d66e-f331-4c05-a4fb-d6208b4493fb-kube-api-access-jbntq\") pod \"6f42d66e-f331-4c05-a4fb-d6208b4493fb\" (UID: \"6f42d66e-f331-4c05-a4fb-d6208b4493fb\") " Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.827153 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c6a61ba-babd-4bc2-922a-99b00c2af057-config-data\") pod \"8c6a61ba-babd-4bc2-922a-99b00c2af057\" (UID: \"8c6a61ba-babd-4bc2-922a-99b00c2af057\") " Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.827187 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f42d66e-f331-4c05-a4fb-d6208b4493fb-combined-ca-bundle\") pod 
\"6f42d66e-f331-4c05-a4fb-d6208b4493fb\" (UID: \"6f42d66e-f331-4c05-a4fb-d6208b4493fb\") " Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.827245 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1abedb18-bf27-42d9-b809-f7226b603a0d-operator-scripts\") pod \"1abedb18-bf27-42d9-b809-f7226b603a0d\" (UID: \"1abedb18-bf27-42d9-b809-f7226b603a0d\") " Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.828885 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/485200a5-cd75-45ac-b93a-b003158132c4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "485200a5-cd75-45ac-b93a-b003158132c4" (UID: "485200a5-cd75-45ac-b93a-b003158132c4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.839398 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f42d66e-f331-4c05-a4fb-d6208b4493fb-kube-api-access-jbntq" (OuterVolumeSpecName: "kube-api-access-jbntq") pod "6f42d66e-f331-4c05-a4fb-d6208b4493fb" (UID: "6f42d66e-f331-4c05-a4fb-d6208b4493fb"). InnerVolumeSpecName "kube-api-access-jbntq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.839476 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c6a61ba-babd-4bc2-922a-99b00c2af057-kube-api-access-p272j" (OuterVolumeSpecName: "kube-api-access-p272j") pod "8c6a61ba-babd-4bc2-922a-99b00c2af057" (UID: "8c6a61ba-babd-4bc2-922a-99b00c2af057"). InnerVolumeSpecName "kube-api-access-p272j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.839495 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1abedb18-bf27-42d9-b809-f7226b603a0d-kube-api-access-s8s4j" (OuterVolumeSpecName: "kube-api-access-s8s4j") pod "1abedb18-bf27-42d9-b809-f7226b603a0d" (UID: "1abedb18-bf27-42d9-b809-f7226b603a0d"). InnerVolumeSpecName "kube-api-access-s8s4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.840392 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1abedb18-bf27-42d9-b809-f7226b603a0d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1abedb18-bf27-42d9-b809-f7226b603a0d" (UID: "1abedb18-bf27-42d9-b809-f7226b603a0d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.841434 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c6a61ba-babd-4bc2-922a-99b00c2af057-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8c6a61ba-babd-4bc2-922a-99b00c2af057" (UID: "8c6a61ba-babd-4bc2-922a-99b00c2af057"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.842382 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/485200a5-cd75-45ac-b93a-b003158132c4-kube-api-access-w42dl" (OuterVolumeSpecName: "kube-api-access-w42dl") pod "485200a5-cd75-45ac-b93a-b003158132c4" (UID: "485200a5-cd75-45ac-b93a-b003158132c4"). InnerVolumeSpecName "kube-api-access-w42dl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.844505 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f42d66e-f331-4c05-a4fb-d6208b4493fb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6f42d66e-f331-4c05-a4fb-d6208b4493fb" (UID: "6f42d66e-f331-4c05-a4fb-d6208b4493fb"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.850641 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-zhx84" event={"ID":"1abedb18-bf27-42d9-b809-f7226b603a0d","Type":"ContainerDied","Data":"50f9054e56407ba0a9fea287b30973d6add7ed954d891fc3616c3d5b3283065f"} Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.850689 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50f9054e56407ba0a9fea287b30973d6add7ed954d891fc3616c3d5b3283065f" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.850749 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-zhx84" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.854067 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-76545f46cd-qk7nm" event={"ID":"8c6a61ba-babd-4bc2-922a-99b00c2af057","Type":"ContainerDied","Data":"703728a9002cd85f33faecdfc398f12cdcb1c38f0ab174f12467daf84a0e062a"} Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.854327 4898 scope.go:117] "RemoveContainer" containerID="ee395bb97804b9c181c98691f0ac177fe6c008f2b77b6a0f5cb7a636bfe0789a" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.854512 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-76545f46cd-qk7nm" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.863406 4898 generic.go:334] "Generic (PLEG): container finished" podID="e251995e-609a-4f0e-83f3-7f856e58a598" containerID="f8d1b2554e005cc4ebc5dedc934e5b72d453f242bad5cf70707745dc2f5c1c07" exitCode=0 Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.863431 4898 generic.go:334] "Generic (PLEG): container finished" podID="e251995e-609a-4f0e-83f3-7f856e58a598" containerID="003d77a3bae8b5d30451a8b2b210d256e8f645cab42759f96a2cca43d38b49a5" exitCode=2 Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.863438 4898 generic.go:334] "Generic (PLEG): container finished" podID="e251995e-609a-4f0e-83f3-7f856e58a598" containerID="d8d1f76b83bf115de46d0b110c2cf03b3ebde454f4675c453b288cf8a3d3f58a" exitCode=0 Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.863494 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e251995e-609a-4f0e-83f3-7f856e58a598","Type":"ContainerDied","Data":"f8d1b2554e005cc4ebc5dedc934e5b72d453f242bad5cf70707745dc2f5c1c07"} Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.863523 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e251995e-609a-4f0e-83f3-7f856e58a598","Type":"ContainerDied","Data":"003d77a3bae8b5d30451a8b2b210d256e8f645cab42759f96a2cca43d38b49a5"} Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.863533 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e251995e-609a-4f0e-83f3-7f856e58a598","Type":"ContainerDied","Data":"d8d1f76b83bf115de46d0b110c2cf03b3ebde454f4675c453b288cf8a3d3f58a"} Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.865652 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-b9c7-account-create-update-l6h97" 
event={"ID":"e516311e-fb5c-4901-aaf7-67793ffb5fa2","Type":"ContainerDied","Data":"f9ad8ed01a8a4a46b06ba41d288cebad2344ae1d2a57232e2dd62e53b5ce8da8"} Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.865718 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9ad8ed01a8a4a46b06ba41d288cebad2344ae1d2a57232e2dd62e53b5ce8da8" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.865762 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-b9c7-account-create-update-l6h97" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.880759 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5978dd6d84-pnknr" event={"ID":"6f42d66e-f331-4c05-a4fb-d6208b4493fb","Type":"ContainerDied","Data":"83eeb629576ceac1e4c3211f16c303bcac0592684b9a1724a45b87fae4f69938"} Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.880878 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5978dd6d84-pnknr" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.884468 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f42d66e-f331-4c05-a4fb-d6208b4493fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f42d66e-f331-4c05-a4fb-d6208b4493fb" (UID: "6f42d66e-f331-4c05-a4fb-d6208b4493fb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.890485 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c6a61ba-babd-4bc2-922a-99b00c2af057-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c6a61ba-babd-4bc2-922a-99b00c2af057" (UID: "8c6a61ba-babd-4bc2-922a-99b00c2af057"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.890812 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-96c4-account-create-update-zsh5t" event={"ID":"485200a5-cd75-45ac-b93a-b003158132c4","Type":"ContainerDied","Data":"7e15455495730456e7a8bff672640fcee90a6ae705923706dc2aa854e8ab7ed5"} Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.890851 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e15455495730456e7a8bff672640fcee90a6ae705923706dc2aa854e8ab7ed5" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.891091 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-96c4-account-create-update-zsh5t" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.904117 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f42d66e-f331-4c05-a4fb-d6208b4493fb-config-data" (OuterVolumeSpecName: "config-data") pod "6f42d66e-f331-4c05-a4fb-d6208b4493fb" (UID: "6f42d66e-f331-4c05-a4fb-d6208b4493fb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.905557 4898 scope.go:117] "RemoveContainer" containerID="d08fae9298b9b14b84a2c4d5726d6db0f8d8453e2834240ede2ff7d43ec4ebb3" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.932408 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f42d66e-f331-4c05-a4fb-d6208b4493fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.932450 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1abedb18-bf27-42d9-b809-f7226b603a0d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.932465 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p272j\" (UniqueName: \"kubernetes.io/projected/8c6a61ba-babd-4bc2-922a-99b00c2af057-kube-api-access-p272j\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.932484 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w42dl\" (UniqueName: \"kubernetes.io/projected/485200a5-cd75-45ac-b93a-b003158132c4-kube-api-access-w42dl\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.932504 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8s4j\" (UniqueName: \"kubernetes.io/projected/1abedb18-bf27-42d9-b809-f7226b603a0d-kube-api-access-s8s4j\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.932515 4898 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8c6a61ba-babd-4bc2-922a-99b00c2af057-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.932527 4898 reconciler_common.go:293] "Volume 
detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6a61ba-babd-4bc2-922a-99b00c2af057-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.932539 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f42d66e-f331-4c05-a4fb-d6208b4493fb-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.932557 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/485200a5-cd75-45ac-b93a-b003158132c4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.932569 4898 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6f42d66e-f331-4c05-a4fb-d6208b4493fb-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.932582 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbntq\" (UniqueName: \"kubernetes.io/projected/6f42d66e-f331-4c05-a4fb-d6208b4493fb-kube-api-access-jbntq\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:24 crc kubenswrapper[4898]: I0313 14:22:24.958644 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c6a61ba-babd-4bc2-922a-99b00c2af057-config-data" (OuterVolumeSpecName: "config-data") pod "8c6a61ba-babd-4bc2-922a-99b00c2af057" (UID: "8c6a61ba-babd-4bc2-922a-99b00c2af057"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:25 crc kubenswrapper[4898]: I0313 14:22:25.035202 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c6a61ba-babd-4bc2-922a-99b00c2af057-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:25 crc kubenswrapper[4898]: I0313 14:22:25.208955 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-76545f46cd-qk7nm"] Mar 13 14:22:25 crc kubenswrapper[4898]: I0313 14:22:25.229976 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-76545f46cd-qk7nm"] Mar 13 14:22:25 crc kubenswrapper[4898]: I0313 14:22:25.279054 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5978dd6d84-pnknr"] Mar 13 14:22:25 crc kubenswrapper[4898]: I0313 14:22:25.306688 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-5978dd6d84-pnknr"] Mar 13 14:22:25 crc kubenswrapper[4898]: I0313 14:22:25.753046 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f42d66e-f331-4c05-a4fb-d6208b4493fb" path="/var/lib/kubelet/pods/6f42d66e-f331-4c05-a4fb-d6208b4493fb/volumes" Mar 13 14:22:25 crc kubenswrapper[4898]: I0313 14:22:25.753670 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c6a61ba-babd-4bc2-922a-99b00c2af057" path="/var/lib/kubelet/pods/8c6a61ba-babd-4bc2-922a-99b00c2af057/volumes" Mar 13 14:22:26 crc kubenswrapper[4898]: I0313 14:22:26.859746 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-5b6c75676b-jx6kl" Mar 13 14:22:26 crc kubenswrapper[4898]: I0313 14:22:26.913281 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-6b86699784-tf822"] Mar 13 14:22:26 crc kubenswrapper[4898]: I0313 14:22:26.913497 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-6b86699784-tf822" 
podUID="88ab3ad2-782a-4c21-8104-1b80468dbca0" containerName="heat-engine" containerID="cri-o://99ddc8edc7229de6f1b448d188e98b54d68182339989edb36f76125f198ea2d9" gracePeriod=60 Mar 13 14:22:26 crc kubenswrapper[4898]: I0313 14:22:26.956882 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-6b86699784-tf822" Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.396303 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qps2v"] Mar 13 14:22:28 crc kubenswrapper[4898]: E0313 14:22:28.397057 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485200a5-cd75-45ac-b93a-b003158132c4" containerName="mariadb-account-create-update" Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.397072 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="485200a5-cd75-45ac-b93a-b003158132c4" containerName="mariadb-account-create-update" Mar 13 14:22:28 crc kubenswrapper[4898]: E0313 14:22:28.397094 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e516311e-fb5c-4901-aaf7-67793ffb5fa2" containerName="mariadb-account-create-update" Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.397100 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e516311e-fb5c-4901-aaf7-67793ffb5fa2" containerName="mariadb-account-create-update" Mar 13 14:22:28 crc kubenswrapper[4898]: E0313 14:22:28.397112 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c6a61ba-babd-4bc2-922a-99b00c2af057" containerName="heat-cfnapi" Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.397118 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c6a61ba-babd-4bc2-922a-99b00c2af057" containerName="heat-cfnapi" Mar 13 14:22:28 crc kubenswrapper[4898]: E0313 14:22:28.397134 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="068b0856-126d-487c-9c1d-50299bf90d3a" containerName="mariadb-database-create" Mar 13 14:22:28 crc 
kubenswrapper[4898]: I0313 14:22:28.397140 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="068b0856-126d-487c-9c1d-50299bf90d3a" containerName="mariadb-database-create" Mar 13 14:22:28 crc kubenswrapper[4898]: E0313 14:22:28.397156 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f42d66e-f331-4c05-a4fb-d6208b4493fb" containerName="heat-api" Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.397162 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f42d66e-f331-4c05-a4fb-d6208b4493fb" containerName="heat-api" Mar 13 14:22:28 crc kubenswrapper[4898]: E0313 14:22:28.397170 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44f1f531-99d1-4b97-bd08-6bf94a7afd92" containerName="mariadb-database-create" Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.397176 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="44f1f531-99d1-4b97-bd08-6bf94a7afd92" containerName="mariadb-database-create" Mar 13 14:22:28 crc kubenswrapper[4898]: E0313 14:22:28.397184 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ea68d3-f555-4779-90d0-d1f136ddadd2" containerName="init" Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.397190 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ea68d3-f555-4779-90d0-d1f136ddadd2" containerName="init" Mar 13 14:22:28 crc kubenswrapper[4898]: E0313 14:22:28.397204 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1abedb18-bf27-42d9-b809-f7226b603a0d" containerName="mariadb-database-create" Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.397209 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="1abedb18-bf27-42d9-b809-f7226b603a0d" containerName="mariadb-database-create" Mar 13 14:22:28 crc kubenswrapper[4898]: E0313 14:22:28.397226 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29dbeb8a-611d-4513-a063-06d8f865ea93" containerName="mariadb-account-create-update" Mar 13 14:22:28 
crc kubenswrapper[4898]: I0313 14:22:28.397232 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="29dbeb8a-611d-4513-a063-06d8f865ea93" containerName="mariadb-account-create-update" Mar 13 14:22:28 crc kubenswrapper[4898]: E0313 14:22:28.397243 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ea68d3-f555-4779-90d0-d1f136ddadd2" containerName="dnsmasq-dns" Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.397249 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ea68d3-f555-4779-90d0-d1f136ddadd2" containerName="dnsmasq-dns" Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.397440 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="485200a5-cd75-45ac-b93a-b003158132c4" containerName="mariadb-account-create-update" Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.397454 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="99ea68d3-f555-4779-90d0-d1f136ddadd2" containerName="dnsmasq-dns" Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.397465 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f42d66e-f331-4c05-a4fb-d6208b4493fb" containerName="heat-api" Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.397475 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="44f1f531-99d1-4b97-bd08-6bf94a7afd92" containerName="mariadb-database-create" Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.397486 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="068b0856-126d-487c-9c1d-50299bf90d3a" containerName="mariadb-database-create" Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.397500 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c6a61ba-babd-4bc2-922a-99b00c2af057" containerName="heat-cfnapi" Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.397510 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="29dbeb8a-611d-4513-a063-06d8f865ea93" 
containerName="mariadb-account-create-update" Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.397521 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f42d66e-f331-4c05-a4fb-d6208b4493fb" containerName="heat-api" Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.397534 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="1abedb18-bf27-42d9-b809-f7226b603a0d" containerName="mariadb-database-create" Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.397543 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c6a61ba-babd-4bc2-922a-99b00c2af057" containerName="heat-cfnapi" Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.397548 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e516311e-fb5c-4901-aaf7-67793ffb5fa2" containerName="mariadb-account-create-update" Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.398393 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qps2v" Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.404342 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.404407 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.404483 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-42bds" Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.437320 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qps2v"] Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.508081 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7eba407c-68a5-45e9-ab51-e8cba05d8559-scripts\") pod \"nova-cell0-conductor-db-sync-qps2v\" (UID: \"7eba407c-68a5-45e9-ab51-e8cba05d8559\") " pod="openstack/nova-cell0-conductor-db-sync-qps2v" Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.508220 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8c7h\" (UniqueName: \"kubernetes.io/projected/7eba407c-68a5-45e9-ab51-e8cba05d8559-kube-api-access-x8c7h\") pod \"nova-cell0-conductor-db-sync-qps2v\" (UID: \"7eba407c-68a5-45e9-ab51-e8cba05d8559\") " pod="openstack/nova-cell0-conductor-db-sync-qps2v" Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.508393 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eba407c-68a5-45e9-ab51-e8cba05d8559-config-data\") pod \"nova-cell0-conductor-db-sync-qps2v\" (UID: \"7eba407c-68a5-45e9-ab51-e8cba05d8559\") " pod="openstack/nova-cell0-conductor-db-sync-qps2v" Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.508457 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eba407c-68a5-45e9-ab51-e8cba05d8559-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qps2v\" (UID: \"7eba407c-68a5-45e9-ab51-e8cba05d8559\") " pod="openstack/nova-cell0-conductor-db-sync-qps2v" Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.610415 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eba407c-68a5-45e9-ab51-e8cba05d8559-config-data\") pod \"nova-cell0-conductor-db-sync-qps2v\" (UID: \"7eba407c-68a5-45e9-ab51-e8cba05d8559\") " pod="openstack/nova-cell0-conductor-db-sync-qps2v" Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.610526 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eba407c-68a5-45e9-ab51-e8cba05d8559-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qps2v\" (UID: \"7eba407c-68a5-45e9-ab51-e8cba05d8559\") " pod="openstack/nova-cell0-conductor-db-sync-qps2v" Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.610563 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7eba407c-68a5-45e9-ab51-e8cba05d8559-scripts\") pod \"nova-cell0-conductor-db-sync-qps2v\" (UID: \"7eba407c-68a5-45e9-ab51-e8cba05d8559\") " pod="openstack/nova-cell0-conductor-db-sync-qps2v" Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.610644 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8c7h\" (UniqueName: \"kubernetes.io/projected/7eba407c-68a5-45e9-ab51-e8cba05d8559-kube-api-access-x8c7h\") pod \"nova-cell0-conductor-db-sync-qps2v\" (UID: \"7eba407c-68a5-45e9-ab51-e8cba05d8559\") " pod="openstack/nova-cell0-conductor-db-sync-qps2v" Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.617799 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7eba407c-68a5-45e9-ab51-e8cba05d8559-scripts\") pod \"nova-cell0-conductor-db-sync-qps2v\" (UID: \"7eba407c-68a5-45e9-ab51-e8cba05d8559\") " pod="openstack/nova-cell0-conductor-db-sync-qps2v" Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.618295 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eba407c-68a5-45e9-ab51-e8cba05d8559-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qps2v\" (UID: \"7eba407c-68a5-45e9-ab51-e8cba05d8559\") " pod="openstack/nova-cell0-conductor-db-sync-qps2v" Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.632038 4898 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eba407c-68a5-45e9-ab51-e8cba05d8559-config-data\") pod \"nova-cell0-conductor-db-sync-qps2v\" (UID: \"7eba407c-68a5-45e9-ab51-e8cba05d8559\") " pod="openstack/nova-cell0-conductor-db-sync-qps2v" Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.641825 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8c7h\" (UniqueName: \"kubernetes.io/projected/7eba407c-68a5-45e9-ab51-e8cba05d8559-kube-api-access-x8c7h\") pod \"nova-cell0-conductor-db-sync-qps2v\" (UID: \"7eba407c-68a5-45e9-ab51-e8cba05d8559\") " pod="openstack/nova-cell0-conductor-db-sync-qps2v" Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.717587 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qps2v" Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.963921 4898 generic.go:334] "Generic (PLEG): container finished" podID="e251995e-609a-4f0e-83f3-7f856e58a598" containerID="8aff6294a407bae6a2eb1e2dc4f0f935834538560ddb004e22ea984aea78200b" exitCode=0 Mar 13 14:22:28 crc kubenswrapper[4898]: I0313 14:22:28.964230 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e251995e-609a-4f0e-83f3-7f856e58a598","Type":"ContainerDied","Data":"8aff6294a407bae6a2eb1e2dc4f0f935834538560ddb004e22ea984aea78200b"} Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.234938 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.329652 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjr2t\" (UniqueName: \"kubernetes.io/projected/e251995e-609a-4f0e-83f3-7f856e58a598-kube-api-access-hjr2t\") pod \"e251995e-609a-4f0e-83f3-7f856e58a598\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.329869 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-sg-core-conf-yaml\") pod \"e251995e-609a-4f0e-83f3-7f856e58a598\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.329892 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e251995e-609a-4f0e-83f3-7f856e58a598-run-httpd\") pod \"e251995e-609a-4f0e-83f3-7f856e58a598\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.330038 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-config-data\") pod \"e251995e-609a-4f0e-83f3-7f856e58a598\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.330272 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-scripts\") pod \"e251995e-609a-4f0e-83f3-7f856e58a598\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.330326 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/e251995e-609a-4f0e-83f3-7f856e58a598-log-httpd\") pod \"e251995e-609a-4f0e-83f3-7f856e58a598\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.330478 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e251995e-609a-4f0e-83f3-7f856e58a598-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e251995e-609a-4f0e-83f3-7f856e58a598" (UID: "e251995e-609a-4f0e-83f3-7f856e58a598"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.330494 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-combined-ca-bundle\") pod \"e251995e-609a-4f0e-83f3-7f856e58a598\" (UID: \"e251995e-609a-4f0e-83f3-7f856e58a598\") " Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.331439 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e251995e-609a-4f0e-83f3-7f856e58a598-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e251995e-609a-4f0e-83f3-7f856e58a598" (UID: "e251995e-609a-4f0e-83f3-7f856e58a598"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.332299 4898 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e251995e-609a-4f0e-83f3-7f856e58a598-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.332326 4898 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e251995e-609a-4f0e-83f3-7f856e58a598-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.344564 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-scripts" (OuterVolumeSpecName: "scripts") pod "e251995e-609a-4f0e-83f3-7f856e58a598" (UID: "e251995e-609a-4f0e-83f3-7f856e58a598"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.344942 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e251995e-609a-4f0e-83f3-7f856e58a598-kube-api-access-hjr2t" (OuterVolumeSpecName: "kube-api-access-hjr2t") pod "e251995e-609a-4f0e-83f3-7f856e58a598" (UID: "e251995e-609a-4f0e-83f3-7f856e58a598"). InnerVolumeSpecName "kube-api-access-hjr2t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.357442 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qps2v"] Mar 13 14:22:29 crc kubenswrapper[4898]: W0313 14:22:29.364069 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7eba407c_68a5_45e9_ab51_e8cba05d8559.slice/crio-ef0f6ea172f6bf26e4477b27da206d5a7254b564e5190d1ac27a9d0f8d70d00d WatchSource:0}: Error finding container ef0f6ea172f6bf26e4477b27da206d5a7254b564e5190d1ac27a9d0f8d70d00d: Status 404 returned error can't find the container with id ef0f6ea172f6bf26e4477b27da206d5a7254b564e5190d1ac27a9d0f8d70d00d Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.379362 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e251995e-609a-4f0e-83f3-7f856e58a598" (UID: "e251995e-609a-4f0e-83f3-7f856e58a598"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.435946 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjr2t\" (UniqueName: \"kubernetes.io/projected/e251995e-609a-4f0e-83f3-7f856e58a598-kube-api-access-hjr2t\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.435990 4898 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.436003 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.450883 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e251995e-609a-4f0e-83f3-7f856e58a598" (UID: "e251995e-609a-4f0e-83f3-7f856e58a598"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.506589 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-config-data" (OuterVolumeSpecName: "config-data") pod "e251995e-609a-4f0e-83f3-7f856e58a598" (UID: "e251995e-609a-4f0e-83f3-7f856e58a598"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.538683 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.538728 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e251995e-609a-4f0e-83f3-7f856e58a598-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:29 crc kubenswrapper[4898]: E0313 14:22:29.711239 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="99ddc8edc7229de6f1b448d188e98b54d68182339989edb36f76125f198ea2d9" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 13 14:22:29 crc kubenswrapper[4898]: E0313 14:22:29.713199 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="99ddc8edc7229de6f1b448d188e98b54d68182339989edb36f76125f198ea2d9" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 13 14:22:29 crc kubenswrapper[4898]: E0313 14:22:29.714482 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="99ddc8edc7229de6f1b448d188e98b54d68182339989edb36f76125f198ea2d9" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 13 14:22:29 crc kubenswrapper[4898]: E0313 14:22:29.714512 4898 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
probeType="Readiness" pod="openstack/heat-engine-6b86699784-tf822" podUID="88ab3ad2-782a-4c21-8104-1b80468dbca0" containerName="heat-engine" Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.976382 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e251995e-609a-4f0e-83f3-7f856e58a598","Type":"ContainerDied","Data":"eb628c158bccb3144900d410778ea134ed3ea0eddc185afc79f0a4381f9e188c"} Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.976435 4898 scope.go:117] "RemoveContainer" containerID="f8d1b2554e005cc4ebc5dedc934e5b72d453f242bad5cf70707745dc2f5c1c07" Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.976575 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:22:29 crc kubenswrapper[4898]: I0313 14:22:29.978344 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qps2v" event={"ID":"7eba407c-68a5-45e9-ab51-e8cba05d8559","Type":"ContainerStarted","Data":"ef0f6ea172f6bf26e4477b27da206d5a7254b564e5190d1ac27a9d0f8d70d00d"} Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.001424 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.011989 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.012677 4898 scope.go:117] "RemoveContainer" containerID="003d77a3bae8b5d30451a8b2b210d256e8f645cab42759f96a2cca43d38b49a5" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.030949 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:22:30 crc kubenswrapper[4898]: E0313 14:22:30.031469 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e251995e-609a-4f0e-83f3-7f856e58a598" containerName="sg-core" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.031490 4898 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e251995e-609a-4f0e-83f3-7f856e58a598" containerName="sg-core" Mar 13 14:22:30 crc kubenswrapper[4898]: E0313 14:22:30.031510 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c6a61ba-babd-4bc2-922a-99b00c2af057" containerName="heat-cfnapi" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.031516 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c6a61ba-babd-4bc2-922a-99b00c2af057" containerName="heat-cfnapi" Mar 13 14:22:30 crc kubenswrapper[4898]: E0313 14:22:30.031525 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e251995e-609a-4f0e-83f3-7f856e58a598" containerName="proxy-httpd" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.031531 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e251995e-609a-4f0e-83f3-7f856e58a598" containerName="proxy-httpd" Mar 13 14:22:30 crc kubenswrapper[4898]: E0313 14:22:30.031542 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e251995e-609a-4f0e-83f3-7f856e58a598" containerName="ceilometer-notification-agent" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.031547 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e251995e-609a-4f0e-83f3-7f856e58a598" containerName="ceilometer-notification-agent" Mar 13 14:22:30 crc kubenswrapper[4898]: E0313 14:22:30.031564 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f42d66e-f331-4c05-a4fb-d6208b4493fb" containerName="heat-api" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.031570 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f42d66e-f331-4c05-a4fb-d6208b4493fb" containerName="heat-api" Mar 13 14:22:30 crc kubenswrapper[4898]: E0313 14:22:30.031588 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e251995e-609a-4f0e-83f3-7f856e58a598" containerName="ceilometer-central-agent" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.031594 4898 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e251995e-609a-4f0e-83f3-7f856e58a598" containerName="ceilometer-central-agent" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.031791 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e251995e-609a-4f0e-83f3-7f856e58a598" containerName="ceilometer-notification-agent" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.031808 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e251995e-609a-4f0e-83f3-7f856e58a598" containerName="proxy-httpd" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.031826 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e251995e-609a-4f0e-83f3-7f856e58a598" containerName="sg-core" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.031844 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e251995e-609a-4f0e-83f3-7f856e58a598" containerName="ceilometer-central-agent" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.033918 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.037961 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.038171 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.041039 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.068938 4898 scope.go:117] "RemoveContainer" containerID="d8d1f76b83bf115de46d0b110c2cf03b3ebde454f4675c453b288cf8a3d3f58a" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.109265 4898 scope.go:117] "RemoveContainer" containerID="8aff6294a407bae6a2eb1e2dc4f0f935834538560ddb004e22ea984aea78200b" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.164085 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1458e3e5-908c-4abc-8b47-2b9d08b95100-run-httpd\") pod \"ceilometer-0\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.164376 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-scripts\") pod \"ceilometer-0\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.164506 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx9nv\" (UniqueName: \"kubernetes.io/projected/1458e3e5-908c-4abc-8b47-2b9d08b95100-kube-api-access-bx9nv\") pod \"ceilometer-0\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " 
pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.164552 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1458e3e5-908c-4abc-8b47-2b9d08b95100-log-httpd\") pod \"ceilometer-0\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.164961 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.165204 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-config-data\") pod \"ceilometer-0\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.165942 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.268261 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-config-data\") pod \"ceilometer-0\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.268603 4898 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.268629 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1458e3e5-908c-4abc-8b47-2b9d08b95100-run-httpd\") pod \"ceilometer-0\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.268656 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-scripts\") pod \"ceilometer-0\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.268681 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx9nv\" (UniqueName: \"kubernetes.io/projected/1458e3e5-908c-4abc-8b47-2b9d08b95100-kube-api-access-bx9nv\") pod \"ceilometer-0\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.268703 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1458e3e5-908c-4abc-8b47-2b9d08b95100-log-httpd\") pod \"ceilometer-0\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.268750 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " 
pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.269199 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1458e3e5-908c-4abc-8b47-2b9d08b95100-run-httpd\") pod \"ceilometer-0\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.269450 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1458e3e5-908c-4abc-8b47-2b9d08b95100-log-httpd\") pod \"ceilometer-0\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.274307 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.274357 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.276164 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-scripts\") pod \"ceilometer-0\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.291520 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx9nv\" (UniqueName: 
\"kubernetes.io/projected/1458e3e5-908c-4abc-8b47-2b9d08b95100-kube-api-access-bx9nv\") pod \"ceilometer-0\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.305964 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-config-data\") pod \"ceilometer-0\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.357435 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:22:30 crc kubenswrapper[4898]: I0313 14:22:30.581602 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-z7ldc" podUID="b38f3681-6f2f-437f-9694-810d43921aa2" containerName="registry-server" probeResult="failure" output=< Mar 13 14:22:30 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 14:22:30 crc kubenswrapper[4898]: > Mar 13 14:22:31 crc kubenswrapper[4898]: I0313 14:22:31.061474 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:22:31 crc kubenswrapper[4898]: I0313 14:22:31.743953 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" Mar 13 14:22:31 crc kubenswrapper[4898]: E0313 14:22:31.744600 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:22:31 crc kubenswrapper[4898]: I0313 14:22:31.774031 4898 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e251995e-609a-4f0e-83f3-7f856e58a598" path="/var/lib/kubelet/pods/e251995e-609a-4f0e-83f3-7f856e58a598/volumes" Mar 13 14:22:32 crc kubenswrapper[4898]: I0313 14:22:32.055256 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1458e3e5-908c-4abc-8b47-2b9d08b95100","Type":"ContainerStarted","Data":"1972eec74b79c5eba360234d9352547cca46cb7a3159697f368a323ef349b70f"} Mar 13 14:22:32 crc kubenswrapper[4898]: I0313 14:22:32.055305 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1458e3e5-908c-4abc-8b47-2b9d08b95100","Type":"ContainerStarted","Data":"cfca4a4c856812d6447387ed63f92e7c2d0804ab4a50cac7b00ec3d059ab8f3a"} Mar 13 14:22:33 crc kubenswrapper[4898]: I0313 14:22:33.073161 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1458e3e5-908c-4abc-8b47-2b9d08b95100","Type":"ContainerStarted","Data":"644027345fec83e47e3898f7dbe5b27fcbb0b059335c0f250370318d012d965d"} Mar 13 14:22:33 crc kubenswrapper[4898]: I0313 14:22:33.137785 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:22:33 crc kubenswrapper[4898]: I0313 14:22:33.463126 4898 scope.go:117] "RemoveContainer" containerID="309417dd12bdceaad7cc8574de946b3ecc5729e4fa9390a27c026042338454ac" Mar 13 14:22:34 crc kubenswrapper[4898]: I0313 14:22:34.101657 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1458e3e5-908c-4abc-8b47-2b9d08b95100","Type":"ContainerStarted","Data":"162b1332b5d9a0da2866acb2884fa7bbe465eb4a392a7cf47768395600f46a91"} Mar 13 14:22:35 crc kubenswrapper[4898]: I0313 14:22:35.136335 4898 generic.go:334] "Generic (PLEG): container finished" podID="88ab3ad2-782a-4c21-8104-1b80468dbca0" containerID="99ddc8edc7229de6f1b448d188e98b54d68182339989edb36f76125f198ea2d9" exitCode=0 Mar 13 14:22:35 crc 
kubenswrapper[4898]: I0313 14:22:35.136569 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6b86699784-tf822" event={"ID":"88ab3ad2-782a-4c21-8104-1b80468dbca0","Type":"ContainerDied","Data":"99ddc8edc7229de6f1b448d188e98b54d68182339989edb36f76125f198ea2d9"} Mar 13 14:22:35 crc kubenswrapper[4898]: I0313 14:22:35.466477 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6b86699784-tf822" Mar 13 14:22:35 crc kubenswrapper[4898]: I0313 14:22:35.546849 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88ab3ad2-782a-4c21-8104-1b80468dbca0-combined-ca-bundle\") pod \"88ab3ad2-782a-4c21-8104-1b80468dbca0\" (UID: \"88ab3ad2-782a-4c21-8104-1b80468dbca0\") " Mar 13 14:22:35 crc kubenswrapper[4898]: I0313 14:22:35.547059 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snc8w\" (UniqueName: \"kubernetes.io/projected/88ab3ad2-782a-4c21-8104-1b80468dbca0-kube-api-access-snc8w\") pod \"88ab3ad2-782a-4c21-8104-1b80468dbca0\" (UID: \"88ab3ad2-782a-4c21-8104-1b80468dbca0\") " Mar 13 14:22:35 crc kubenswrapper[4898]: I0313 14:22:35.547095 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88ab3ad2-782a-4c21-8104-1b80468dbca0-config-data\") pod \"88ab3ad2-782a-4c21-8104-1b80468dbca0\" (UID: \"88ab3ad2-782a-4c21-8104-1b80468dbca0\") " Mar 13 14:22:35 crc kubenswrapper[4898]: I0313 14:22:35.547154 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88ab3ad2-782a-4c21-8104-1b80468dbca0-config-data-custom\") pod \"88ab3ad2-782a-4c21-8104-1b80468dbca0\" (UID: \"88ab3ad2-782a-4c21-8104-1b80468dbca0\") " Mar 13 14:22:35 crc kubenswrapper[4898]: I0313 14:22:35.555674 4898 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88ab3ad2-782a-4c21-8104-1b80468dbca0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "88ab3ad2-782a-4c21-8104-1b80468dbca0" (UID: "88ab3ad2-782a-4c21-8104-1b80468dbca0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:35 crc kubenswrapper[4898]: I0313 14:22:35.573618 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88ab3ad2-782a-4c21-8104-1b80468dbca0-kube-api-access-snc8w" (OuterVolumeSpecName: "kube-api-access-snc8w") pod "88ab3ad2-782a-4c21-8104-1b80468dbca0" (UID: "88ab3ad2-782a-4c21-8104-1b80468dbca0"). InnerVolumeSpecName "kube-api-access-snc8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:22:35 crc kubenswrapper[4898]: I0313 14:22:35.599323 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88ab3ad2-782a-4c21-8104-1b80468dbca0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88ab3ad2-782a-4c21-8104-1b80468dbca0" (UID: "88ab3ad2-782a-4c21-8104-1b80468dbca0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:35 crc kubenswrapper[4898]: I0313 14:22:35.650225 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88ab3ad2-782a-4c21-8104-1b80468dbca0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:35 crc kubenswrapper[4898]: I0313 14:22:35.650262 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snc8w\" (UniqueName: \"kubernetes.io/projected/88ab3ad2-782a-4c21-8104-1b80468dbca0-kube-api-access-snc8w\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:35 crc kubenswrapper[4898]: I0313 14:22:35.650278 4898 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88ab3ad2-782a-4c21-8104-1b80468dbca0-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:35 crc kubenswrapper[4898]: I0313 14:22:35.665135 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88ab3ad2-782a-4c21-8104-1b80468dbca0-config-data" (OuterVolumeSpecName: "config-data") pod "88ab3ad2-782a-4c21-8104-1b80468dbca0" (UID: "88ab3ad2-782a-4c21-8104-1b80468dbca0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:35 crc kubenswrapper[4898]: I0313 14:22:35.752600 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88ab3ad2-782a-4c21-8104-1b80468dbca0-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:36 crc kubenswrapper[4898]: I0313 14:22:36.169440 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1458e3e5-908c-4abc-8b47-2b9d08b95100","Type":"ContainerStarted","Data":"30cd19706b9b763b0b2be58e832d11d7fea738624c0288b8cdeff1b0f4c4df2d"} Mar 13 14:22:36 crc kubenswrapper[4898]: I0313 14:22:36.169778 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 14:22:36 crc kubenswrapper[4898]: I0313 14:22:36.169597 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1458e3e5-908c-4abc-8b47-2b9d08b95100" containerName="sg-core" containerID="cri-o://162b1332b5d9a0da2866acb2884fa7bbe465eb4a392a7cf47768395600f46a91" gracePeriod=30 Mar 13 14:22:36 crc kubenswrapper[4898]: I0313 14:22:36.169516 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1458e3e5-908c-4abc-8b47-2b9d08b95100" containerName="ceilometer-central-agent" containerID="cri-o://1972eec74b79c5eba360234d9352547cca46cb7a3159697f368a323ef349b70f" gracePeriod=30 Mar 13 14:22:36 crc kubenswrapper[4898]: I0313 14:22:36.169628 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1458e3e5-908c-4abc-8b47-2b9d08b95100" containerName="ceilometer-notification-agent" containerID="cri-o://644027345fec83e47e3898f7dbe5b27fcbb0b059335c0f250370318d012d965d" gracePeriod=30 Mar 13 14:22:36 crc kubenswrapper[4898]: I0313 14:22:36.169614 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="1458e3e5-908c-4abc-8b47-2b9d08b95100" containerName="proxy-httpd" containerID="cri-o://30cd19706b9b763b0b2be58e832d11d7fea738624c0288b8cdeff1b0f4c4df2d" gracePeriod=30 Mar 13 14:22:36 crc kubenswrapper[4898]: I0313 14:22:36.182577 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6b86699784-tf822" event={"ID":"88ab3ad2-782a-4c21-8104-1b80468dbca0","Type":"ContainerDied","Data":"92091403be154312d0e01cc88ae4975c8ee84b62f142756e9f0eee0701f6969b"} Mar 13 14:22:36 crc kubenswrapper[4898]: I0313 14:22:36.182631 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6b86699784-tf822" Mar 13 14:22:36 crc kubenswrapper[4898]: I0313 14:22:36.182679 4898 scope.go:117] "RemoveContainer" containerID="99ddc8edc7229de6f1b448d188e98b54d68182339989edb36f76125f198ea2d9" Mar 13 14:22:36 crc kubenswrapper[4898]: I0313 14:22:36.210312 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.532069948 podStartE2EDuration="6.210288113s" podCreationTimestamp="2026-03-13 14:22:30 +0000 UTC" firstStartedPulling="2026-03-13 14:22:31.094054539 +0000 UTC m=+1586.095642778" lastFinishedPulling="2026-03-13 14:22:34.772272704 +0000 UTC m=+1589.773860943" observedRunningTime="2026-03-13 14:22:36.195564161 +0000 UTC m=+1591.197152400" watchObservedRunningTime="2026-03-13 14:22:36.210288113 +0000 UTC m=+1591.211876352" Mar 13 14:22:36 crc kubenswrapper[4898]: I0313 14:22:36.241773 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-6b86699784-tf822"] Mar 13 14:22:36 crc kubenswrapper[4898]: I0313 14:22:36.258418 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-6b86699784-tf822"] Mar 13 14:22:37 crc kubenswrapper[4898]: I0313 14:22:37.200130 4898 generic.go:334] "Generic (PLEG): container finished" podID="1458e3e5-908c-4abc-8b47-2b9d08b95100" 
containerID="30cd19706b9b763b0b2be58e832d11d7fea738624c0288b8cdeff1b0f4c4df2d" exitCode=0 Mar 13 14:22:37 crc kubenswrapper[4898]: I0313 14:22:37.200369 4898 generic.go:334] "Generic (PLEG): container finished" podID="1458e3e5-908c-4abc-8b47-2b9d08b95100" containerID="162b1332b5d9a0da2866acb2884fa7bbe465eb4a392a7cf47768395600f46a91" exitCode=2 Mar 13 14:22:37 crc kubenswrapper[4898]: I0313 14:22:37.200377 4898 generic.go:334] "Generic (PLEG): container finished" podID="1458e3e5-908c-4abc-8b47-2b9d08b95100" containerID="644027345fec83e47e3898f7dbe5b27fcbb0b059335c0f250370318d012d965d" exitCode=0 Mar 13 14:22:37 crc kubenswrapper[4898]: I0313 14:22:37.200227 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1458e3e5-908c-4abc-8b47-2b9d08b95100","Type":"ContainerDied","Data":"30cd19706b9b763b0b2be58e832d11d7fea738624c0288b8cdeff1b0f4c4df2d"} Mar 13 14:22:37 crc kubenswrapper[4898]: I0313 14:22:37.200410 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1458e3e5-908c-4abc-8b47-2b9d08b95100","Type":"ContainerDied","Data":"162b1332b5d9a0da2866acb2884fa7bbe465eb4a392a7cf47768395600f46a91"} Mar 13 14:22:37 crc kubenswrapper[4898]: I0313 14:22:37.200424 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1458e3e5-908c-4abc-8b47-2b9d08b95100","Type":"ContainerDied","Data":"644027345fec83e47e3898f7dbe5b27fcbb0b059335c0f250370318d012d965d"} Mar 13 14:22:37 crc kubenswrapper[4898]: I0313 14:22:37.767809 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88ab3ad2-782a-4c21-8104-1b80468dbca0" path="/var/lib/kubelet/pods/88ab3ad2-782a-4c21-8104-1b80468dbca0/volumes" Mar 13 14:22:39 crc kubenswrapper[4898]: I0313 14:22:39.231444 4898 generic.go:334] "Generic (PLEG): container finished" podID="1458e3e5-908c-4abc-8b47-2b9d08b95100" containerID="1972eec74b79c5eba360234d9352547cca46cb7a3159697f368a323ef349b70f" exitCode=0 Mar 
13 14:22:39 crc kubenswrapper[4898]: I0313 14:22:39.231619 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1458e3e5-908c-4abc-8b47-2b9d08b95100","Type":"ContainerDied","Data":"1972eec74b79c5eba360234d9352547cca46cb7a3159697f368a323ef349b70f"} Mar 13 14:22:39 crc kubenswrapper[4898]: I0313 14:22:39.503660 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sgxj9"] Mar 13 14:22:39 crc kubenswrapper[4898]: E0313 14:22:39.504375 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88ab3ad2-782a-4c21-8104-1b80468dbca0" containerName="heat-engine" Mar 13 14:22:39 crc kubenswrapper[4898]: I0313 14:22:39.504397 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="88ab3ad2-782a-4c21-8104-1b80468dbca0" containerName="heat-engine" Mar 13 14:22:39 crc kubenswrapper[4898]: I0313 14:22:39.504698 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="88ab3ad2-782a-4c21-8104-1b80468dbca0" containerName="heat-engine" Mar 13 14:22:39 crc kubenswrapper[4898]: I0313 14:22:39.506935 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sgxj9" Mar 13 14:22:39 crc kubenswrapper[4898]: I0313 14:22:39.521933 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sgxj9"] Mar 13 14:22:39 crc kubenswrapper[4898]: I0313 14:22:39.591262 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t8zq\" (UniqueName: \"kubernetes.io/projected/df5a6baa-ea65-4b79-b73b-2e1707695c41-kube-api-access-8t8zq\") pod \"redhat-marketplace-sgxj9\" (UID: \"df5a6baa-ea65-4b79-b73b-2e1707695c41\") " pod="openshift-marketplace/redhat-marketplace-sgxj9" Mar 13 14:22:39 crc kubenswrapper[4898]: I0313 14:22:39.591332 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df5a6baa-ea65-4b79-b73b-2e1707695c41-utilities\") pod \"redhat-marketplace-sgxj9\" (UID: \"df5a6baa-ea65-4b79-b73b-2e1707695c41\") " pod="openshift-marketplace/redhat-marketplace-sgxj9" Mar 13 14:22:39 crc kubenswrapper[4898]: I0313 14:22:39.591518 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df5a6baa-ea65-4b79-b73b-2e1707695c41-catalog-content\") pod \"redhat-marketplace-sgxj9\" (UID: \"df5a6baa-ea65-4b79-b73b-2e1707695c41\") " pod="openshift-marketplace/redhat-marketplace-sgxj9" Mar 13 14:22:39 crc kubenswrapper[4898]: I0313 14:22:39.693284 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df5a6baa-ea65-4b79-b73b-2e1707695c41-catalog-content\") pod \"redhat-marketplace-sgxj9\" (UID: \"df5a6baa-ea65-4b79-b73b-2e1707695c41\") " pod="openshift-marketplace/redhat-marketplace-sgxj9" Mar 13 14:22:39 crc kubenswrapper[4898]: I0313 14:22:39.693358 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8t8zq\" (UniqueName: \"kubernetes.io/projected/df5a6baa-ea65-4b79-b73b-2e1707695c41-kube-api-access-8t8zq\") pod \"redhat-marketplace-sgxj9\" (UID: \"df5a6baa-ea65-4b79-b73b-2e1707695c41\") " pod="openshift-marketplace/redhat-marketplace-sgxj9" Mar 13 14:22:39 crc kubenswrapper[4898]: I0313 14:22:39.693421 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df5a6baa-ea65-4b79-b73b-2e1707695c41-utilities\") pod \"redhat-marketplace-sgxj9\" (UID: \"df5a6baa-ea65-4b79-b73b-2e1707695c41\") " pod="openshift-marketplace/redhat-marketplace-sgxj9" Mar 13 14:22:39 crc kubenswrapper[4898]: I0313 14:22:39.693916 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df5a6baa-ea65-4b79-b73b-2e1707695c41-catalog-content\") pod \"redhat-marketplace-sgxj9\" (UID: \"df5a6baa-ea65-4b79-b73b-2e1707695c41\") " pod="openshift-marketplace/redhat-marketplace-sgxj9" Mar 13 14:22:39 crc kubenswrapper[4898]: I0313 14:22:39.694070 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df5a6baa-ea65-4b79-b73b-2e1707695c41-utilities\") pod \"redhat-marketplace-sgxj9\" (UID: \"df5a6baa-ea65-4b79-b73b-2e1707695c41\") " pod="openshift-marketplace/redhat-marketplace-sgxj9" Mar 13 14:22:39 crc kubenswrapper[4898]: I0313 14:22:39.718663 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t8zq\" (UniqueName: \"kubernetes.io/projected/df5a6baa-ea65-4b79-b73b-2e1707695c41-kube-api-access-8t8zq\") pod \"redhat-marketplace-sgxj9\" (UID: \"df5a6baa-ea65-4b79-b73b-2e1707695c41\") " pod="openshift-marketplace/redhat-marketplace-sgxj9" Mar 13 14:22:39 crc kubenswrapper[4898]: I0313 14:22:39.856783 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sgxj9" Mar 13 14:22:40 crc kubenswrapper[4898]: I0313 14:22:40.056990 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 14:22:40 crc kubenswrapper[4898]: I0313 14:22:40.057210 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f772c247-f65b-4185-9c75-25d5894ada70" containerName="glance-log" containerID="cri-o://3ed25c8fea8488646787dd34274ccdeef192849b7dfb335966843f92351f741e" gracePeriod=30 Mar 13 14:22:40 crc kubenswrapper[4898]: I0313 14:22:40.057926 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f772c247-f65b-4185-9c75-25d5894ada70" containerName="glance-httpd" containerID="cri-o://53ad13c5a81b4a9991a57956cff297d785da3080bb5eafedd89860a32e28cd6a" gracePeriod=30 Mar 13 14:22:40 crc kubenswrapper[4898]: I0313 14:22:40.275303 4898 generic.go:334] "Generic (PLEG): container finished" podID="f772c247-f65b-4185-9c75-25d5894ada70" containerID="3ed25c8fea8488646787dd34274ccdeef192849b7dfb335966843f92351f741e" exitCode=143 Mar 13 14:22:40 crc kubenswrapper[4898]: I0313 14:22:40.275347 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f772c247-f65b-4185-9c75-25d5894ada70","Type":"ContainerDied","Data":"3ed25c8fea8488646787dd34274ccdeef192849b7dfb335966843f92351f741e"} Mar 13 14:22:40 crc kubenswrapper[4898]: I0313 14:22:40.595608 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-z7ldc" podUID="b38f3681-6f2f-437f-9694-810d43921aa2" containerName="registry-server" probeResult="failure" output=< Mar 13 14:22:40 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 14:22:40 crc kubenswrapper[4898]: > Mar 13 14:22:41 crc kubenswrapper[4898]: I0313 
14:22:41.862556 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 14:22:41 crc kubenswrapper[4898]: I0313 14:22:41.863122 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a8312dc9-a2b4-4ee6-b34f-cb984c14ad21" containerName="glance-log" containerID="cri-o://e53933d3c42586f8c8f9ea54060e1af27f64d65b3366e751904358fe342bcf4d" gracePeriod=30 Mar 13 14:22:41 crc kubenswrapper[4898]: I0313 14:22:41.863657 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a8312dc9-a2b4-4ee6-b34f-cb984c14ad21" containerName="glance-httpd" containerID="cri-o://58217feb23785534a405127b1afdf5fb04709d2e86145617a9cb9b01bf24630f" gracePeriod=30 Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.310761 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qps2v" event={"ID":"7eba407c-68a5-45e9-ab51-e8cba05d8559","Type":"ContainerStarted","Data":"d1ff8d0ca102a074d68ee12cd37ffd04a070a172037a36f9afafa4bd84128371"} Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.321665 4898 generic.go:334] "Generic (PLEG): container finished" podID="a8312dc9-a2b4-4ee6-b34f-cb984c14ad21" containerID="e53933d3c42586f8c8f9ea54060e1af27f64d65b3366e751904358fe342bcf4d" exitCode=143 Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.321708 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21","Type":"ContainerDied","Data":"e53933d3c42586f8c8f9ea54060e1af27f64d65b3366e751904358fe342bcf4d"} Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.343530 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-qps2v" podStartSLOduration=1.68697618 podStartE2EDuration="14.343514068s" 
podCreationTimestamp="2026-03-13 14:22:28 +0000 UTC" firstStartedPulling="2026-03-13 14:22:29.366408989 +0000 UTC m=+1584.367997228" lastFinishedPulling="2026-03-13 14:22:42.022946877 +0000 UTC m=+1597.024535116" observedRunningTime="2026-03-13 14:22:42.326293561 +0000 UTC m=+1597.327881810" watchObservedRunningTime="2026-03-13 14:22:42.343514068 +0000 UTC m=+1597.345102307" Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.351564 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.467849 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-config-data\") pod \"1458e3e5-908c-4abc-8b47-2b9d08b95100\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.468958 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1458e3e5-908c-4abc-8b47-2b9d08b95100-log-httpd\") pod \"1458e3e5-908c-4abc-8b47-2b9d08b95100\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.469024 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-scripts\") pod \"1458e3e5-908c-4abc-8b47-2b9d08b95100\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.469053 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-combined-ca-bundle\") pod \"1458e3e5-908c-4abc-8b47-2b9d08b95100\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.469208 
4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1458e3e5-908c-4abc-8b47-2b9d08b95100-run-httpd\") pod \"1458e3e5-908c-4abc-8b47-2b9d08b95100\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.469306 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx9nv\" (UniqueName: \"kubernetes.io/projected/1458e3e5-908c-4abc-8b47-2b9d08b95100-kube-api-access-bx9nv\") pod \"1458e3e5-908c-4abc-8b47-2b9d08b95100\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.469376 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-sg-core-conf-yaml\") pod \"1458e3e5-908c-4abc-8b47-2b9d08b95100\" (UID: \"1458e3e5-908c-4abc-8b47-2b9d08b95100\") " Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.469831 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1458e3e5-908c-4abc-8b47-2b9d08b95100-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1458e3e5-908c-4abc-8b47-2b9d08b95100" (UID: "1458e3e5-908c-4abc-8b47-2b9d08b95100"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.470234 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1458e3e5-908c-4abc-8b47-2b9d08b95100-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1458e3e5-908c-4abc-8b47-2b9d08b95100" (UID: "1458e3e5-908c-4abc-8b47-2b9d08b95100"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.470522 4898 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1458e3e5-908c-4abc-8b47-2b9d08b95100-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.470539 4898 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1458e3e5-908c-4abc-8b47-2b9d08b95100-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.475466 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1458e3e5-908c-4abc-8b47-2b9d08b95100-kube-api-access-bx9nv" (OuterVolumeSpecName: "kube-api-access-bx9nv") pod "1458e3e5-908c-4abc-8b47-2b9d08b95100" (UID: "1458e3e5-908c-4abc-8b47-2b9d08b95100"). InnerVolumeSpecName "kube-api-access-bx9nv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.482045 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-scripts" (OuterVolumeSpecName: "scripts") pod "1458e3e5-908c-4abc-8b47-2b9d08b95100" (UID: "1458e3e5-908c-4abc-8b47-2b9d08b95100"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.549046 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1458e3e5-908c-4abc-8b47-2b9d08b95100" (UID: "1458e3e5-908c-4abc-8b47-2b9d08b95100"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.580293 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bx9nv\" (UniqueName: \"kubernetes.io/projected/1458e3e5-908c-4abc-8b47-2b9d08b95100-kube-api-access-bx9nv\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.580353 4898 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.580368 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.586007 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1458e3e5-908c-4abc-8b47-2b9d08b95100" (UID: "1458e3e5-908c-4abc-8b47-2b9d08b95100"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.676532 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sgxj9"] Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.683609 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.701161 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-config-data" (OuterVolumeSpecName: "config-data") pod "1458e3e5-908c-4abc-8b47-2b9d08b95100" (UID: "1458e3e5-908c-4abc-8b47-2b9d08b95100"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.740732 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" Mar 13 14:22:42 crc kubenswrapper[4898]: E0313 14:22:42.741037 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:22:42 crc kubenswrapper[4898]: I0313 14:22:42.785878 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1458e3e5-908c-4abc-8b47-2b9d08b95100-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.333336 4898 generic.go:334] "Generic (PLEG): container finished" 
podID="df5a6baa-ea65-4b79-b73b-2e1707695c41" containerID="d9479d77fd8402baf87c3755eb73b852dad59c425ece2529270d3dc4ba5a1606" exitCode=0 Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.333624 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sgxj9" event={"ID":"df5a6baa-ea65-4b79-b73b-2e1707695c41","Type":"ContainerDied","Data":"d9479d77fd8402baf87c3755eb73b852dad59c425ece2529270d3dc4ba5a1606"} Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.333652 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sgxj9" event={"ID":"df5a6baa-ea65-4b79-b73b-2e1707695c41","Type":"ContainerStarted","Data":"c4f29c1c84db31109e4a65fbd939345ced5fa3e54cbf69234fc3c9374e91653f"} Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.337400 4898 generic.go:334] "Generic (PLEG): container finished" podID="f772c247-f65b-4185-9c75-25d5894ada70" containerID="53ad13c5a81b4a9991a57956cff297d785da3080bb5eafedd89860a32e28cd6a" exitCode=0 Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.337478 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f772c247-f65b-4185-9c75-25d5894ada70","Type":"ContainerDied","Data":"53ad13c5a81b4a9991a57956cff297d785da3080bb5eafedd89860a32e28cd6a"} Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.342377 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1458e3e5-908c-4abc-8b47-2b9d08b95100","Type":"ContainerDied","Data":"cfca4a4c856812d6447387ed63f92e7c2d0804ab4a50cac7b00ec3d059ab8f3a"} Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.342457 4898 scope.go:117] "RemoveContainer" containerID="30cd19706b9b763b0b2be58e832d11d7fea738624c0288b8cdeff1b0f4c4df2d" Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.342478 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.390504 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.391030 4898 scope.go:117] "RemoveContainer" containerID="162b1332b5d9a0da2866acb2884fa7bbe465eb4a392a7cf47768395600f46a91" Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.404096 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.430028 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:22:43 crc kubenswrapper[4898]: E0313 14:22:43.430515 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1458e3e5-908c-4abc-8b47-2b9d08b95100" containerName="proxy-httpd" Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.430538 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="1458e3e5-908c-4abc-8b47-2b9d08b95100" containerName="proxy-httpd" Mar 13 14:22:43 crc kubenswrapper[4898]: E0313 14:22:43.430576 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1458e3e5-908c-4abc-8b47-2b9d08b95100" containerName="ceilometer-central-agent" Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.430583 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="1458e3e5-908c-4abc-8b47-2b9d08b95100" containerName="ceilometer-central-agent" Mar 13 14:22:43 crc kubenswrapper[4898]: E0313 14:22:43.430595 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1458e3e5-908c-4abc-8b47-2b9d08b95100" containerName="ceilometer-notification-agent" Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.430601 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="1458e3e5-908c-4abc-8b47-2b9d08b95100" containerName="ceilometer-notification-agent" Mar 13 14:22:43 crc kubenswrapper[4898]: E0313 14:22:43.430621 4898 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="1458e3e5-908c-4abc-8b47-2b9d08b95100" containerName="sg-core" Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.430627 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="1458e3e5-908c-4abc-8b47-2b9d08b95100" containerName="sg-core" Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.430821 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="1458e3e5-908c-4abc-8b47-2b9d08b95100" containerName="sg-core" Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.430842 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="1458e3e5-908c-4abc-8b47-2b9d08b95100" containerName="ceilometer-notification-agent" Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.430853 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="1458e3e5-908c-4abc-8b47-2b9d08b95100" containerName="proxy-httpd" Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.430864 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="1458e3e5-908c-4abc-8b47-2b9d08b95100" containerName="ceilometer-central-agent" Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.435381 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.437657 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.438064 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.451336 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.467015 4898 scope.go:117] "RemoveContainer" containerID="644027345fec83e47e3898f7dbe5b27fcbb0b059335c0f250370318d012d965d" Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.500485 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-scripts\") pod \"ceilometer-0\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " pod="openstack/ceilometer-0" Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.500522 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " pod="openstack/ceilometer-0" Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.500568 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " pod="openstack/ceilometer-0" Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.500624 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-674qw\" (UniqueName: \"kubernetes.io/projected/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-kube-api-access-674qw\") pod \"ceilometer-0\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " pod="openstack/ceilometer-0" Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.500669 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-log-httpd\") pod \"ceilometer-0\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " pod="openstack/ceilometer-0" Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.500755 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-config-data\") pod \"ceilometer-0\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " pod="openstack/ceilometer-0" Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.500832 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-run-httpd\") pod \"ceilometer-0\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " pod="openstack/ceilometer-0" Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.507635 4898 scope.go:117] "RemoveContainer" containerID="1972eec74b79c5eba360234d9352547cca46cb7a3159697f368a323ef349b70f" Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.602958 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-run-httpd\") pod \"ceilometer-0\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " pod="openstack/ceilometer-0" Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.603238 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " pod="openstack/ceilometer-0" Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.603260 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-scripts\") pod \"ceilometer-0\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " pod="openstack/ceilometer-0" Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.603288 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " pod="openstack/ceilometer-0" Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.603374 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-674qw\" (UniqueName: \"kubernetes.io/projected/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-kube-api-access-674qw\") pod \"ceilometer-0\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " pod="openstack/ceilometer-0" Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.603422 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-log-httpd\") pod \"ceilometer-0\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " pod="openstack/ceilometer-0" Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.603577 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-config-data\") pod \"ceilometer-0\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " pod="openstack/ceilometer-0" Mar 13 14:22:43 
crc kubenswrapper[4898]: I0313 14:22:43.604358 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-run-httpd\") pod \"ceilometer-0\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " pod="openstack/ceilometer-0" Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.604673 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-log-httpd\") pod \"ceilometer-0\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " pod="openstack/ceilometer-0" Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.611932 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-config-data\") pod \"ceilometer-0\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " pod="openstack/ceilometer-0" Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.612959 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " pod="openstack/ceilometer-0" Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.621413 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " pod="openstack/ceilometer-0" Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.623815 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-674qw\" (UniqueName: \"kubernetes.io/projected/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-kube-api-access-674qw\") pod \"ceilometer-0\" (UID: 
\"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " pod="openstack/ceilometer-0" Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.635003 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-scripts\") pod \"ceilometer-0\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " pod="openstack/ceilometer-0" Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.764815 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1458e3e5-908c-4abc-8b47-2b9d08b95100" path="/var/lib/kubelet/pods/1458e3e5-908c-4abc-8b47-2b9d08b95100/volumes" Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.767526 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:22:43 crc kubenswrapper[4898]: I0313 14:22:43.892290 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.021132 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\") pod \"f772c247-f65b-4185-9c75-25d5894ada70\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.021282 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f772c247-f65b-4185-9c75-25d5894ada70-logs\") pod \"f772c247-f65b-4185-9c75-25d5894ada70\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.021346 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-scripts\") pod \"f772c247-f65b-4185-9c75-25d5894ada70\" 
(UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.021457 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-config-data\") pod \"f772c247-f65b-4185-9c75-25d5894ada70\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.021499 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-public-tls-certs\") pod \"f772c247-f65b-4185-9c75-25d5894ada70\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.021572 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wpcb\" (UniqueName: \"kubernetes.io/projected/f772c247-f65b-4185-9c75-25d5894ada70-kube-api-access-4wpcb\") pod \"f772c247-f65b-4185-9c75-25d5894ada70\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.021631 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f772c247-f65b-4185-9c75-25d5894ada70-httpd-run\") pod \"f772c247-f65b-4185-9c75-25d5894ada70\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.021665 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-combined-ca-bundle\") pod \"f772c247-f65b-4185-9c75-25d5894ada70\" (UID: \"f772c247-f65b-4185-9c75-25d5894ada70\") " Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.023103 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/f772c247-f65b-4185-9c75-25d5894ada70-logs" (OuterVolumeSpecName: "logs") pod "f772c247-f65b-4185-9c75-25d5894ada70" (UID: "f772c247-f65b-4185-9c75-25d5894ada70"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.023463 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f772c247-f65b-4185-9c75-25d5894ada70-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f772c247-f65b-4185-9c75-25d5894ada70" (UID: "f772c247-f65b-4185-9c75-25d5894ada70"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.031158 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-scripts" (OuterVolumeSpecName: "scripts") pod "f772c247-f65b-4185-9c75-25d5894ada70" (UID: "f772c247-f65b-4185-9c75-25d5894ada70"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.045176 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f772c247-f65b-4185-9c75-25d5894ada70-kube-api-access-4wpcb" (OuterVolumeSpecName: "kube-api-access-4wpcb") pod "f772c247-f65b-4185-9c75-25d5894ada70" (UID: "f772c247-f65b-4185-9c75-25d5894ada70"). InnerVolumeSpecName "kube-api-access-4wpcb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.068317 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7" (OuterVolumeSpecName: "glance") pod "f772c247-f65b-4185-9c75-25d5894ada70" (UID: "f772c247-f65b-4185-9c75-25d5894ada70"). InnerVolumeSpecName "pvc-17b3b094-1a55-406a-a787-e0abb588e5b7". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.081730 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f772c247-f65b-4185-9c75-25d5894ada70" (UID: "f772c247-f65b-4185-9c75-25d5894ada70"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.122917 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f772c247-f65b-4185-9c75-25d5894ada70" (UID: "f772c247-f65b-4185-9c75-25d5894ada70"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.124650 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wpcb\" (UniqueName: \"kubernetes.io/projected/f772c247-f65b-4185-9c75-25d5894ada70-kube-api-access-4wpcb\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.124685 4898 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f772c247-f65b-4185-9c75-25d5894ada70-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.124697 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.124729 4898 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\") on node \"crc\" "
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.124741 4898 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f772c247-f65b-4185-9c75-25d5894ada70-logs\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.124751 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.124760 4898 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.146026 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-config-data" (OuterVolumeSpecName: "config-data") pod "f772c247-f65b-4185-9c75-25d5894ada70" (UID: "f772c247-f65b-4185-9c75-25d5894ada70"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.165174 4898 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.165702 4898 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-17b3b094-1a55-406a-a787-e0abb588e5b7" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7") on node "crc"
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.227091 4898 reconciler_common.go:293] "Volume detached for volume \"pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.227128 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f772c247-f65b-4185-9c75-25d5894ada70-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.340529 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.353661 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37ab1f60-9ee0-4d70-9730-f17c9feafaeb","Type":"ContainerStarted","Data":"bdcbd848029858f4c387dbe27ce9e5d245b65833296202afd31da1583743aa4c"}
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.356542 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f772c247-f65b-4185-9c75-25d5894ada70","Type":"ContainerDied","Data":"ddd1664b14e1ff4c8657d63bc705f6e2cc8530fd54bcfec783c314238117e1e0"}
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.356590 4898 scope.go:117] "RemoveContainer" containerID="53ad13c5a81b4a9991a57956cff297d785da3080bb5eafedd89860a32e28cd6a"
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.356698 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.384199 4898 scope.go:117] "RemoveContainer" containerID="3ed25c8fea8488646787dd34274ccdeef192849b7dfb335966843f92351f741e"
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.405486 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.419820 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.442375 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 13 14:22:44 crc kubenswrapper[4898]: E0313 14:22:44.492693 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f772c247-f65b-4185-9c75-25d5894ada70" containerName="glance-httpd"
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.497194 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f772c247-f65b-4185-9c75-25d5894ada70" containerName="glance-httpd"
Mar 13 14:22:44 crc kubenswrapper[4898]: E0313 14:22:44.497467 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f772c247-f65b-4185-9c75-25d5894ada70" containerName="glance-log"
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.497521 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f772c247-f65b-4185-9c75-25d5894ada70" containerName="glance-log"
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.498877 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f772c247-f65b-4185-9c75-25d5894ada70" containerName="glance-log"
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.499040 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f772c247-f65b-4185-9c75-25d5894ada70" containerName="glance-httpd"
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.503941 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.504243 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.506170 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.507498 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.580364 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.667705 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l2c5\" (UniqueName: \"kubernetes.io/projected/a7cdbc1c-79cc-441b-a08c-c61b717d82c9-kube-api-access-7l2c5\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0"
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.667871 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a7cdbc1c-79cc-441b-a08c-c61b717d82c9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0"
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.667944 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7cdbc1c-79cc-441b-a08c-c61b717d82c9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0"
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.668023 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7cdbc1c-79cc-441b-a08c-c61b717d82c9-scripts\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0"
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.668089 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7cdbc1c-79cc-441b-a08c-c61b717d82c9-config-data\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0"
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.668180 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0"
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.668254 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7cdbc1c-79cc-441b-a08c-c61b717d82c9-logs\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0"
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.668282 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7cdbc1c-79cc-441b-a08c-c61b717d82c9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0"
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.770357 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0"
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.770448 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7cdbc1c-79cc-441b-a08c-c61b717d82c9-logs\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0"
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.770473 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7cdbc1c-79cc-441b-a08c-c61b717d82c9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0"
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.770515 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7l2c5\" (UniqueName: \"kubernetes.io/projected/a7cdbc1c-79cc-441b-a08c-c61b717d82c9-kube-api-access-7l2c5\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0"
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.770548 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a7cdbc1c-79cc-441b-a08c-c61b717d82c9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0"
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.770580 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7cdbc1c-79cc-441b-a08c-c61b717d82c9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0"
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.770629 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7cdbc1c-79cc-441b-a08c-c61b717d82c9-scripts\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0"
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.770674 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7cdbc1c-79cc-441b-a08c-c61b717d82c9-config-data\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0"
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.771671 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7cdbc1c-79cc-441b-a08c-c61b717d82c9-logs\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0"
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.772280 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a7cdbc1c-79cc-441b-a08c-c61b717d82c9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0"
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.775435 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.776156 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1810628263decbcb8d9790a46f0a2a80fe37ecdd6e2a4c05137bd112c0de5f67/globalmount\"" pod="openstack/glance-default-external-api-0"
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.777262 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7cdbc1c-79cc-441b-a08c-c61b717d82c9-scripts\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0"
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.782882 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7cdbc1c-79cc-441b-a08c-c61b717d82c9-config-data\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0"
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.790658 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l2c5\" (UniqueName: \"kubernetes.io/projected/a7cdbc1c-79cc-441b-a08c-c61b717d82c9-kube-api-access-7l2c5\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0"
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.792847 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7cdbc1c-79cc-441b-a08c-c61b717d82c9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0"
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.806675 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7cdbc1c-79cc-441b-a08c-c61b717d82c9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0"
Mar 13 14:22:44 crc kubenswrapper[4898]: I0313 14:22:44.856774 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-17b3b094-1a55-406a-a787-e0abb588e5b7\") pod \"glance-default-external-api-0\" (UID: \"a7cdbc1c-79cc-441b-a08c-c61b717d82c9\") " pod="openstack/glance-default-external-api-0"
Mar 13 14:22:45 crc kubenswrapper[4898]: I0313 14:22:45.137636 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 13 14:22:45 crc kubenswrapper[4898]: I0313 14:22:45.428192 4898 generic.go:334] "Generic (PLEG): container finished" podID="a8312dc9-a2b4-4ee6-b34f-cb984c14ad21" containerID="58217feb23785534a405127b1afdf5fb04709d2e86145617a9cb9b01bf24630f" exitCode=0
Mar 13 14:22:45 crc kubenswrapper[4898]: I0313 14:22:45.428542 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21","Type":"ContainerDied","Data":"58217feb23785534a405127b1afdf5fb04709d2e86145617a9cb9b01bf24630f"}
Mar 13 14:22:45 crc kubenswrapper[4898]: I0313 14:22:45.444523 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37ab1f60-9ee0-4d70-9730-f17c9feafaeb","Type":"ContainerStarted","Data":"fbf266f2bbef4b4dd4d9d82590e547f43077a49d8aea2c8e5166465ede160624"}
Mar 13 14:22:45 crc kubenswrapper[4898]: I0313 14:22:45.447371 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sgxj9" event={"ID":"df5a6baa-ea65-4b79-b73b-2e1707695c41","Type":"ContainerStarted","Data":"dbe94dc617bb9f1ceb20f3a1398ada0e60107b89371dd59bf5d3b153a002dfad"}
Mar 13 14:22:45 crc kubenswrapper[4898]: I0313 14:22:45.762462 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f772c247-f65b-4185-9c75-25d5894ada70" path="/var/lib/kubelet/pods/f772c247-f65b-4185-9c75-25d5894ada70/volumes"
Mar 13 14:22:45 crc kubenswrapper[4898]: I0313 14:22:45.923662 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.013081 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-internal-tls-certs\") pod \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") "
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.013417 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-httpd-run\") pod \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") "
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.013449 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9fs8\" (UniqueName: \"kubernetes.io/projected/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-kube-api-access-n9fs8\") pod \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") "
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.014623 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a8312dc9-a2b4-4ee6-b34f-cb984c14ad21" (UID: "a8312dc9-a2b4-4ee6-b34f-cb984c14ad21"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.024256 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-kube-api-access-n9fs8" (OuterVolumeSpecName: "kube-api-access-n9fs8") pod "a8312dc9-a2b4-4ee6-b34f-cb984c14ad21" (UID: "a8312dc9-a2b4-4ee6-b34f-cb984c14ad21"). InnerVolumeSpecName "kube-api-access-n9fs8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.042527 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\") pod \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") "
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.042594 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-config-data\") pod \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") "
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.042648 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-scripts\") pod \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") "
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.042697 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-logs\") pod \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") "
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.042834 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-combined-ca-bundle\") pod \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\" (UID: \"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21\") "
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.044031 4898 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.044049 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9fs8\" (UniqueName: \"kubernetes.io/projected/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-kube-api-access-n9fs8\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.044327 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-logs" (OuterVolumeSpecName: "logs") pod "a8312dc9-a2b4-4ee6-b34f-cb984c14ad21" (UID: "a8312dc9-a2b4-4ee6-b34f-cb984c14ad21"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.063217 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.082096 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-scripts" (OuterVolumeSpecName: "scripts") pod "a8312dc9-a2b4-4ee6-b34f-cb984c14ad21" (UID: "a8312dc9-a2b4-4ee6-b34f-cb984c14ad21"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.146263 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.146291 4898 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-logs\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.177943 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3" (OuterVolumeSpecName: "glance") pod "a8312dc9-a2b4-4ee6-b34f-cb984c14ad21" (UID: "a8312dc9-a2b4-4ee6-b34f-cb984c14ad21"). InnerVolumeSpecName "pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.180943 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a8312dc9-a2b4-4ee6-b34f-cb984c14ad21" (UID: "a8312dc9-a2b4-4ee6-b34f-cb984c14ad21"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.193009 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-config-data" (OuterVolumeSpecName: "config-data") pod "a8312dc9-a2b4-4ee6-b34f-cb984c14ad21" (UID: "a8312dc9-a2b4-4ee6-b34f-cb984c14ad21"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.195286 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a8312dc9-a2b4-4ee6-b34f-cb984c14ad21" (UID: "a8312dc9-a2b4-4ee6-b34f-cb984c14ad21"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.248497 4898 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\") on node \"crc\" "
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.248536 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.248547 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.248558 4898 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.291920 4898 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.292069 4898 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3") on node "crc"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.351156 4898 reconciler_common.go:293] "Volume detached for volume \"pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\") on node \"crc\" DevicePath \"\""
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.461678 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a8312dc9-a2b4-4ee6-b34f-cb984c14ad21","Type":"ContainerDied","Data":"8cae8ab663d1286eb25519e52a961629328c7b2280f2d0209ef1752767b57b37"}
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.461718 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.461762 4898 scope.go:117] "RemoveContainer" containerID="58217feb23785534a405127b1afdf5fb04709d2e86145617a9cb9b01bf24630f"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.464249 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a7cdbc1c-79cc-441b-a08c-c61b717d82c9","Type":"ContainerStarted","Data":"a3bce51beac47756082a84fa10e108e6da1b8f25f5395f721d74dd749bc49b65"}
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.468170 4898 generic.go:334] "Generic (PLEG): container finished" podID="df5a6baa-ea65-4b79-b73b-2e1707695c41" containerID="dbe94dc617bb9f1ceb20f3a1398ada0e60107b89371dd59bf5d3b153a002dfad" exitCode=0
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.468221 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sgxj9" event={"ID":"df5a6baa-ea65-4b79-b73b-2e1707695c41","Type":"ContainerDied","Data":"dbe94dc617bb9f1ceb20f3a1398ada0e60107b89371dd59bf5d3b153a002dfad"}
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.520433 4898 scope.go:117] "RemoveContainer" containerID="e53933d3c42586f8c8f9ea54060e1af27f64d65b3366e751904358fe342bcf4d"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.576474 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.576527 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.576542 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 13 14:22:46 crc kubenswrapper[4898]: E0313 14:22:46.577066 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8312dc9-a2b4-4ee6-b34f-cb984c14ad21" containerName="glance-httpd"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.577080 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8312dc9-a2b4-4ee6-b34f-cb984c14ad21" containerName="glance-httpd"
Mar 13 14:22:46 crc kubenswrapper[4898]: E0313 14:22:46.577096 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8312dc9-a2b4-4ee6-b34f-cb984c14ad21" containerName="glance-log"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.577104 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8312dc9-a2b4-4ee6-b34f-cb984c14ad21" containerName="glance-log"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.577381 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8312dc9-a2b4-4ee6-b34f-cb984c14ad21" containerName="glance-log"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.577412 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8312dc9-a2b4-4ee6-b34f-cb984c14ad21" containerName="glance-httpd"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.578683 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.590201 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.599347 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.599539 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.701193 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f666d519-2c39-4e93-823d-e5a3fcfd0d5a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.701270 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.701388 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f666d519-2c39-4e93-823d-e5a3fcfd0d5a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.701436 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f666d519-2c39-4e93-823d-e5a3fcfd0d5a-logs\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.701525 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f666d519-2c39-4e93-823d-e5a3fcfd0d5a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.701620 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f666d519-2c39-4e93-823d-e5a3fcfd0d5a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.701683 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f666d519-2c39-4e93-823d-e5a3fcfd0d5a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.701730 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qz6d\" (UniqueName: \"kubernetes.io/projected/f666d519-2c39-4e93-823d-e5a3fcfd0d5a-kube-api-access-8qz6d\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.803993 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f666d519-2c39-4e93-823d-e5a3fcfd0d5a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.804088 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f666d519-2c39-4e93-823d-e5a3fcfd0d5a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.804124 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qz6d\" (UniqueName: \"kubernetes.io/projected/f666d519-2c39-4e93-823d-e5a3fcfd0d5a-kube-api-access-8qz6d\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.804213 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f666d519-2c39-4e93-823d-e5a3fcfd0d5a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.804233 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0"
Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.804315 4898
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f666d519-2c39-4e93-823d-e5a3fcfd0d5a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.804371 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f666d519-2c39-4e93-823d-e5a3fcfd0d5a-logs\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.804427 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f666d519-2c39-4e93-823d-e5a3fcfd0d5a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.806078 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f666d519-2c39-4e93-823d-e5a3fcfd0d5a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.811212 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f666d519-2c39-4e93-823d-e5a3fcfd0d5a-logs\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.813773 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f666d519-2c39-4e93-823d-e5a3fcfd0d5a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.814983 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f666d519-2c39-4e93-823d-e5a3fcfd0d5a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.815567 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.815590 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e32bfaa798492a506ddfd6dd81603c6b252f4a286c98ba8256226389647f45c3/globalmount\"" pod="openstack/glance-default-internal-api-0" Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.831642 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f666d519-2c39-4e93-823d-e5a3fcfd0d5a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.832186 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qz6d\" (UniqueName: 
\"kubernetes.io/projected/f666d519-2c39-4e93-823d-e5a3fcfd0d5a-kube-api-access-8qz6d\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.833036 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f666d519-2c39-4e93-823d-e5a3fcfd0d5a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.901450 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-383f3fb8-35dc-45bb-8ddf-48d4bf238bf3\") pod \"glance-default-internal-api-0\" (UID: \"f666d519-2c39-4e93-823d-e5a3fcfd0d5a\") " pod="openstack/glance-default-internal-api-0" Mar 13 14:22:46 crc kubenswrapper[4898]: I0313 14:22:46.923441 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 14:22:47 crc kubenswrapper[4898]: I0313 14:22:47.502862 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sgxj9" event={"ID":"df5a6baa-ea65-4b79-b73b-2e1707695c41","Type":"ContainerStarted","Data":"6ec8603c187d66f48cc10e4727d771d9b7fbf021f18024c4db05bf6b399eed38"} Mar 13 14:22:47 crc kubenswrapper[4898]: I0313 14:22:47.515480 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37ab1f60-9ee0-4d70-9730-f17c9feafaeb","Type":"ContainerStarted","Data":"3952d121d2f30ecb7494684374560553b6245f9837c68c02b65c84ff2b6971de"} Mar 13 14:22:47 crc kubenswrapper[4898]: I0313 14:22:47.515825 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37ab1f60-9ee0-4d70-9730-f17c9feafaeb","Type":"ContainerStarted","Data":"b8d3215be81e811287d32ec5a152d8a9105ea0bd86eb83c702bc5a2d79713e00"} Mar 13 14:22:47 crc kubenswrapper[4898]: I0313 14:22:47.529113 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a7cdbc1c-79cc-441b-a08c-c61b717d82c9","Type":"ContainerStarted","Data":"4265dd0507c2704781116d72355561e591bf0123855e74ed893f542adfdb719b"} Mar 13 14:22:47 crc kubenswrapper[4898]: I0313 14:22:47.535659 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sgxj9" podStartSLOduration=4.85974153 podStartE2EDuration="8.535641134s" podCreationTimestamp="2026-03-13 14:22:39 +0000 UTC" firstStartedPulling="2026-03-13 14:22:43.336176708 +0000 UTC m=+1598.337764947" lastFinishedPulling="2026-03-13 14:22:47.012076322 +0000 UTC m=+1602.013664551" observedRunningTime="2026-03-13 14:22:47.525132131 +0000 UTC m=+1602.526720370" watchObservedRunningTime="2026-03-13 14:22:47.535641134 +0000 UTC m=+1602.537229373" Mar 13 14:22:47 crc kubenswrapper[4898]: I0313 
14:22:47.607590 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 14:22:47 crc kubenswrapper[4898]: I0313 14:22:47.752546 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8312dc9-a2b4-4ee6-b34f-cb984c14ad21" path="/var/lib/kubelet/pods/a8312dc9-a2b4-4ee6-b34f-cb984c14ad21/volumes" Mar 13 14:22:48 crc kubenswrapper[4898]: I0313 14:22:48.548854 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a7cdbc1c-79cc-441b-a08c-c61b717d82c9","Type":"ContainerStarted","Data":"66c8a09078700facbed2482f14fad615c3281038c45bea626f2eb1f6fe913934"} Mar 13 14:22:48 crc kubenswrapper[4898]: I0313 14:22:48.561599 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f666d519-2c39-4e93-823d-e5a3fcfd0d5a","Type":"ContainerStarted","Data":"c71998f7e85a12ae7fd0bfb2e9d6340902dbc3212be784e3b0a7b3bb1eb85daa"} Mar 13 14:22:48 crc kubenswrapper[4898]: I0313 14:22:48.561650 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f666d519-2c39-4e93-823d-e5a3fcfd0d5a","Type":"ContainerStarted","Data":"d23acfbc3ba71b9594b68688692fbc4ddeaf3360f98c7f712571a80019e0d00c"} Mar 13 14:22:49 crc kubenswrapper[4898]: I0313 14:22:49.572660 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-z7ldc" Mar 13 14:22:49 crc kubenswrapper[4898]: I0313 14:22:49.573195 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37ab1f60-9ee0-4d70-9730-f17c9feafaeb","Type":"ContainerStarted","Data":"322d94f85c75d789289c53ccffadace35b38b23785b40dc3303f2046c979a20d"} Mar 13 14:22:49 crc kubenswrapper[4898]: I0313 14:22:49.573255 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 14:22:49 crc 
kubenswrapper[4898]: I0313 14:22:49.573255 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="37ab1f60-9ee0-4d70-9730-f17c9feafaeb" containerName="ceilometer-central-agent" containerID="cri-o://fbf266f2bbef4b4dd4d9d82590e547f43077a49d8aea2c8e5166465ede160624" gracePeriod=30 Mar 13 14:22:49 crc kubenswrapper[4898]: I0313 14:22:49.573294 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="37ab1f60-9ee0-4d70-9730-f17c9feafaeb" containerName="proxy-httpd" containerID="cri-o://322d94f85c75d789289c53ccffadace35b38b23785b40dc3303f2046c979a20d" gracePeriod=30 Mar 13 14:22:49 crc kubenswrapper[4898]: I0313 14:22:49.573270 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="37ab1f60-9ee0-4d70-9730-f17c9feafaeb" containerName="ceilometer-notification-agent" containerID="cri-o://b8d3215be81e811287d32ec5a152d8a9105ea0bd86eb83c702bc5a2d79713e00" gracePeriod=30 Mar 13 14:22:49 crc kubenswrapper[4898]: I0313 14:22:49.573415 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="37ab1f60-9ee0-4d70-9730-f17c9feafaeb" containerName="sg-core" containerID="cri-o://3952d121d2f30ecb7494684374560553b6245f9837c68c02b65c84ff2b6971de" gracePeriod=30 Mar 13 14:22:49 crc kubenswrapper[4898]: I0313 14:22:49.576039 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f666d519-2c39-4e93-823d-e5a3fcfd0d5a","Type":"ContainerStarted","Data":"edd7f8a46e26c03cdc0e5e2c01acb1a698e0cea2d01ef6aedd55274949e181d8"} Mar 13 14:22:49 crc kubenswrapper[4898]: I0313 14:22:49.593332 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.593310449 podStartE2EDuration="5.593310449s" podCreationTimestamp="2026-03-13 14:22:44 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:22:48.574872012 +0000 UTC m=+1603.576460261" watchObservedRunningTime="2026-03-13 14:22:49.593310449 +0000 UTC m=+1604.594898688" Mar 13 14:22:49 crc kubenswrapper[4898]: I0313 14:22:49.654390 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.654371325 podStartE2EDuration="3.654371325s" podCreationTimestamp="2026-03-13 14:22:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:22:49.625470734 +0000 UTC m=+1604.627058993" watchObservedRunningTime="2026-03-13 14:22:49.654371325 +0000 UTC m=+1604.655959564" Mar 13 14:22:49 crc kubenswrapper[4898]: I0313 14:22:49.671442 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-z7ldc" Mar 13 14:22:49 crc kubenswrapper[4898]: I0313 14:22:49.714264 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.042093541 podStartE2EDuration="6.714246509s" podCreationTimestamp="2026-03-13 14:22:43 +0000 UTC" firstStartedPulling="2026-03-13 14:22:44.335485089 +0000 UTC m=+1599.337073328" lastFinishedPulling="2026-03-13 14:22:49.007638057 +0000 UTC m=+1604.009226296" observedRunningTime="2026-03-13 14:22:49.64919448 +0000 UTC m=+1604.650782739" watchObservedRunningTime="2026-03-13 14:22:49.714246509 +0000 UTC m=+1604.715834748" Mar 13 14:22:49 crc kubenswrapper[4898]: I0313 14:22:49.817017 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z7ldc"] Mar 13 14:22:49 crc kubenswrapper[4898]: I0313 14:22:49.860990 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sgxj9" Mar 13 
14:22:49 crc kubenswrapper[4898]: I0313 14:22:49.861240 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sgxj9" Mar 13 14:22:49 crc kubenswrapper[4898]: I0313 14:22:49.924432 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sgxj9" Mar 13 14:22:50 crc kubenswrapper[4898]: I0313 14:22:50.589922 4898 generic.go:334] "Generic (PLEG): container finished" podID="37ab1f60-9ee0-4d70-9730-f17c9feafaeb" containerID="322d94f85c75d789289c53ccffadace35b38b23785b40dc3303f2046c979a20d" exitCode=0 Mar 13 14:22:50 crc kubenswrapper[4898]: I0313 14:22:50.590223 4898 generic.go:334] "Generic (PLEG): container finished" podID="37ab1f60-9ee0-4d70-9730-f17c9feafaeb" containerID="3952d121d2f30ecb7494684374560553b6245f9837c68c02b65c84ff2b6971de" exitCode=2 Mar 13 14:22:50 crc kubenswrapper[4898]: I0313 14:22:50.590236 4898 generic.go:334] "Generic (PLEG): container finished" podID="37ab1f60-9ee0-4d70-9730-f17c9feafaeb" containerID="b8d3215be81e811287d32ec5a152d8a9105ea0bd86eb83c702bc5a2d79713e00" exitCode=0 Mar 13 14:22:50 crc kubenswrapper[4898]: I0313 14:22:50.591365 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37ab1f60-9ee0-4d70-9730-f17c9feafaeb","Type":"ContainerDied","Data":"322d94f85c75d789289c53ccffadace35b38b23785b40dc3303f2046c979a20d"} Mar 13 14:22:50 crc kubenswrapper[4898]: I0313 14:22:50.591410 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37ab1f60-9ee0-4d70-9730-f17c9feafaeb","Type":"ContainerDied","Data":"3952d121d2f30ecb7494684374560553b6245f9837c68c02b65c84ff2b6971de"} Mar 13 14:22:50 crc kubenswrapper[4898]: I0313 14:22:50.591427 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"37ab1f60-9ee0-4d70-9730-f17c9feafaeb","Type":"ContainerDied","Data":"b8d3215be81e811287d32ec5a152d8a9105ea0bd86eb83c702bc5a2d79713e00"} Mar 13 14:22:51 crc kubenswrapper[4898]: I0313 14:22:51.603075 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-z7ldc" podUID="b38f3681-6f2f-437f-9694-810d43921aa2" containerName="registry-server" containerID="cri-o://3cb8ede7b2e1e9a6a6b4976b023c84903fe921e5d4e530d62aef69fe59b03a0a" gracePeriod=2 Mar 13 14:22:52 crc kubenswrapper[4898]: I0313 14:22:52.614668 4898 generic.go:334] "Generic (PLEG): container finished" podID="b38f3681-6f2f-437f-9694-810d43921aa2" containerID="3cb8ede7b2e1e9a6a6b4976b023c84903fe921e5d4e530d62aef69fe59b03a0a" exitCode=0 Mar 13 14:22:52 crc kubenswrapper[4898]: I0313 14:22:52.616029 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7ldc" event={"ID":"b38f3681-6f2f-437f-9694-810d43921aa2","Type":"ContainerDied","Data":"3cb8ede7b2e1e9a6a6b4976b023c84903fe921e5d4e530d62aef69fe59b03a0a"} Mar 13 14:22:52 crc kubenswrapper[4898]: I0313 14:22:52.782760 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z7ldc" Mar 13 14:22:52 crc kubenswrapper[4898]: I0313 14:22:52.863734 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b38f3681-6f2f-437f-9694-810d43921aa2-catalog-content\") pod \"b38f3681-6f2f-437f-9694-810d43921aa2\" (UID: \"b38f3681-6f2f-437f-9694-810d43921aa2\") " Mar 13 14:22:52 crc kubenswrapper[4898]: I0313 14:22:52.863812 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2469\" (UniqueName: \"kubernetes.io/projected/b38f3681-6f2f-437f-9694-810d43921aa2-kube-api-access-v2469\") pod \"b38f3681-6f2f-437f-9694-810d43921aa2\" (UID: \"b38f3681-6f2f-437f-9694-810d43921aa2\") " Mar 13 14:22:52 crc kubenswrapper[4898]: I0313 14:22:52.864180 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b38f3681-6f2f-437f-9694-810d43921aa2-utilities\") pod \"b38f3681-6f2f-437f-9694-810d43921aa2\" (UID: \"b38f3681-6f2f-437f-9694-810d43921aa2\") " Mar 13 14:22:52 crc kubenswrapper[4898]: I0313 14:22:52.864922 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b38f3681-6f2f-437f-9694-810d43921aa2-utilities" (OuterVolumeSpecName: "utilities") pod "b38f3681-6f2f-437f-9694-810d43921aa2" (UID: "b38f3681-6f2f-437f-9694-810d43921aa2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:22:52 crc kubenswrapper[4898]: I0313 14:22:52.881601 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b38f3681-6f2f-437f-9694-810d43921aa2-kube-api-access-v2469" (OuterVolumeSpecName: "kube-api-access-v2469") pod "b38f3681-6f2f-437f-9694-810d43921aa2" (UID: "b38f3681-6f2f-437f-9694-810d43921aa2"). InnerVolumeSpecName "kube-api-access-v2469". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:22:52 crc kubenswrapper[4898]: I0313 14:22:52.967367 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2469\" (UniqueName: \"kubernetes.io/projected/b38f3681-6f2f-437f-9694-810d43921aa2-kube-api-access-v2469\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:52 crc kubenswrapper[4898]: I0313 14:22:52.967401 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b38f3681-6f2f-437f-9694-810d43921aa2-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:52 crc kubenswrapper[4898]: I0313 14:22:52.989019 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b38f3681-6f2f-437f-9694-810d43921aa2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b38f3681-6f2f-437f-9694-810d43921aa2" (UID: "b38f3681-6f2f-437f-9694-810d43921aa2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:22:53 crc kubenswrapper[4898]: I0313 14:22:53.071756 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b38f3681-6f2f-437f-9694-810d43921aa2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:53 crc kubenswrapper[4898]: I0313 14:22:53.628948 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7ldc" event={"ID":"b38f3681-6f2f-437f-9694-810d43921aa2","Type":"ContainerDied","Data":"14f6fca3b948b0af4024656eeff03f1e8c8fe427562c97293c95f1bdd3284d24"} Mar 13 14:22:53 crc kubenswrapper[4898]: I0313 14:22:53.629056 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z7ldc" Mar 13 14:22:53 crc kubenswrapper[4898]: I0313 14:22:53.630417 4898 scope.go:117] "RemoveContainer" containerID="3cb8ede7b2e1e9a6a6b4976b023c84903fe921e5d4e530d62aef69fe59b03a0a" Mar 13 14:22:53 crc kubenswrapper[4898]: I0313 14:22:53.663140 4898 scope.go:117] "RemoveContainer" containerID="c5557297cdf8c10622794d499e8dd04fb952d7400a53b9ebeaf103b83d901e50" Mar 13 14:22:53 crc kubenswrapper[4898]: I0313 14:22:53.675614 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z7ldc"] Mar 13 14:22:53 crc kubenswrapper[4898]: I0313 14:22:53.686435 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-z7ldc"] Mar 13 14:22:53 crc kubenswrapper[4898]: I0313 14:22:53.708824 4898 scope.go:117] "RemoveContainer" containerID="d6ed263f1fe660123646c8c6128f780dbe747c9b3a543fa08475d3acfc1517d5" Mar 13 14:22:53 crc kubenswrapper[4898]: I0313 14:22:53.769102 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b38f3681-6f2f-437f-9694-810d43921aa2" path="/var/lib/kubelet/pods/b38f3681-6f2f-437f-9694-810d43921aa2/volumes" Mar 13 14:22:55 crc kubenswrapper[4898]: I0313 14:22:55.137867 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 13 14:22:55 crc kubenswrapper[4898]: I0313 14:22:55.138197 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 13 14:22:55 crc kubenswrapper[4898]: I0313 14:22:55.185628 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 13 14:22:55 crc kubenswrapper[4898]: I0313 14:22:55.224175 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 13 14:22:55 crc kubenswrapper[4898]: I0313 
14:22:55.652142 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 13 14:22:55 crc kubenswrapper[4898]: I0313 14:22:55.652187 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 13 14:22:55 crc kubenswrapper[4898]: I0313 14:22:55.752549 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" Mar 13 14:22:55 crc kubenswrapper[4898]: E0313 14:22:55.752836 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:22:56 crc kubenswrapper[4898]: I0313 14:22:56.668871 4898 generic.go:334] "Generic (PLEG): container finished" podID="7eba407c-68a5-45e9-ab51-e8cba05d8559" containerID="d1ff8d0ca102a074d68ee12cd37ffd04a070a172037a36f9afafa4bd84128371" exitCode=0 Mar 13 14:22:56 crc kubenswrapper[4898]: I0313 14:22:56.668935 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qps2v" event={"ID":"7eba407c-68a5-45e9-ab51-e8cba05d8559","Type":"ContainerDied","Data":"d1ff8d0ca102a074d68ee12cd37ffd04a070a172037a36f9afafa4bd84128371"} Mar 13 14:22:56 crc kubenswrapper[4898]: I0313 14:22:56.924530 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 13 14:22:56 crc kubenswrapper[4898]: I0313 14:22:56.924598 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 13 14:22:56 crc kubenswrapper[4898]: I0313 14:22:56.973396 4898 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 13 14:22:56 crc kubenswrapper[4898]: I0313 14:22:56.976562 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 13 14:22:57 crc kubenswrapper[4898]: I0313 14:22:57.681719 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 13 14:22:57 crc kubenswrapper[4898]: I0313 14:22:57.682046 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.214937 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qps2v" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.311593 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7eba407c-68a5-45e9-ab51-e8cba05d8559-scripts\") pod \"7eba407c-68a5-45e9-ab51-e8cba05d8559\" (UID: \"7eba407c-68a5-45e9-ab51-e8cba05d8559\") " Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.311912 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eba407c-68a5-45e9-ab51-e8cba05d8559-combined-ca-bundle\") pod \"7eba407c-68a5-45e9-ab51-e8cba05d8559\" (UID: \"7eba407c-68a5-45e9-ab51-e8cba05d8559\") " Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.311985 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8c7h\" (UniqueName: \"kubernetes.io/projected/7eba407c-68a5-45e9-ab51-e8cba05d8559-kube-api-access-x8c7h\") pod \"7eba407c-68a5-45e9-ab51-e8cba05d8559\" (UID: \"7eba407c-68a5-45e9-ab51-e8cba05d8559\") " Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.312136 4898 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eba407c-68a5-45e9-ab51-e8cba05d8559-config-data\") pod \"7eba407c-68a5-45e9-ab51-e8cba05d8559\" (UID: \"7eba407c-68a5-45e9-ab51-e8cba05d8559\") " Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.318078 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eba407c-68a5-45e9-ab51-e8cba05d8559-scripts" (OuterVolumeSpecName: "scripts") pod "7eba407c-68a5-45e9-ab51-e8cba05d8559" (UID: "7eba407c-68a5-45e9-ab51-e8cba05d8559"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.381084 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7eba407c-68a5-45e9-ab51-e8cba05d8559-kube-api-access-x8c7h" (OuterVolumeSpecName: "kube-api-access-x8c7h") pod "7eba407c-68a5-45e9-ab51-e8cba05d8559" (UID: "7eba407c-68a5-45e9-ab51-e8cba05d8559"). InnerVolumeSpecName "kube-api-access-x8c7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.402976 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eba407c-68a5-45e9-ab51-e8cba05d8559-config-data" (OuterVolumeSpecName: "config-data") pod "7eba407c-68a5-45e9-ab51-e8cba05d8559" (UID: "7eba407c-68a5-45e9-ab51-e8cba05d8559"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.418280 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8c7h\" (UniqueName: \"kubernetes.io/projected/7eba407c-68a5-45e9-ab51-e8cba05d8559-kube-api-access-x8c7h\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.418319 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eba407c-68a5-45e9-ab51-e8cba05d8559-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.418332 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7eba407c-68a5-45e9-ab51-e8cba05d8559-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.439933 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eba407c-68a5-45e9-ab51-e8cba05d8559-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7eba407c-68a5-45e9-ab51-e8cba05d8559" (UID: "7eba407c-68a5-45e9-ab51-e8cba05d8559"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.520756 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eba407c-68a5-45e9-ab51-e8cba05d8559-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.579141 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.694576 4898 generic.go:334] "Generic (PLEG): container finished" podID="37ab1f60-9ee0-4d70-9730-f17c9feafaeb" containerID="fbf266f2bbef4b4dd4d9d82590e547f43077a49d8aea2c8e5166465ede160624" exitCode=0 Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.694638 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37ab1f60-9ee0-4d70-9730-f17c9feafaeb","Type":"ContainerDied","Data":"fbf266f2bbef4b4dd4d9d82590e547f43077a49d8aea2c8e5166465ede160624"} Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.694665 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"37ab1f60-9ee0-4d70-9730-f17c9feafaeb","Type":"ContainerDied","Data":"bdcbd848029858f4c387dbe27ce9e5d245b65833296202afd31da1583743aa4c"} Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.694684 4898 scope.go:117] "RemoveContainer" containerID="322d94f85c75d789289c53ccffadace35b38b23785b40dc3303f2046c979a20d" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.694860 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.696582 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qps2v" event={"ID":"7eba407c-68a5-45e9-ab51-e8cba05d8559","Type":"ContainerDied","Data":"ef0f6ea172f6bf26e4477b27da206d5a7254b564e5190d1ac27a9d0f8d70d00d"} Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.696610 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef0f6ea172f6bf26e4477b27da206d5a7254b564e5190d1ac27a9d0f8d70d00d" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.696644 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qps2v" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.722192 4898 scope.go:117] "RemoveContainer" containerID="3952d121d2f30ecb7494684374560553b6245f9837c68c02b65c84ff2b6971de" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.726110 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-sg-core-conf-yaml\") pod \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.726406 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-config-data\") pod \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.726529 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-run-httpd\") pod \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.726599 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-combined-ca-bundle\") pod \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.726715 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-674qw\" (UniqueName: \"kubernetes.io/projected/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-kube-api-access-674qw\") pod \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\" (UID: 
\"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.727696 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-scripts\") pod \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.727798 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-log-httpd\") pod \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\" (UID: \"37ab1f60-9ee0-4d70-9730-f17c9feafaeb\") " Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.730021 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "37ab1f60-9ee0-4d70-9730-f17c9feafaeb" (UID: "37ab1f60-9ee0-4d70-9730-f17c9feafaeb"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.735753 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "37ab1f60-9ee0-4d70-9730-f17c9feafaeb" (UID: "37ab1f60-9ee0-4d70-9730-f17c9feafaeb"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.746156 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-scripts" (OuterVolumeSpecName: "scripts") pod "37ab1f60-9ee0-4d70-9730-f17c9feafaeb" (UID: "37ab1f60-9ee0-4d70-9730-f17c9feafaeb"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.752467 4898 scope.go:117] "RemoveContainer" containerID="b8d3215be81e811287d32ec5a152d8a9105ea0bd86eb83c702bc5a2d79713e00" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.759952 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-kube-api-access-674qw" (OuterVolumeSpecName: "kube-api-access-674qw") pod "37ab1f60-9ee0-4d70-9730-f17c9feafaeb" (UID: "37ab1f60-9ee0-4d70-9730-f17c9feafaeb"). InnerVolumeSpecName "kube-api-access-674qw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.782965 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "37ab1f60-9ee0-4d70-9730-f17c9feafaeb" (UID: "37ab1f60-9ee0-4d70-9730-f17c9feafaeb"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.795923 4898 scope.go:117] "RemoveContainer" containerID="fbf266f2bbef4b4dd4d9d82590e547f43077a49d8aea2c8e5166465ede160624" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.810012 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 13 14:22:58 crc kubenswrapper[4898]: E0313 14:22:58.810553 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b38f3681-6f2f-437f-9694-810d43921aa2" containerName="registry-server" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.810565 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b38f3681-6f2f-437f-9694-810d43921aa2" containerName="registry-server" Mar 13 14:22:58 crc kubenswrapper[4898]: E0313 14:22:58.810587 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eba407c-68a5-45e9-ab51-e8cba05d8559" containerName="nova-cell0-conductor-db-sync" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.810596 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eba407c-68a5-45e9-ab51-e8cba05d8559" containerName="nova-cell0-conductor-db-sync" Mar 13 14:22:58 crc kubenswrapper[4898]: E0313 14:22:58.810612 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b38f3681-6f2f-437f-9694-810d43921aa2" containerName="extract-utilities" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.810619 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b38f3681-6f2f-437f-9694-810d43921aa2" containerName="extract-utilities" Mar 13 14:22:58 crc kubenswrapper[4898]: E0313 14:22:58.810635 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37ab1f60-9ee0-4d70-9730-f17c9feafaeb" containerName="ceilometer-central-agent" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.810641 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="37ab1f60-9ee0-4d70-9730-f17c9feafaeb" 
containerName="ceilometer-central-agent" Mar 13 14:22:58 crc kubenswrapper[4898]: E0313 14:22:58.810657 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37ab1f60-9ee0-4d70-9730-f17c9feafaeb" containerName="sg-core" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.810664 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="37ab1f60-9ee0-4d70-9730-f17c9feafaeb" containerName="sg-core" Mar 13 14:22:58 crc kubenswrapper[4898]: E0313 14:22:58.810677 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37ab1f60-9ee0-4d70-9730-f17c9feafaeb" containerName="ceilometer-notification-agent" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.810682 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="37ab1f60-9ee0-4d70-9730-f17c9feafaeb" containerName="ceilometer-notification-agent" Mar 13 14:22:58 crc kubenswrapper[4898]: E0313 14:22:58.810695 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b38f3681-6f2f-437f-9694-810d43921aa2" containerName="extract-content" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.810701 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b38f3681-6f2f-437f-9694-810d43921aa2" containerName="extract-content" Mar 13 14:22:58 crc kubenswrapper[4898]: E0313 14:22:58.810722 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37ab1f60-9ee0-4d70-9730-f17c9feafaeb" containerName="proxy-httpd" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.810727 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="37ab1f60-9ee0-4d70-9730-f17c9feafaeb" containerName="proxy-httpd" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.810951 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="37ab1f60-9ee0-4d70-9730-f17c9feafaeb" containerName="proxy-httpd" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.810974 4898 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="37ab1f60-9ee0-4d70-9730-f17c9feafaeb" containerName="sg-core" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.810980 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="37ab1f60-9ee0-4d70-9730-f17c9feafaeb" containerName="ceilometer-notification-agent" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.810986 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="37ab1f60-9ee0-4d70-9730-f17c9feafaeb" containerName="ceilometer-central-agent" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.810997 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b38f3681-6f2f-437f-9694-810d43921aa2" containerName="registry-server" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.811013 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="7eba407c-68a5-45e9-ab51-e8cba05d8559" containerName="nova-cell0-conductor-db-sync" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.811839 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.814782 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.815061 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-42bds" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.821920 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.832157 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.832181 4898 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.832191 4898 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.832199 4898 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.832207 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-674qw\" (UniqueName: \"kubernetes.io/projected/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-kube-api-access-674qw\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.842544 4898 scope.go:117] "RemoveContainer" 
containerID="322d94f85c75d789289c53ccffadace35b38b23785b40dc3303f2046c979a20d" Mar 13 14:22:58 crc kubenswrapper[4898]: E0313 14:22:58.843229 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"322d94f85c75d789289c53ccffadace35b38b23785b40dc3303f2046c979a20d\": container with ID starting with 322d94f85c75d789289c53ccffadace35b38b23785b40dc3303f2046c979a20d not found: ID does not exist" containerID="322d94f85c75d789289c53ccffadace35b38b23785b40dc3303f2046c979a20d" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.843275 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"322d94f85c75d789289c53ccffadace35b38b23785b40dc3303f2046c979a20d"} err="failed to get container status \"322d94f85c75d789289c53ccffadace35b38b23785b40dc3303f2046c979a20d\": rpc error: code = NotFound desc = could not find container \"322d94f85c75d789289c53ccffadace35b38b23785b40dc3303f2046c979a20d\": container with ID starting with 322d94f85c75d789289c53ccffadace35b38b23785b40dc3303f2046c979a20d not found: ID does not exist" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.843301 4898 scope.go:117] "RemoveContainer" containerID="3952d121d2f30ecb7494684374560553b6245f9837c68c02b65c84ff2b6971de" Mar 13 14:22:58 crc kubenswrapper[4898]: E0313 14:22:58.843525 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3952d121d2f30ecb7494684374560553b6245f9837c68c02b65c84ff2b6971de\": container with ID starting with 3952d121d2f30ecb7494684374560553b6245f9837c68c02b65c84ff2b6971de not found: ID does not exist" containerID="3952d121d2f30ecb7494684374560553b6245f9837c68c02b65c84ff2b6971de" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.843548 4898 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3952d121d2f30ecb7494684374560553b6245f9837c68c02b65c84ff2b6971de"} err="failed to get container status \"3952d121d2f30ecb7494684374560553b6245f9837c68c02b65c84ff2b6971de\": rpc error: code = NotFound desc = could not find container \"3952d121d2f30ecb7494684374560553b6245f9837c68c02b65c84ff2b6971de\": container with ID starting with 3952d121d2f30ecb7494684374560553b6245f9837c68c02b65c84ff2b6971de not found: ID does not exist" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.843561 4898 scope.go:117] "RemoveContainer" containerID="b8d3215be81e811287d32ec5a152d8a9105ea0bd86eb83c702bc5a2d79713e00" Mar 13 14:22:58 crc kubenswrapper[4898]: E0313 14:22:58.843735 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8d3215be81e811287d32ec5a152d8a9105ea0bd86eb83c702bc5a2d79713e00\": container with ID starting with b8d3215be81e811287d32ec5a152d8a9105ea0bd86eb83c702bc5a2d79713e00 not found: ID does not exist" containerID="b8d3215be81e811287d32ec5a152d8a9105ea0bd86eb83c702bc5a2d79713e00" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.843759 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8d3215be81e811287d32ec5a152d8a9105ea0bd86eb83c702bc5a2d79713e00"} err="failed to get container status \"b8d3215be81e811287d32ec5a152d8a9105ea0bd86eb83c702bc5a2d79713e00\": rpc error: code = NotFound desc = could not find container \"b8d3215be81e811287d32ec5a152d8a9105ea0bd86eb83c702bc5a2d79713e00\": container with ID starting with b8d3215be81e811287d32ec5a152d8a9105ea0bd86eb83c702bc5a2d79713e00 not found: ID does not exist" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.843777 4898 scope.go:117] "RemoveContainer" containerID="fbf266f2bbef4b4dd4d9d82590e547f43077a49d8aea2c8e5166465ede160624" Mar 13 14:22:58 crc kubenswrapper[4898]: E0313 14:22:58.844009 4898 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"fbf266f2bbef4b4dd4d9d82590e547f43077a49d8aea2c8e5166465ede160624\": container with ID starting with fbf266f2bbef4b4dd4d9d82590e547f43077a49d8aea2c8e5166465ede160624 not found: ID does not exist" containerID="fbf266f2bbef4b4dd4d9d82590e547f43077a49d8aea2c8e5166465ede160624" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.844034 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbf266f2bbef4b4dd4d9d82590e547f43077a49d8aea2c8e5166465ede160624"} err="failed to get container status \"fbf266f2bbef4b4dd4d9d82590e547f43077a49d8aea2c8e5166465ede160624\": rpc error: code = NotFound desc = could not find container \"fbf266f2bbef4b4dd4d9d82590e547f43077a49d8aea2c8e5166465ede160624\": container with ID starting with fbf266f2bbef4b4dd4d9d82590e547f43077a49d8aea2c8e5166465ede160624 not found: ID does not exist" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.848976 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37ab1f60-9ee0-4d70-9730-f17c9feafaeb" (UID: "37ab1f60-9ee0-4d70-9730-f17c9feafaeb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.894640 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-config-data" (OuterVolumeSpecName: "config-data") pod "37ab1f60-9ee0-4d70-9730-f17c9feafaeb" (UID: "37ab1f60-9ee0-4d70-9730-f17c9feafaeb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.933622 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbnf9\" (UniqueName: \"kubernetes.io/projected/9796fb40-37f0-4d8a-929f-4bb6295388a4-kube-api-access-tbnf9\") pod \"nova-cell0-conductor-0\" (UID: \"9796fb40-37f0-4d8a-929f-4bb6295388a4\") " pod="openstack/nova-cell0-conductor-0" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.933823 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9796fb40-37f0-4d8a-929f-4bb6295388a4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9796fb40-37f0-4d8a-929f-4bb6295388a4\") " pod="openstack/nova-cell0-conductor-0" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.933859 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9796fb40-37f0-4d8a-929f-4bb6295388a4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9796fb40-37f0-4d8a-929f-4bb6295388a4\") " pod="openstack/nova-cell0-conductor-0" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.934439 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:58 crc kubenswrapper[4898]: I0313 14:22:58.934524 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37ab1f60-9ee0-4d70-9730-f17c9feafaeb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.027913 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.037014 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbnf9\" (UniqueName: \"kubernetes.io/projected/9796fb40-37f0-4d8a-929f-4bb6295388a4-kube-api-access-tbnf9\") pod \"nova-cell0-conductor-0\" (UID: \"9796fb40-37f0-4d8a-929f-4bb6295388a4\") " pod="openstack/nova-cell0-conductor-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.037217 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9796fb40-37f0-4d8a-929f-4bb6295388a4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9796fb40-37f0-4d8a-929f-4bb6295388a4\") " pod="openstack/nova-cell0-conductor-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.037248 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9796fb40-37f0-4d8a-929f-4bb6295388a4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9796fb40-37f0-4d8a-929f-4bb6295388a4\") " pod="openstack/nova-cell0-conductor-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.038802 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.049719 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9796fb40-37f0-4d8a-929f-4bb6295388a4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9796fb40-37f0-4d8a-929f-4bb6295388a4\") " pod="openstack/nova-cell0-conductor-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.049841 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9796fb40-37f0-4d8a-929f-4bb6295388a4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9796fb40-37f0-4d8a-929f-4bb6295388a4\") " pod="openstack/nova-cell0-conductor-0" Mar 13 14:22:59 crc kubenswrapper[4898]: 
I0313 14:22:59.059580 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.062617 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.064963 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.068448 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.070683 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbnf9\" (UniqueName: \"kubernetes.io/projected/9796fb40-37f0-4d8a-929f-4bb6295388a4-kube-api-access-tbnf9\") pod \"nova-cell0-conductor-0\" (UID: \"9796fb40-37f0-4d8a-929f-4bb6295388a4\") " pod="openstack/nova-cell0-conductor-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.103195 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.103295 4898 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.106199 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.144340 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.158176 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.276356 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.276706 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsfjl\" (UniqueName: \"kubernetes.io/projected/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-kube-api-access-bsfjl\") pod \"ceilometer-0\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.277017 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-scripts\") pod \"ceilometer-0\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.291139 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.291264 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-run-httpd\") pod 
\"ceilometer-0\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.291310 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-config-data\") pod \"ceilometer-0\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.292873 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-log-httpd\") pod \"ceilometer-0\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.398829 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.399252 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsfjl\" (UniqueName: \"kubernetes.io/projected/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-kube-api-access-bsfjl\") pod \"ceilometer-0\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.399291 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-scripts\") pod \"ceilometer-0\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.399348 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.399380 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-run-httpd\") pod \"ceilometer-0\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.399400 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-config-data\") pod \"ceilometer-0\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.399426 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-log-httpd\") pod \"ceilometer-0\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.399959 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-log-httpd\") pod \"ceilometer-0\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.410240 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " pod="openstack/ceilometer-0" 
Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.410761 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-run-httpd\") pod \"ceilometer-0\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.412910 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-scripts\") pod \"ceilometer-0\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.421795 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-config-data\") pod \"ceilometer-0\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.429098 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsfjl\" (UniqueName: \"kubernetes.io/projected/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-kube-api-access-bsfjl\") pod \"ceilometer-0\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.436563 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.641292 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.699852 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 13 14:22:59 crc kubenswrapper[4898]: W0313 14:22:59.710656 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9796fb40_37f0_4d8a_929f_4bb6295388a4.slice/crio-ee07bf8d012fcf6aefa9d877ab61e51a9ab17008f8e280eddbbc403dcc107dba WatchSource:0}: Error finding container ee07bf8d012fcf6aefa9d877ab61e51a9ab17008f8e280eddbbc403dcc107dba: Status 404 returned error can't find the container with id ee07bf8d012fcf6aefa9d877ab61e51a9ab17008f8e280eddbbc403dcc107dba Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.715276 4898 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.715300 4898 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.758586 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37ab1f60-9ee0-4d70-9730-f17c9feafaeb" path="/var/lib/kubelet/pods/37ab1f60-9ee0-4d70-9730-f17c9feafaeb/volumes" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.934246 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sgxj9" Mar 13 14:22:59 crc kubenswrapper[4898]: I0313 14:22:59.996341 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sgxj9"] Mar 13 14:23:00 crc kubenswrapper[4898]: I0313 14:23:00.129928 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:23:00 crc kubenswrapper[4898]: W0313 14:23:00.138376 4898 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode179d9f8_0775_4bde_9ac9_f5a4f6919fd6.slice/crio-5815a6d36d69a2209375bea643fadce77792eface8bf7ea1c4a47338f6cf2f29 WatchSource:0}: Error finding container 5815a6d36d69a2209375bea643fadce77792eface8bf7ea1c4a47338f6cf2f29: Status 404 returned error can't find the container with id 5815a6d36d69a2209375bea643fadce77792eface8bf7ea1c4a47338f6cf2f29 Mar 13 14:23:00 crc kubenswrapper[4898]: I0313 14:23:00.195792 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 13 14:23:00 crc kubenswrapper[4898]: I0313 14:23:00.197763 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 13 14:23:00 crc kubenswrapper[4898]: I0313 14:23:00.727518 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6","Type":"ContainerStarted","Data":"5815a6d36d69a2209375bea643fadce77792eface8bf7ea1c4a47338f6cf2f29"} Mar 13 14:23:00 crc kubenswrapper[4898]: I0313 14:23:00.729524 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9796fb40-37f0-4d8a-929f-4bb6295388a4","Type":"ContainerStarted","Data":"a9557891418a97a02368405ccaf040fbd8afdac6c0e21aa1dbdad6d66c32db14"} Mar 13 14:23:00 crc kubenswrapper[4898]: I0313 14:23:00.729600 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9796fb40-37f0-4d8a-929f-4bb6295388a4","Type":"ContainerStarted","Data":"ee07bf8d012fcf6aefa9d877ab61e51a9ab17008f8e280eddbbc403dcc107dba"} Mar 13 14:23:00 crc kubenswrapper[4898]: I0313 14:23:00.730109 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sgxj9" podUID="df5a6baa-ea65-4b79-b73b-2e1707695c41" containerName="registry-server" 
containerID="cri-o://6ec8603c187d66f48cc10e4727d771d9b7fbf021f18024c4db05bf6b399eed38" gracePeriod=2 Mar 13 14:23:00 crc kubenswrapper[4898]: I0313 14:23:00.771331 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.771314816 podStartE2EDuration="2.771314816s" podCreationTimestamp="2026-03-13 14:22:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:23:00.759741915 +0000 UTC m=+1615.761330164" watchObservedRunningTime="2026-03-13 14:23:00.771314816 +0000 UTC m=+1615.772903045" Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.332225 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sgxj9" Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.461287 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8t8zq\" (UniqueName: \"kubernetes.io/projected/df5a6baa-ea65-4b79-b73b-2e1707695c41-kube-api-access-8t8zq\") pod \"df5a6baa-ea65-4b79-b73b-2e1707695c41\" (UID: \"df5a6baa-ea65-4b79-b73b-2e1707695c41\") " Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.461703 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df5a6baa-ea65-4b79-b73b-2e1707695c41-utilities\") pod \"df5a6baa-ea65-4b79-b73b-2e1707695c41\" (UID: \"df5a6baa-ea65-4b79-b73b-2e1707695c41\") " Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.461753 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df5a6baa-ea65-4b79-b73b-2e1707695c41-catalog-content\") pod \"df5a6baa-ea65-4b79-b73b-2e1707695c41\" (UID: \"df5a6baa-ea65-4b79-b73b-2e1707695c41\") " Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.463016 4898 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df5a6baa-ea65-4b79-b73b-2e1707695c41-utilities" (OuterVolumeSpecName: "utilities") pod "df5a6baa-ea65-4b79-b73b-2e1707695c41" (UID: "df5a6baa-ea65-4b79-b73b-2e1707695c41"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.491267 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df5a6baa-ea65-4b79-b73b-2e1707695c41-kube-api-access-8t8zq" (OuterVolumeSpecName: "kube-api-access-8t8zq") pod "df5a6baa-ea65-4b79-b73b-2e1707695c41" (UID: "df5a6baa-ea65-4b79-b73b-2e1707695c41"). InnerVolumeSpecName "kube-api-access-8t8zq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.513726 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df5a6baa-ea65-4b79-b73b-2e1707695c41-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "df5a6baa-ea65-4b79-b73b-2e1707695c41" (UID: "df5a6baa-ea65-4b79-b73b-2e1707695c41"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.564582 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8t8zq\" (UniqueName: \"kubernetes.io/projected/df5a6baa-ea65-4b79-b73b-2e1707695c41-kube-api-access-8t8zq\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.564628 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df5a6baa-ea65-4b79-b73b-2e1707695c41-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.564642 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df5a6baa-ea65-4b79-b73b-2e1707695c41-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.759595 4898 generic.go:334] "Generic (PLEG): container finished" podID="df5a6baa-ea65-4b79-b73b-2e1707695c41" containerID="6ec8603c187d66f48cc10e4727d771d9b7fbf021f18024c4db05bf6b399eed38" exitCode=0 Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.760636 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sgxj9" Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.773114 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.773145 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6","Type":"ContainerStarted","Data":"4bb36715efc02b13f9a2428d735fb39dbbe5a666eac1302af323de3923739e47"} Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.773163 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sgxj9" event={"ID":"df5a6baa-ea65-4b79-b73b-2e1707695c41","Type":"ContainerDied","Data":"6ec8603c187d66f48cc10e4727d771d9b7fbf021f18024c4db05bf6b399eed38"} Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.773182 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sgxj9" event={"ID":"df5a6baa-ea65-4b79-b73b-2e1707695c41","Type":"ContainerDied","Data":"c4f29c1c84db31109e4a65fbd939345ced5fa3e54cbf69234fc3c9374e91653f"} Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.773199 4898 scope.go:117] "RemoveContainer" containerID="6ec8603c187d66f48cc10e4727d771d9b7fbf021f18024c4db05bf6b399eed38" Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.835255 4898 scope.go:117] "RemoveContainer" containerID="dbe94dc617bb9f1ceb20f3a1398ada0e60107b89371dd59bf5d3b153a002dfad" Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.841489 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sgxj9"] Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.851841 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sgxj9"] Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.870512 4898 scope.go:117] "RemoveContainer" 
containerID="d9479d77fd8402baf87c3755eb73b852dad59c425ece2529270d3dc4ba5a1606" Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.890365 4898 scope.go:117] "RemoveContainer" containerID="6ec8603c187d66f48cc10e4727d771d9b7fbf021f18024c4db05bf6b399eed38" Mar 13 14:23:01 crc kubenswrapper[4898]: E0313 14:23:01.891937 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ec8603c187d66f48cc10e4727d771d9b7fbf021f18024c4db05bf6b399eed38\": container with ID starting with 6ec8603c187d66f48cc10e4727d771d9b7fbf021f18024c4db05bf6b399eed38 not found: ID does not exist" containerID="6ec8603c187d66f48cc10e4727d771d9b7fbf021f18024c4db05bf6b399eed38" Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.891972 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ec8603c187d66f48cc10e4727d771d9b7fbf021f18024c4db05bf6b399eed38"} err="failed to get container status \"6ec8603c187d66f48cc10e4727d771d9b7fbf021f18024c4db05bf6b399eed38\": rpc error: code = NotFound desc = could not find container \"6ec8603c187d66f48cc10e4727d771d9b7fbf021f18024c4db05bf6b399eed38\": container with ID starting with 6ec8603c187d66f48cc10e4727d771d9b7fbf021f18024c4db05bf6b399eed38 not found: ID does not exist" Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.891997 4898 scope.go:117] "RemoveContainer" containerID="dbe94dc617bb9f1ceb20f3a1398ada0e60107b89371dd59bf5d3b153a002dfad" Mar 13 14:23:01 crc kubenswrapper[4898]: E0313 14:23:01.892335 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbe94dc617bb9f1ceb20f3a1398ada0e60107b89371dd59bf5d3b153a002dfad\": container with ID starting with dbe94dc617bb9f1ceb20f3a1398ada0e60107b89371dd59bf5d3b153a002dfad not found: ID does not exist" containerID="dbe94dc617bb9f1ceb20f3a1398ada0e60107b89371dd59bf5d3b153a002dfad" Mar 13 14:23:01 crc 
kubenswrapper[4898]: I0313 14:23:01.892357 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbe94dc617bb9f1ceb20f3a1398ada0e60107b89371dd59bf5d3b153a002dfad"} err="failed to get container status \"dbe94dc617bb9f1ceb20f3a1398ada0e60107b89371dd59bf5d3b153a002dfad\": rpc error: code = NotFound desc = could not find container \"dbe94dc617bb9f1ceb20f3a1398ada0e60107b89371dd59bf5d3b153a002dfad\": container with ID starting with dbe94dc617bb9f1ceb20f3a1398ada0e60107b89371dd59bf5d3b153a002dfad not found: ID does not exist" Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.892370 4898 scope.go:117] "RemoveContainer" containerID="d9479d77fd8402baf87c3755eb73b852dad59c425ece2529270d3dc4ba5a1606" Mar 13 14:23:01 crc kubenswrapper[4898]: E0313 14:23:01.892635 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9479d77fd8402baf87c3755eb73b852dad59c425ece2529270d3dc4ba5a1606\": container with ID starting with d9479d77fd8402baf87c3755eb73b852dad59c425ece2529270d3dc4ba5a1606 not found: ID does not exist" containerID="d9479d77fd8402baf87c3755eb73b852dad59c425ece2529270d3dc4ba5a1606" Mar 13 14:23:01 crc kubenswrapper[4898]: I0313 14:23:01.892656 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9479d77fd8402baf87c3755eb73b852dad59c425ece2529270d3dc4ba5a1606"} err="failed to get container status \"d9479d77fd8402baf87c3755eb73b852dad59c425ece2529270d3dc4ba5a1606\": rpc error: code = NotFound desc = could not find container \"d9479d77fd8402baf87c3755eb73b852dad59c425ece2529270d3dc4ba5a1606\": container with ID starting with d9479d77fd8402baf87c3755eb73b852dad59c425ece2529270d3dc4ba5a1606 not found: ID does not exist" Mar 13 14:23:02 crc kubenswrapper[4898]: I0313 14:23:02.785989 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6","Type":"ContainerStarted","Data":"78de73e116b8c904046e3b2e47974a2196e68f801e439ea9db14e62605f5dfde"} Mar 13 14:23:02 crc kubenswrapper[4898]: I0313 14:23:02.786279 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6","Type":"ContainerStarted","Data":"e7eb04ad73d23fba07486739b6ad6f87229d6c54e9915ba0f9d02034ddc99c49"} Mar 13 14:23:03 crc kubenswrapper[4898]: I0313 14:23:03.775790 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df5a6baa-ea65-4b79-b73b-2e1707695c41" path="/var/lib/kubelet/pods/df5a6baa-ea65-4b79-b73b-2e1707695c41/volumes" Mar 13 14:23:04 crc kubenswrapper[4898]: I0313 14:23:04.184869 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 13 14:23:04 crc kubenswrapper[4898]: I0313 14:23:04.847699 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-76lrv"] Mar 13 14:23:04 crc kubenswrapper[4898]: E0313 14:23:04.849086 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df5a6baa-ea65-4b79-b73b-2e1707695c41" containerName="extract-utilities" Mar 13 14:23:04 crc kubenswrapper[4898]: I0313 14:23:04.849111 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="df5a6baa-ea65-4b79-b73b-2e1707695c41" containerName="extract-utilities" Mar 13 14:23:04 crc kubenswrapper[4898]: E0313 14:23:04.849165 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df5a6baa-ea65-4b79-b73b-2e1707695c41" containerName="extract-content" Mar 13 14:23:04 crc kubenswrapper[4898]: I0313 14:23:04.849174 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="df5a6baa-ea65-4b79-b73b-2e1707695c41" containerName="extract-content" Mar 13 14:23:04 crc kubenswrapper[4898]: E0313 14:23:04.849207 4898 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="df5a6baa-ea65-4b79-b73b-2e1707695c41" containerName="registry-server" Mar 13 14:23:04 crc kubenswrapper[4898]: I0313 14:23:04.849215 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="df5a6baa-ea65-4b79-b73b-2e1707695c41" containerName="registry-server" Mar 13 14:23:04 crc kubenswrapper[4898]: I0313 14:23:04.849512 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="df5a6baa-ea65-4b79-b73b-2e1707695c41" containerName="registry-server" Mar 13 14:23:04 crc kubenswrapper[4898]: I0313 14:23:04.850559 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-76lrv" Mar 13 14:23:04 crc kubenswrapper[4898]: I0313 14:23:04.853938 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 13 14:23:04 crc kubenswrapper[4898]: I0313 14:23:04.860801 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 13 14:23:04 crc kubenswrapper[4898]: I0313 14:23:04.866829 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-76lrv"] Mar 13 14:23:04 crc kubenswrapper[4898]: I0313 14:23:04.962958 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04183e35-79b0-4c76-b538-b5b71299cd92-config-data\") pod \"nova-cell0-cell-mapping-76lrv\" (UID: \"04183e35-79b0-4c76-b538-b5b71299cd92\") " pod="openstack/nova-cell0-cell-mapping-76lrv" Mar 13 14:23:04 crc kubenswrapper[4898]: I0313 14:23:04.963084 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04183e35-79b0-4c76-b538-b5b71299cd92-scripts\") pod \"nova-cell0-cell-mapping-76lrv\" (UID: \"04183e35-79b0-4c76-b538-b5b71299cd92\") " pod="openstack/nova-cell0-cell-mapping-76lrv" Mar 13 14:23:04 crc 
kubenswrapper[4898]: I0313 14:23:04.963130 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04183e35-79b0-4c76-b538-b5b71299cd92-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-76lrv\" (UID: \"04183e35-79b0-4c76-b538-b5b71299cd92\") " pod="openstack/nova-cell0-cell-mapping-76lrv" Mar 13 14:23:04 crc kubenswrapper[4898]: I0313 14:23:04.963257 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92r4m\" (UniqueName: \"kubernetes.io/projected/04183e35-79b0-4c76-b538-b5b71299cd92-kube-api-access-92r4m\") pod \"nova-cell0-cell-mapping-76lrv\" (UID: \"04183e35-79b0-4c76-b538-b5b71299cd92\") " pod="openstack/nova-cell0-cell-mapping-76lrv" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.029775 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.037176 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.046375 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.067139 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92r4m\" (UniqueName: \"kubernetes.io/projected/04183e35-79b0-4c76-b538-b5b71299cd92-kube-api-access-92r4m\") pod \"nova-cell0-cell-mapping-76lrv\" (UID: \"04183e35-79b0-4c76-b538-b5b71299cd92\") " pod="openstack/nova-cell0-cell-mapping-76lrv" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.069480 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04183e35-79b0-4c76-b538-b5b71299cd92-config-data\") pod \"nova-cell0-cell-mapping-76lrv\" (UID: \"04183e35-79b0-4c76-b538-b5b71299cd92\") " pod="openstack/nova-cell0-cell-mapping-76lrv" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.069778 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04183e35-79b0-4c76-b538-b5b71299cd92-scripts\") pod \"nova-cell0-cell-mapping-76lrv\" (UID: \"04183e35-79b0-4c76-b538-b5b71299cd92\") " pod="openstack/nova-cell0-cell-mapping-76lrv" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.075749 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04183e35-79b0-4c76-b538-b5b71299cd92-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-76lrv\" (UID: \"04183e35-79b0-4c76-b538-b5b71299cd92\") " pod="openstack/nova-cell0-cell-mapping-76lrv" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.081571 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/04183e35-79b0-4c76-b538-b5b71299cd92-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-76lrv\" (UID: \"04183e35-79b0-4c76-b538-b5b71299cd92\") " pod="openstack/nova-cell0-cell-mapping-76lrv" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.078752 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.127844 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92r4m\" (UniqueName: \"kubernetes.io/projected/04183e35-79b0-4c76-b538-b5b71299cd92-kube-api-access-92r4m\") pod \"nova-cell0-cell-mapping-76lrv\" (UID: \"04183e35-79b0-4c76-b538-b5b71299cd92\") " pod="openstack/nova-cell0-cell-mapping-76lrv" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.147639 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04183e35-79b0-4c76-b538-b5b71299cd92-config-data\") pod \"nova-cell0-cell-mapping-76lrv\" (UID: \"04183e35-79b0-4c76-b538-b5b71299cd92\") " pod="openstack/nova-cell0-cell-mapping-76lrv" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.161470 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04183e35-79b0-4c76-b538-b5b71299cd92-scripts\") pod \"nova-cell0-cell-mapping-76lrv\" (UID: \"04183e35-79b0-4c76-b538-b5b71299cd92\") " pod="openstack/nova-cell0-cell-mapping-76lrv" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.174518 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.184218 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.192102 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.200245 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5edbf12d-a655-4822-98da-9719c131fa14-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5edbf12d-a655-4822-98da-9719c131fa14\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.201011 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4l9h\" (UniqueName: \"kubernetes.io/projected/5edbf12d-a655-4822-98da-9719c131fa14-kube-api-access-p4l9h\") pod \"nova-cell1-novncproxy-0\" (UID: \"5edbf12d-a655-4822-98da-9719c131fa14\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.201064 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5edbf12d-a655-4822-98da-9719c131fa14-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5edbf12d-a655-4822-98da-9719c131fa14\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.211854 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-76lrv" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.414603 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fb29588-10df-4b49-a6a6-6a83ddada750-config-data\") pod \"nova-metadata-0\" (UID: \"3fb29588-10df-4b49-a6a6-6a83ddada750\") " pod="openstack/nova-metadata-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.414744 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fb29588-10df-4b49-a6a6-6a83ddada750-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3fb29588-10df-4b49-a6a6-6a83ddada750\") " pod="openstack/nova-metadata-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.415055 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4l9h\" (UniqueName: \"kubernetes.io/projected/5edbf12d-a655-4822-98da-9719c131fa14-kube-api-access-p4l9h\") pod \"nova-cell1-novncproxy-0\" (UID: \"5edbf12d-a655-4822-98da-9719c131fa14\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.415096 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3fb29588-10df-4b49-a6a6-6a83ddada750-logs\") pod \"nova-metadata-0\" (UID: \"3fb29588-10df-4b49-a6a6-6a83ddada750\") " pod="openstack/nova-metadata-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.415135 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5edbf12d-a655-4822-98da-9719c131fa14-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5edbf12d-a655-4822-98da-9719c131fa14\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:05 crc 
kubenswrapper[4898]: I0313 14:23:05.415267 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd5pp\" (UniqueName: \"kubernetes.io/projected/3fb29588-10df-4b49-a6a6-6a83ddada750-kube-api-access-dd5pp\") pod \"nova-metadata-0\" (UID: \"3fb29588-10df-4b49-a6a6-6a83ddada750\") " pod="openstack/nova-metadata-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.415750 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5edbf12d-a655-4822-98da-9719c131fa14-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5edbf12d-a655-4822-98da-9719c131fa14\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.430667 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5edbf12d-a655-4822-98da-9719c131fa14-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"5edbf12d-a655-4822-98da-9719c131fa14\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.448815 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.450808 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.460839 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.480069 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5edbf12d-a655-4822-98da-9719c131fa14-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"5edbf12d-a655-4822-98da-9719c131fa14\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.488523 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4l9h\" (UniqueName: \"kubernetes.io/projected/5edbf12d-a655-4822-98da-9719c131fa14-kube-api-access-p4l9h\") pod \"nova-cell1-novncproxy-0\" (UID: \"5edbf12d-a655-4822-98da-9719c131fa14\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.526030 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fb29588-10df-4b49-a6a6-6a83ddada750-config-data\") pod \"nova-metadata-0\" (UID: \"3fb29588-10df-4b49-a6a6-6a83ddada750\") " pod="openstack/nova-metadata-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.526271 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0769b03d-29b4-4519-abc7-408431328276-config-data\") pod \"nova-scheduler-0\" (UID: \"0769b03d-29b4-4519-abc7-408431328276\") " pod="openstack/nova-scheduler-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.526358 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fb29588-10df-4b49-a6a6-6a83ddada750-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"3fb29588-10df-4b49-a6a6-6a83ddada750\") " pod="openstack/nova-metadata-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.526487 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3fb29588-10df-4b49-a6a6-6a83ddada750-logs\") pod \"nova-metadata-0\" (UID: \"3fb29588-10df-4b49-a6a6-6a83ddada750\") " pod="openstack/nova-metadata-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.526630 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2dlp\" (UniqueName: \"kubernetes.io/projected/0769b03d-29b4-4519-abc7-408431328276-kube-api-access-k2dlp\") pod \"nova-scheduler-0\" (UID: \"0769b03d-29b4-4519-abc7-408431328276\") " pod="openstack/nova-scheduler-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.526741 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd5pp\" (UniqueName: \"kubernetes.io/projected/3fb29588-10df-4b49-a6a6-6a83ddada750-kube-api-access-dd5pp\") pod \"nova-metadata-0\" (UID: \"3fb29588-10df-4b49-a6a6-6a83ddada750\") " pod="openstack/nova-metadata-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.526874 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0769b03d-29b4-4519-abc7-408431328276-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0769b03d-29b4-4519-abc7-408431328276\") " pod="openstack/nova-scheduler-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.527020 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3fb29588-10df-4b49-a6a6-6a83ddada750-logs\") pod \"nova-metadata-0\" (UID: \"3fb29588-10df-4b49-a6a6-6a83ddada750\") " pod="openstack/nova-metadata-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.533729 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fb29588-10df-4b49-a6a6-6a83ddada750-config-data\") pod \"nova-metadata-0\" (UID: \"3fb29588-10df-4b49-a6a6-6a83ddada750\") " pod="openstack/nova-metadata-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.540087 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fb29588-10df-4b49-a6a6-6a83ddada750-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3fb29588-10df-4b49-a6a6-6a83ddada750\") " pod="openstack/nova-metadata-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.545661 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.553249 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd5pp\" (UniqueName: \"kubernetes.io/projected/3fb29588-10df-4b49-a6a6-6a83ddada750-kube-api-access-dd5pp\") pod \"nova-metadata-0\" (UID: \"3fb29588-10df-4b49-a6a6-6a83ddada750\") " pod="openstack/nova-metadata-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.579802 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.616329 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-4rc67"] Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.621792 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.629974 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-4rc67"] Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.630759 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.631777 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2dlp\" (UniqueName: \"kubernetes.io/projected/0769b03d-29b4-4519-abc7-408431328276-kube-api-access-k2dlp\") pod \"nova-scheduler-0\" (UID: \"0769b03d-29b4-4519-abc7-408431328276\") " pod="openstack/nova-scheduler-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.632207 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0769b03d-29b4-4519-abc7-408431328276-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0769b03d-29b4-4519-abc7-408431328276\") " pod="openstack/nova-scheduler-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.632365 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0769b03d-29b4-4519-abc7-408431328276-config-data\") pod \"nova-scheduler-0\" (UID: \"0769b03d-29b4-4519-abc7-408431328276\") " pod="openstack/nova-scheduler-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.637238 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0769b03d-29b4-4519-abc7-408431328276-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0769b03d-29b4-4519-abc7-408431328276\") " pod="openstack/nova-scheduler-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.660791 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0769b03d-29b4-4519-abc7-408431328276-config-data\") pod \"nova-scheduler-0\" (UID: \"0769b03d-29b4-4519-abc7-408431328276\") " pod="openstack/nova-scheduler-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.663284 4898 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-k2dlp\" (UniqueName: \"kubernetes.io/projected/0769b03d-29b4-4519-abc7-408431328276-kube-api-access-k2dlp\") pod \"nova-scheduler-0\" (UID: \"0769b03d-29b4-4519-abc7-408431328276\") " pod="openstack/nova-scheduler-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.668144 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.670480 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.681124 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.689432 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.724622 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.737143 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-dns-svc\") pod \"dnsmasq-dns-9b86998b5-4rc67\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") " pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.740070 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-4rc67\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") " pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.740331 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-4rc67\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") " pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.740406 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqwkf\" (UniqueName: \"kubernetes.io/projected/ee82d4ec-b565-40b8-b878-2574487d7e9d-kube-api-access-rqwkf\") pod \"dnsmasq-dns-9b86998b5-4rc67\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") " pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.740592 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-4rc67\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") " pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.740806 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-config\") pod \"dnsmasq-dns-9b86998b5-4rc67\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") " pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.801825 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.845916 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-dns-svc\") pod \"dnsmasq-dns-9b86998b5-4rc67\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") " pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.846421 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/401e6738-93d7-40d4-867e-8c68437cbad3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"401e6738-93d7-40d4-867e-8c68437cbad3\") " pod="openstack/nova-api-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.846501 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/401e6738-93d7-40d4-867e-8c68437cbad3-config-data\") pod \"nova-api-0\" (UID: \"401e6738-93d7-40d4-867e-8c68437cbad3\") " pod="openstack/nova-api-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.846528 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-4rc67\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") " pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.846567 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/401e6738-93d7-40d4-867e-8c68437cbad3-logs\") pod \"nova-api-0\" (UID: \"401e6738-93d7-40d4-867e-8c68437cbad3\") " pod="openstack/nova-api-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.846631 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-4rc67\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") " pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.846655 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqwkf\" (UniqueName: \"kubernetes.io/projected/ee82d4ec-b565-40b8-b878-2574487d7e9d-kube-api-access-rqwkf\") pod \"dnsmasq-dns-9b86998b5-4rc67\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") " pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.846744 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-4rc67\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") " pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.846809 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z75l\" (UniqueName: \"kubernetes.io/projected/401e6738-93d7-40d4-867e-8c68437cbad3-kube-api-access-2z75l\") pod \"nova-api-0\" (UID: \"401e6738-93d7-40d4-867e-8c68437cbad3\") " pod="openstack/nova-api-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.846881 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-config\") pod \"dnsmasq-dns-9b86998b5-4rc67\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") " pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.853662 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-dns-svc\") pod \"dnsmasq-dns-9b86998b5-4rc67\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") " pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.854196 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-4rc67\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") " pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.856826 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-4rc67\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") " pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.856859 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-4rc67\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") " pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.857380 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-config\") pod \"dnsmasq-dns-9b86998b5-4rc67\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") " pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.897163 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqwkf\" (UniqueName: 
\"kubernetes.io/projected/ee82d4ec-b565-40b8-b878-2574487d7e9d-kube-api-access-rqwkf\") pod \"dnsmasq-dns-9b86998b5-4rc67\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") " pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.905160 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6","Type":"ContainerStarted","Data":"e1d215889c17e15d0414497f1d056c6cbb6a9ad029cf99e79cd2f78d36843f9a"} Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.905773 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.963635 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z75l\" (UniqueName: \"kubernetes.io/projected/401e6738-93d7-40d4-867e-8c68437cbad3-kube-api-access-2z75l\") pod \"nova-api-0\" (UID: \"401e6738-93d7-40d4-867e-8c68437cbad3\") " pod="openstack/nova-api-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.964742 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/401e6738-93d7-40d4-867e-8c68437cbad3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"401e6738-93d7-40d4-867e-8c68437cbad3\") " pod="openstack/nova-api-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.964961 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/401e6738-93d7-40d4-867e-8c68437cbad3-config-data\") pod \"nova-api-0\" (UID: \"401e6738-93d7-40d4-867e-8c68437cbad3\") " pod="openstack/nova-api-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.965855 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/401e6738-93d7-40d4-867e-8c68437cbad3-logs\") pod 
\"nova-api-0\" (UID: \"401e6738-93d7-40d4-867e-8c68437cbad3\") " pod="openstack/nova-api-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.966563 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/401e6738-93d7-40d4-867e-8c68437cbad3-logs\") pod \"nova-api-0\" (UID: \"401e6738-93d7-40d4-867e-8c68437cbad3\") " pod="openstack/nova-api-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.973832 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.195543427 podStartE2EDuration="6.97381093s" podCreationTimestamp="2026-03-13 14:22:59 +0000 UTC" firstStartedPulling="2026-03-13 14:23:00.141377112 +0000 UTC m=+1615.142965351" lastFinishedPulling="2026-03-13 14:23:04.919644615 +0000 UTC m=+1619.921232854" observedRunningTime="2026-03-13 14:23:05.958876653 +0000 UTC m=+1620.960464912" watchObservedRunningTime="2026-03-13 14:23:05.97381093 +0000 UTC m=+1620.975399169" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.977357 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/401e6738-93d7-40d4-867e-8c68437cbad3-config-data\") pod \"nova-api-0\" (UID: \"401e6738-93d7-40d4-867e-8c68437cbad3\") " pod="openstack/nova-api-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.980189 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/401e6738-93d7-40d4-867e-8c68437cbad3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"401e6738-93d7-40d4-867e-8c68437cbad3\") " pod="openstack/nova-api-0" Mar 13 14:23:05 crc kubenswrapper[4898]: I0313 14:23:05.982833 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:06 crc kubenswrapper[4898]: I0313 14:23:06.000777 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z75l\" (UniqueName: \"kubernetes.io/projected/401e6738-93d7-40d4-867e-8c68437cbad3-kube-api-access-2z75l\") pod \"nova-api-0\" (UID: \"401e6738-93d7-40d4-867e-8c68437cbad3\") " pod="openstack/nova-api-0" Mar 13 14:23:06 crc kubenswrapper[4898]: I0313 14:23:06.024853 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 14:23:06 crc kubenswrapper[4898]: I0313 14:23:06.153305 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-76lrv"] Mar 13 14:23:06 crc kubenswrapper[4898]: I0313 14:23:06.495331 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 14:23:06 crc kubenswrapper[4898]: I0313 14:23:06.743094 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" Mar 13 14:23:06 crc kubenswrapper[4898]: I0313 14:23:06.752408 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 14:23:06 crc kubenswrapper[4898]: E0313 14:23:06.760437 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:23:06 crc kubenswrapper[4898]: I0313 14:23:06.782354 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 14:23:06 crc kubenswrapper[4898]: I0313 14:23:06.824504 4898 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-cell1-conductor-db-sync-llbn5"] Mar 13 14:23:06 crc kubenswrapper[4898]: I0313 14:23:06.828435 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-llbn5" Mar 13 14:23:06 crc kubenswrapper[4898]: I0313 14:23:06.843405 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-llbn5"] Mar 13 14:23:06 crc kubenswrapper[4898]: I0313 14:23:06.843790 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 13 14:23:06 crc kubenswrapper[4898]: I0313 14:23:06.843936 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 13 14:23:06 crc kubenswrapper[4898]: I0313 14:23:06.928950 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3fb29588-10df-4b49-a6a6-6a83ddada750","Type":"ContainerStarted","Data":"fbba396a6b51b627bcb4364e72eecb72dbdb1d0ee7df91ec4ad791d87ce837cc"} Mar 13 14:23:06 crc kubenswrapper[4898]: I0313 14:23:06.933565 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-76lrv" event={"ID":"04183e35-79b0-4c76-b538-b5b71299cd92","Type":"ContainerStarted","Data":"74e75aa197a91664f89edd48f01ff5c813660e7743cf922315f4b6a18e19c506"} Mar 13 14:23:06 crc kubenswrapper[4898]: I0313 14:23:06.934937 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-76lrv" event={"ID":"04183e35-79b0-4c76-b538-b5b71299cd92","Type":"ContainerStarted","Data":"8a577eea2ae5147af3f287388f18da3286a949f40aa4ca2fbd11e7f3d483f618"} Mar 13 14:23:06 crc kubenswrapper[4898]: I0313 14:23:06.937864 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"0769b03d-29b4-4519-abc7-408431328276","Type":"ContainerStarted","Data":"8afa6be1221f1010388ebd2f7d552a490a868c166c3d8a6eee8fcc24b9095e1d"} Mar 13 14:23:06 crc kubenswrapper[4898]: I0313 14:23:06.940443 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5edbf12d-a655-4822-98da-9719c131fa14","Type":"ContainerStarted","Data":"7a4153aaa3cfe90175a3e8e41cb9a73f5056ef4defe81a11d1b496b7bcdcc9b6"} Mar 13 14:23:06 crc kubenswrapper[4898]: I0313 14:23:06.978885 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-76lrv" podStartSLOduration=2.978854781 podStartE2EDuration="2.978854781s" podCreationTimestamp="2026-03-13 14:23:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:23:06.95953035 +0000 UTC m=+1621.961118619" watchObservedRunningTime="2026-03-13 14:23:06.978854781 +0000 UTC m=+1621.980443020" Mar 13 14:23:07 crc kubenswrapper[4898]: W0313 14:23:07.016688 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee82d4ec_b565_40b8_b878_2574487d7e9d.slice/crio-7a6f36f785b386fbe1a7acf818da5c2bcd85dcd2bb3cbfac1d54dd77d4a14425 WatchSource:0}: Error finding container 7a6f36f785b386fbe1a7acf818da5c2bcd85dcd2bb3cbfac1d54dd77d4a14425: Status 404 returned error can't find the container with id 7a6f36f785b386fbe1a7acf818da5c2bcd85dcd2bb3cbfac1d54dd77d4a14425 Mar 13 14:23:07 crc kubenswrapper[4898]: I0313 14:23:07.025475 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfnmd\" (UniqueName: \"kubernetes.io/projected/bff908e4-09f4-490b-9b9c-ef65c6224eeb-kube-api-access-nfnmd\") pod \"nova-cell1-conductor-db-sync-llbn5\" (UID: \"bff908e4-09f4-490b-9b9c-ef65c6224eeb\") " 
pod="openstack/nova-cell1-conductor-db-sync-llbn5" Mar 13 14:23:07 crc kubenswrapper[4898]: I0313 14:23:07.025662 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bff908e4-09f4-490b-9b9c-ef65c6224eeb-config-data\") pod \"nova-cell1-conductor-db-sync-llbn5\" (UID: \"bff908e4-09f4-490b-9b9c-ef65c6224eeb\") " pod="openstack/nova-cell1-conductor-db-sync-llbn5" Mar 13 14:23:07 crc kubenswrapper[4898]: I0313 14:23:07.025725 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bff908e4-09f4-490b-9b9c-ef65c6224eeb-scripts\") pod \"nova-cell1-conductor-db-sync-llbn5\" (UID: \"bff908e4-09f4-490b-9b9c-ef65c6224eeb\") " pod="openstack/nova-cell1-conductor-db-sync-llbn5" Mar 13 14:23:07 crc kubenswrapper[4898]: I0313 14:23:07.026058 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bff908e4-09f4-490b-9b9c-ef65c6224eeb-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-llbn5\" (UID: \"bff908e4-09f4-490b-9b9c-ef65c6224eeb\") " pod="openstack/nova-cell1-conductor-db-sync-llbn5" Mar 13 14:23:07 crc kubenswrapper[4898]: W0313 14:23:07.060875 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod401e6738_93d7_40d4_867e_8c68437cbad3.slice/crio-33c80887acf46993b60950d3ec27771c17d7d31947de5c0902c67049c9927696 WatchSource:0}: Error finding container 33c80887acf46993b60950d3ec27771c17d7d31947de5c0902c67049c9927696: Status 404 returned error can't find the container with id 33c80887acf46993b60950d3ec27771c17d7d31947de5c0902c67049c9927696 Mar 13 14:23:07 crc kubenswrapper[4898]: I0313 14:23:07.074536 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-4rc67"] Mar 
13 14:23:07 crc kubenswrapper[4898]: I0313 14:23:07.092547 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 14:23:07 crc kubenswrapper[4898]: I0313 14:23:07.133416 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bff908e4-09f4-490b-9b9c-ef65c6224eeb-scripts\") pod \"nova-cell1-conductor-db-sync-llbn5\" (UID: \"bff908e4-09f4-490b-9b9c-ef65c6224eeb\") " pod="openstack/nova-cell1-conductor-db-sync-llbn5" Mar 13 14:23:07 crc kubenswrapper[4898]: I0313 14:23:07.136264 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bff908e4-09f4-490b-9b9c-ef65c6224eeb-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-llbn5\" (UID: \"bff908e4-09f4-490b-9b9c-ef65c6224eeb\") " pod="openstack/nova-cell1-conductor-db-sync-llbn5" Mar 13 14:23:07 crc kubenswrapper[4898]: I0313 14:23:07.139956 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bff908e4-09f4-490b-9b9c-ef65c6224eeb-scripts\") pod \"nova-cell1-conductor-db-sync-llbn5\" (UID: \"bff908e4-09f4-490b-9b9c-ef65c6224eeb\") " pod="openstack/nova-cell1-conductor-db-sync-llbn5" Mar 13 14:23:07 crc kubenswrapper[4898]: I0313 14:23:07.140225 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bff908e4-09f4-490b-9b9c-ef65c6224eeb-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-llbn5\" (UID: \"bff908e4-09f4-490b-9b9c-ef65c6224eeb\") " pod="openstack/nova-cell1-conductor-db-sync-llbn5" Mar 13 14:23:07 crc kubenswrapper[4898]: I0313 14:23:07.140687 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfnmd\" (UniqueName: \"kubernetes.io/projected/bff908e4-09f4-490b-9b9c-ef65c6224eeb-kube-api-access-nfnmd\") pod 
\"nova-cell1-conductor-db-sync-llbn5\" (UID: \"bff908e4-09f4-490b-9b9c-ef65c6224eeb\") " pod="openstack/nova-cell1-conductor-db-sync-llbn5" Mar 13 14:23:07 crc kubenswrapper[4898]: I0313 14:23:07.140948 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bff908e4-09f4-490b-9b9c-ef65c6224eeb-config-data\") pod \"nova-cell1-conductor-db-sync-llbn5\" (UID: \"bff908e4-09f4-490b-9b9c-ef65c6224eeb\") " pod="openstack/nova-cell1-conductor-db-sync-llbn5" Mar 13 14:23:07 crc kubenswrapper[4898]: I0313 14:23:07.149231 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bff908e4-09f4-490b-9b9c-ef65c6224eeb-config-data\") pod \"nova-cell1-conductor-db-sync-llbn5\" (UID: \"bff908e4-09f4-490b-9b9c-ef65c6224eeb\") " pod="openstack/nova-cell1-conductor-db-sync-llbn5" Mar 13 14:23:07 crc kubenswrapper[4898]: I0313 14:23:07.164474 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfnmd\" (UniqueName: \"kubernetes.io/projected/bff908e4-09f4-490b-9b9c-ef65c6224eeb-kube-api-access-nfnmd\") pod \"nova-cell1-conductor-db-sync-llbn5\" (UID: \"bff908e4-09f4-490b-9b9c-ef65c6224eeb\") " pod="openstack/nova-cell1-conductor-db-sync-llbn5" Mar 13 14:23:07 crc kubenswrapper[4898]: I0313 14:23:07.176345 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-llbn5" Mar 13 14:23:07 crc kubenswrapper[4898]: I0313 14:23:07.732264 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-llbn5"] Mar 13 14:23:07 crc kubenswrapper[4898]: I0313 14:23:07.970936 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-llbn5" event={"ID":"bff908e4-09f4-490b-9b9c-ef65c6224eeb","Type":"ContainerStarted","Data":"bf25202e858a5aee252a5d091c6efe401cd4b61d92f156b22eb7c6a08833446f"} Mar 13 14:23:07 crc kubenswrapper[4898]: I0313 14:23:07.972785 4898 generic.go:334] "Generic (PLEG): container finished" podID="ee82d4ec-b565-40b8-b878-2574487d7e9d" containerID="ceacbe8778fcc11e62876d98d598259e725fa8302adf85af1d1ddc9df4d62ff6" exitCode=0 Mar 13 14:23:07 crc kubenswrapper[4898]: I0313 14:23:07.972958 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-4rc67" event={"ID":"ee82d4ec-b565-40b8-b878-2574487d7e9d","Type":"ContainerDied","Data":"ceacbe8778fcc11e62876d98d598259e725fa8302adf85af1d1ddc9df4d62ff6"} Mar 13 14:23:07 crc kubenswrapper[4898]: I0313 14:23:07.973047 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-4rc67" event={"ID":"ee82d4ec-b565-40b8-b878-2574487d7e9d","Type":"ContainerStarted","Data":"7a6f36f785b386fbe1a7acf818da5c2bcd85dcd2bb3cbfac1d54dd77d4a14425"} Mar 13 14:23:07 crc kubenswrapper[4898]: I0313 14:23:07.982032 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"401e6738-93d7-40d4-867e-8c68437cbad3","Type":"ContainerStarted","Data":"33c80887acf46993b60950d3ec27771c17d7d31947de5c0902c67049c9927696"} Mar 13 14:23:09 crc kubenswrapper[4898]: I0313 14:23:09.010505 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-llbn5" 
event={"ID":"bff908e4-09f4-490b-9b9c-ef65c6224eeb","Type":"ContainerStarted","Data":"7e9f2307a91699c726a3f93d044663fb844450acd6da5dd38c51549451b97bc8"} Mar 13 14:23:09 crc kubenswrapper[4898]: I0313 14:23:09.014917 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-4rc67" event={"ID":"ee82d4ec-b565-40b8-b878-2574487d7e9d","Type":"ContainerStarted","Data":"7d83563b664dd524f060763bd5deadd7009b47fedbc88f53844517f4c00a64ea"} Mar 13 14:23:09 crc kubenswrapper[4898]: I0313 14:23:09.015351 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:09 crc kubenswrapper[4898]: I0313 14:23:09.050770 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-llbn5" podStartSLOduration=3.050742576 podStartE2EDuration="3.050742576s" podCreationTimestamp="2026-03-13 14:23:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:23:09.032934234 +0000 UTC m=+1624.034522483" watchObservedRunningTime="2026-03-13 14:23:09.050742576 +0000 UTC m=+1624.052330825" Mar 13 14:23:09 crc kubenswrapper[4898]: I0313 14:23:09.085233 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9b86998b5-4rc67" podStartSLOduration=4.085208771 podStartE2EDuration="4.085208771s" podCreationTimestamp="2026-03-13 14:23:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:23:09.058865767 +0000 UTC m=+1624.060454026" watchObservedRunningTime="2026-03-13 14:23:09.085208771 +0000 UTC m=+1624.086797010" Mar 13 14:23:09 crc kubenswrapper[4898]: I0313 14:23:09.128549 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 14:23:09 crc kubenswrapper[4898]: I0313 
14:23:09.152731 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 14:23:13 crc kubenswrapper[4898]: I0313 14:23:13.111571 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3fb29588-10df-4b49-a6a6-6a83ddada750","Type":"ContainerStarted","Data":"b97169c17bbe4e26153e2ab8b910eb60fbafe00f2669ecd0f29ce4eb8dba08e0"} Mar 13 14:23:13 crc kubenswrapper[4898]: I0313 14:23:13.112293 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3fb29588-10df-4b49-a6a6-6a83ddada750","Type":"ContainerStarted","Data":"4d8bf97f6df1c8578abf9d8b2ff9a16d6f36d0a198628e241eb7ec672c4d77c5"} Mar 13 14:23:13 crc kubenswrapper[4898]: I0313 14:23:13.112452 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3fb29588-10df-4b49-a6a6-6a83ddada750" containerName="nova-metadata-log" containerID="cri-o://4d8bf97f6df1c8578abf9d8b2ff9a16d6f36d0a198628e241eb7ec672c4d77c5" gracePeriod=30 Mar 13 14:23:13 crc kubenswrapper[4898]: I0313 14:23:13.113027 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3fb29588-10df-4b49-a6a6-6a83ddada750" containerName="nova-metadata-metadata" containerID="cri-o://b97169c17bbe4e26153e2ab8b910eb60fbafe00f2669ecd0f29ce4eb8dba08e0" gracePeriod=30 Mar 13 14:23:13 crc kubenswrapper[4898]: I0313 14:23:13.121638 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0769b03d-29b4-4519-abc7-408431328276","Type":"ContainerStarted","Data":"a0be402bfe00c68e23ab47a73d2d201566aad9d451ecaff23e8fc2d99923064b"} Mar 13 14:23:13 crc kubenswrapper[4898]: I0313 14:23:13.127265 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"5edbf12d-a655-4822-98da-9719c131fa14","Type":"ContainerStarted","Data":"132b3a0f7e5ae56889214fa156373da26bc4a38bf3f78bbfa0992ef5f518c430"} Mar 13 14:23:13 crc kubenswrapper[4898]: I0313 14:23:13.127323 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="5edbf12d-a655-4822-98da-9719c131fa14" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://132b3a0f7e5ae56889214fa156373da26bc4a38bf3f78bbfa0992ef5f518c430" gracePeriod=30 Mar 13 14:23:13 crc kubenswrapper[4898]: I0313 14:23:13.129098 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"401e6738-93d7-40d4-867e-8c68437cbad3","Type":"ContainerStarted","Data":"6a3356dec1914d7176270b5cf94ad8c6d3be50dc70db5f963643ba9ceaa22838"} Mar 13 14:23:13 crc kubenswrapper[4898]: I0313 14:23:13.129353 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"401e6738-93d7-40d4-867e-8c68437cbad3","Type":"ContainerStarted","Data":"4172774e3d35f2447bfbdab8c2932d0b6b6936e1f5596cbc2f41a9105e3cedbe"} Mar 13 14:23:13 crc kubenswrapper[4898]: I0313 14:23:13.164216 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.972854795 podStartE2EDuration="8.164195859s" podCreationTimestamp="2026-03-13 14:23:05 +0000 UTC" firstStartedPulling="2026-03-13 14:23:06.769016824 +0000 UTC m=+1621.770605063" lastFinishedPulling="2026-03-13 14:23:11.960357888 +0000 UTC m=+1626.961946127" observedRunningTime="2026-03-13 14:23:13.142525517 +0000 UTC m=+1628.144113766" watchObservedRunningTime="2026-03-13 14:23:13.164195859 +0000 UTC m=+1628.165784098" Mar 13 14:23:13 crc kubenswrapper[4898]: I0313 14:23:13.175488 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.741658133 podStartE2EDuration="9.175464382s" 
podCreationTimestamp="2026-03-13 14:23:04 +0000 UTC" firstStartedPulling="2026-03-13 14:23:06.515592865 +0000 UTC m=+1621.517181104" lastFinishedPulling="2026-03-13 14:23:11.949399114 +0000 UTC m=+1626.950987353" observedRunningTime="2026-03-13 14:23:13.165838842 +0000 UTC m=+1628.167427091" watchObservedRunningTime="2026-03-13 14:23:13.175464382 +0000 UTC m=+1628.177052621" Mar 13 14:23:13 crc kubenswrapper[4898]: I0313 14:23:13.195353 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.003821889 podStartE2EDuration="8.195335258s" podCreationTimestamp="2026-03-13 14:23:05 +0000 UTC" firstStartedPulling="2026-03-13 14:23:06.757227278 +0000 UTC m=+1621.758815517" lastFinishedPulling="2026-03-13 14:23:11.948740647 +0000 UTC m=+1626.950328886" observedRunningTime="2026-03-13 14:23:13.192153365 +0000 UTC m=+1628.193741634" watchObservedRunningTime="2026-03-13 14:23:13.195335258 +0000 UTC m=+1628.196923497" Mar 13 14:23:13 crc kubenswrapper[4898]: I0313 14:23:13.248936 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.367682385 podStartE2EDuration="8.248912339s" podCreationTimestamp="2026-03-13 14:23:05 +0000 UTC" firstStartedPulling="2026-03-13 14:23:07.070136501 +0000 UTC m=+1622.071724740" lastFinishedPulling="2026-03-13 14:23:11.951366455 +0000 UTC m=+1626.952954694" observedRunningTime="2026-03-13 14:23:13.209110435 +0000 UTC m=+1628.210698684" watchObservedRunningTime="2026-03-13 14:23:13.248912339 +0000 UTC m=+1628.250500578" Mar 13 14:23:14 crc kubenswrapper[4898]: I0313 14:23:14.145066 4898 generic.go:334] "Generic (PLEG): container finished" podID="3fb29588-10df-4b49-a6a6-6a83ddada750" containerID="b97169c17bbe4e26153e2ab8b910eb60fbafe00f2669ecd0f29ce4eb8dba08e0" exitCode=0 Mar 13 14:23:14 crc kubenswrapper[4898]: I0313 14:23:14.145396 4898 generic.go:334] "Generic (PLEG): container finished" 
podID="3fb29588-10df-4b49-a6a6-6a83ddada750" containerID="4d8bf97f6df1c8578abf9d8b2ff9a16d6f36d0a198628e241eb7ec672c4d77c5" exitCode=143 Mar 13 14:23:14 crc kubenswrapper[4898]: I0313 14:23:14.145672 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3fb29588-10df-4b49-a6a6-6a83ddada750","Type":"ContainerDied","Data":"b97169c17bbe4e26153e2ab8b910eb60fbafe00f2669ecd0f29ce4eb8dba08e0"} Mar 13 14:23:14 crc kubenswrapper[4898]: I0313 14:23:14.145728 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3fb29588-10df-4b49-a6a6-6a83ddada750","Type":"ContainerDied","Data":"4d8bf97f6df1c8578abf9d8b2ff9a16d6f36d0a198628e241eb7ec672c4d77c5"} Mar 13 14:23:14 crc kubenswrapper[4898]: I0313 14:23:14.145766 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3fb29588-10df-4b49-a6a6-6a83ddada750","Type":"ContainerDied","Data":"fbba396a6b51b627bcb4364e72eecb72dbdb1d0ee7df91ec4ad791d87ce837cc"} Mar 13 14:23:14 crc kubenswrapper[4898]: I0313 14:23:14.145779 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbba396a6b51b627bcb4364e72eecb72dbdb1d0ee7df91ec4ad791d87ce837cc" Mar 13 14:23:14 crc kubenswrapper[4898]: I0313 14:23:14.204835 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 14:23:14 crc kubenswrapper[4898]: I0313 14:23:14.243327 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dd5pp\" (UniqueName: \"kubernetes.io/projected/3fb29588-10df-4b49-a6a6-6a83ddada750-kube-api-access-dd5pp\") pod \"3fb29588-10df-4b49-a6a6-6a83ddada750\" (UID: \"3fb29588-10df-4b49-a6a6-6a83ddada750\") " Mar 13 14:23:14 crc kubenswrapper[4898]: I0313 14:23:14.243433 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fb29588-10df-4b49-a6a6-6a83ddada750-combined-ca-bundle\") pod \"3fb29588-10df-4b49-a6a6-6a83ddada750\" (UID: \"3fb29588-10df-4b49-a6a6-6a83ddada750\") " Mar 13 14:23:14 crc kubenswrapper[4898]: I0313 14:23:14.243461 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fb29588-10df-4b49-a6a6-6a83ddada750-config-data\") pod \"3fb29588-10df-4b49-a6a6-6a83ddada750\" (UID: \"3fb29588-10df-4b49-a6a6-6a83ddada750\") " Mar 13 14:23:14 crc kubenswrapper[4898]: I0313 14:23:14.243503 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3fb29588-10df-4b49-a6a6-6a83ddada750-logs\") pod \"3fb29588-10df-4b49-a6a6-6a83ddada750\" (UID: \"3fb29588-10df-4b49-a6a6-6a83ddada750\") " Mar 13 14:23:14 crc kubenswrapper[4898]: I0313 14:23:14.250973 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fb29588-10df-4b49-a6a6-6a83ddada750-logs" (OuterVolumeSpecName: "logs") pod "3fb29588-10df-4b49-a6a6-6a83ddada750" (UID: "3fb29588-10df-4b49-a6a6-6a83ddada750"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:23:14 crc kubenswrapper[4898]: I0313 14:23:14.270141 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fb29588-10df-4b49-a6a6-6a83ddada750-kube-api-access-dd5pp" (OuterVolumeSpecName: "kube-api-access-dd5pp") pod "3fb29588-10df-4b49-a6a6-6a83ddada750" (UID: "3fb29588-10df-4b49-a6a6-6a83ddada750"). InnerVolumeSpecName "kube-api-access-dd5pp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:23:14 crc kubenswrapper[4898]: I0313 14:23:14.309422 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fb29588-10df-4b49-a6a6-6a83ddada750-config-data" (OuterVolumeSpecName: "config-data") pod "3fb29588-10df-4b49-a6a6-6a83ddada750" (UID: "3fb29588-10df-4b49-a6a6-6a83ddada750"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:14 crc kubenswrapper[4898]: I0313 14:23:14.309444 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fb29588-10df-4b49-a6a6-6a83ddada750-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3fb29588-10df-4b49-a6a6-6a83ddada750" (UID: "3fb29588-10df-4b49-a6a6-6a83ddada750"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:14 crc kubenswrapper[4898]: I0313 14:23:14.347315 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dd5pp\" (UniqueName: \"kubernetes.io/projected/3fb29588-10df-4b49-a6a6-6a83ddada750-kube-api-access-dd5pp\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:14 crc kubenswrapper[4898]: I0313 14:23:14.347363 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fb29588-10df-4b49-a6a6-6a83ddada750-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:14 crc kubenswrapper[4898]: I0313 14:23:14.347375 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fb29588-10df-4b49-a6a6-6a83ddada750-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:14 crc kubenswrapper[4898]: I0313 14:23:14.347388 4898 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3fb29588-10df-4b49-a6a6-6a83ddada750-logs\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.020916 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.021543 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" containerName="ceilometer-central-agent" containerID="cri-o://4bb36715efc02b13f9a2428d735fb39dbbe5a666eac1302af323de3923739e47" gracePeriod=30 Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.021661 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" containerName="sg-core" containerID="cri-o://78de73e116b8c904046e3b2e47974a2196e68f801e439ea9db14e62605f5dfde" gracePeriod=30 Mar 13 14:23:15 crc kubenswrapper[4898]: 
I0313 14:23:15.021700 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" containerName="ceilometer-notification-agent" containerID="cri-o://e7eb04ad73d23fba07486739b6ad6f87229d6c54e9915ba0f9d02034ddc99c49" gracePeriod=30 Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.021661 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" containerName="proxy-httpd" containerID="cri-o://e1d215889c17e15d0414497f1d056c6cbb6a9ad029cf99e79cd2f78d36843f9a" gracePeriod=30 Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.034749 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.164888 4898 generic.go:334] "Generic (PLEG): container finished" podID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" containerID="78de73e116b8c904046e3b2e47974a2196e68f801e439ea9db14e62605f5dfde" exitCode=2 Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.164941 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6","Type":"ContainerDied","Data":"78de73e116b8c904046e3b2e47974a2196e68f801e439ea9db14e62605f5dfde"} Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.165008 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.212426 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.230055 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.310367 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 13 14:23:15 crc kubenswrapper[4898]: E0313 14:23:15.314720 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fb29588-10df-4b49-a6a6-6a83ddada750" containerName="nova-metadata-metadata" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.314751 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fb29588-10df-4b49-a6a6-6a83ddada750" containerName="nova-metadata-metadata" Mar 13 14:23:15 crc kubenswrapper[4898]: E0313 14:23:15.314805 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fb29588-10df-4b49-a6a6-6a83ddada750" containerName="nova-metadata-log" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.314812 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fb29588-10df-4b49-a6a6-6a83ddada750" containerName="nova-metadata-log" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.320234 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fb29588-10df-4b49-a6a6-6a83ddada750" containerName="nova-metadata-log" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.320295 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fb29588-10df-4b49-a6a6-6a83ddada750" containerName="nova-metadata-metadata" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.333783 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.341372 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.342704 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.358908 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.477413 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f31df8cc-85c8-4626-ab54-1a93d291f02d-logs\") pod \"nova-metadata-0\" (UID: \"f31df8cc-85c8-4626-ab54-1a93d291f02d\") " pod="openstack/nova-metadata-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.477814 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f31df8cc-85c8-4626-ab54-1a93d291f02d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f31df8cc-85c8-4626-ab54-1a93d291f02d\") " pod="openstack/nova-metadata-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.477876 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvb8h\" (UniqueName: \"kubernetes.io/projected/f31df8cc-85c8-4626-ab54-1a93d291f02d-kube-api-access-pvb8h\") pod \"nova-metadata-0\" (UID: \"f31df8cc-85c8-4626-ab54-1a93d291f02d\") " pod="openstack/nova-metadata-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.477927 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f31df8cc-85c8-4626-ab54-1a93d291f02d-config-data\") pod \"nova-metadata-0\" (UID: 
\"f31df8cc-85c8-4626-ab54-1a93d291f02d\") " pod="openstack/nova-metadata-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.478090 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f31df8cc-85c8-4626-ab54-1a93d291f02d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f31df8cc-85c8-4626-ab54-1a93d291f02d\") " pod="openstack/nova-metadata-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.581593 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f31df8cc-85c8-4626-ab54-1a93d291f02d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f31df8cc-85c8-4626-ab54-1a93d291f02d\") " pod="openstack/nova-metadata-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.581914 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f31df8cc-85c8-4626-ab54-1a93d291f02d-logs\") pod \"nova-metadata-0\" (UID: \"f31df8cc-85c8-4626-ab54-1a93d291f02d\") " pod="openstack/nova-metadata-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.581989 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f31df8cc-85c8-4626-ab54-1a93d291f02d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f31df8cc-85c8-4626-ab54-1a93d291f02d\") " pod="openstack/nova-metadata-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.582055 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvb8h\" (UniqueName: \"kubernetes.io/projected/f31df8cc-85c8-4626-ab54-1a93d291f02d-kube-api-access-pvb8h\") pod \"nova-metadata-0\" (UID: \"f31df8cc-85c8-4626-ab54-1a93d291f02d\") " pod="openstack/nova-metadata-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 
14:23:15.582087 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f31df8cc-85c8-4626-ab54-1a93d291f02d-config-data\") pod \"nova-metadata-0\" (UID: \"f31df8cc-85c8-4626-ab54-1a93d291f02d\") " pod="openstack/nova-metadata-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.582307 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f31df8cc-85c8-4626-ab54-1a93d291f02d-logs\") pod \"nova-metadata-0\" (UID: \"f31df8cc-85c8-4626-ab54-1a93d291f02d\") " pod="openstack/nova-metadata-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.589836 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f31df8cc-85c8-4626-ab54-1a93d291f02d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f31df8cc-85c8-4626-ab54-1a93d291f02d\") " pod="openstack/nova-metadata-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.590800 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f31df8cc-85c8-4626-ab54-1a93d291f02d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f31df8cc-85c8-4626-ab54-1a93d291f02d\") " pod="openstack/nova-metadata-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.591479 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f31df8cc-85c8-4626-ab54-1a93d291f02d-config-data\") pod \"nova-metadata-0\" (UID: \"f31df8cc-85c8-4626-ab54-1a93d291f02d\") " pod="openstack/nova-metadata-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.603605 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvb8h\" (UniqueName: \"kubernetes.io/projected/f31df8cc-85c8-4626-ab54-1a93d291f02d-kube-api-access-pvb8h\") pod 
\"nova-metadata-0\" (UID: \"f31df8cc-85c8-4626-ab54-1a93d291f02d\") " pod="openstack/nova-metadata-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.634975 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.672027 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="a8312dc9-a2b4-4ee6-b34f-cb984c14ad21" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.200:9292/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.672370 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="a8312dc9-a2b4-4ee6-b34f-cb984c14ad21" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.200:9292/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.694800 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.785341 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fb29588-10df-4b49-a6a6-6a83ddada750" path="/var/lib/kubelet/pods/3fb29588-10df-4b49-a6a6-6a83ddada750/volumes" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.807681 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.808115 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.865111 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 13 14:23:15 crc kubenswrapper[4898]: I0313 14:23:15.988187 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9b86998b5-4rc67" Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.026164 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.026200 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.090492 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-xntfr"] Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.090772 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" podUID="08964a7d-6cae-4d8d-8dc7-8828bb55c6b6" containerName="dnsmasq-dns" containerID="cri-o://12cbc4c8fa51ad14fc2097d92198ad03764ae0a7349dafd16f0ac9e66b1aa21a" gracePeriod=10 Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.256455 4898 generic.go:334] "Generic (PLEG): container finished" 
podID="e53d1b61-e0c8-4c10-85bf-1c0f67009a24" containerID="c6cbd243a4ab0ae3ee88d2b14b07e0b9c8fda594949edb73fc92248b8f25ddf9" exitCode=137 Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.256550 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-759d64ffd4-kzp67" event={"ID":"e53d1b61-e0c8-4c10-85bf-1c0f67009a24","Type":"ContainerDied","Data":"c6cbd243a4ab0ae3ee88d2b14b07e0b9c8fda594949edb73fc92248b8f25ddf9"} Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.256575 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-759d64ffd4-kzp67" event={"ID":"e53d1b61-e0c8-4c10-85bf-1c0f67009a24","Type":"ContainerDied","Data":"4e540b110f512e47246a7e1e8d332b0a3a04ba375a44ae46bbb913a91b51ab5a"} Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.256599 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e540b110f512e47246a7e1e8d332b0a3a04ba375a44ae46bbb913a91b51ab5a" Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.278167 4898 generic.go:334] "Generic (PLEG): container finished" podID="ad3d61d7-d777-4115-92c7-e4e3125c5260" containerID="954199ed87793fe013823cc99558f408a84d0a4c4745073ecc940b8eedc022cd" exitCode=137 Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.278247 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-9b6c99f6d-7zgm5" event={"ID":"ad3d61d7-d777-4115-92c7-e4e3125c5260","Type":"ContainerDied","Data":"954199ed87793fe013823cc99558f408a84d0a4c4745073ecc940b8eedc022cd"} Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.317399 4898 generic.go:334] "Generic (PLEG): container finished" podID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" containerID="e1d215889c17e15d0414497f1d056c6cbb6a9ad029cf99e79cd2f78d36843f9a" exitCode=0 Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.317425 4898 generic.go:334] "Generic (PLEG): container finished" podID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" 
containerID="e7eb04ad73d23fba07486739b6ad6f87229d6c54e9915ba0f9d02034ddc99c49" exitCode=0 Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.317433 4898 generic.go:334] "Generic (PLEG): container finished" podID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" containerID="4bb36715efc02b13f9a2428d735fb39dbbe5a666eac1302af323de3923739e47" exitCode=0 Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.317824 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6","Type":"ContainerDied","Data":"e1d215889c17e15d0414497f1d056c6cbb6a9ad029cf99e79cd2f78d36843f9a"} Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.317849 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6","Type":"ContainerDied","Data":"e7eb04ad73d23fba07486739b6ad6f87229d6c54e9915ba0f9d02034ddc99c49"} Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.317859 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6","Type":"ContainerDied","Data":"4bb36715efc02b13f9a2428d735fb39dbbe5a666eac1302af323de3923739e47"} Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.374160 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.402267 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-759d64ffd4-kzp67" Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.539726 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-combined-ca-bundle\") pod \"e53d1b61-e0c8-4c10-85bf-1c0f67009a24\" (UID: \"e53d1b61-e0c8-4c10-85bf-1c0f67009a24\") " Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.539842 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-config-data-custom\") pod \"e53d1b61-e0c8-4c10-85bf-1c0f67009a24\" (UID: \"e53d1b61-e0c8-4c10-85bf-1c0f67009a24\") " Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.539862 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-config-data\") pod \"e53d1b61-e0c8-4c10-85bf-1c0f67009a24\" (UID: \"e53d1b61-e0c8-4c10-85bf-1c0f67009a24\") " Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.539981 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cg2l6\" (UniqueName: \"kubernetes.io/projected/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-kube-api-access-cg2l6\") pod \"e53d1b61-e0c8-4c10-85bf-1c0f67009a24\" (UID: \"e53d1b61-e0c8-4c10-85bf-1c0f67009a24\") " Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.547082 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e53d1b61-e0c8-4c10-85bf-1c0f67009a24" (UID: "e53d1b61-e0c8-4c10-85bf-1c0f67009a24"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.547267 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-kube-api-access-cg2l6" (OuterVolumeSpecName: "kube-api-access-cg2l6") pod "e53d1b61-e0c8-4c10-85bf-1c0f67009a24" (UID: "e53d1b61-e0c8-4c10-85bf-1c0f67009a24"). InnerVolumeSpecName "kube-api-access-cg2l6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.612215 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e53d1b61-e0c8-4c10-85bf-1c0f67009a24" (UID: "e53d1b61-e0c8-4c10-85bf-1c0f67009a24"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.628645 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-config-data" (OuterVolumeSpecName: "config-data") pod "e53d1b61-e0c8-4c10-85bf-1c0f67009a24" (UID: "e53d1b61-e0c8-4c10-85bf-1c0f67009a24"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.643326 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.643360 4898 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.643372 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.643382 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cg2l6\" (UniqueName: \"kubernetes.io/projected/e53d1b61-e0c8-4c10-85bf-1c0f67009a24-kube-api-access-cg2l6\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.693857 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-9b6c99f6d-7zgm5" Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.849806 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad3d61d7-d777-4115-92c7-e4e3125c5260-config-data-custom\") pod \"ad3d61d7-d777-4115-92c7-e4e3125c5260\" (UID: \"ad3d61d7-d777-4115-92c7-e4e3125c5260\") " Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.849994 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjv6n\" (UniqueName: \"kubernetes.io/projected/ad3d61d7-d777-4115-92c7-e4e3125c5260-kube-api-access-bjv6n\") pod \"ad3d61d7-d777-4115-92c7-e4e3125c5260\" (UID: \"ad3d61d7-d777-4115-92c7-e4e3125c5260\") " Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.850054 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad3d61d7-d777-4115-92c7-e4e3125c5260-combined-ca-bundle\") pod \"ad3d61d7-d777-4115-92c7-e4e3125c5260\" (UID: \"ad3d61d7-d777-4115-92c7-e4e3125c5260\") " Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.850113 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad3d61d7-d777-4115-92c7-e4e3125c5260-config-data\") pod \"ad3d61d7-d777-4115-92c7-e4e3125c5260\" (UID: \"ad3d61d7-d777-4115-92c7-e4e3125c5260\") " Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.856783 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad3d61d7-d777-4115-92c7-e4e3125c5260-kube-api-access-bjv6n" (OuterVolumeSpecName: "kube-api-access-bjv6n") pod "ad3d61d7-d777-4115-92c7-e4e3125c5260" (UID: "ad3d61d7-d777-4115-92c7-e4e3125c5260"). InnerVolumeSpecName "kube-api-access-bjv6n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.857370 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad3d61d7-d777-4115-92c7-e4e3125c5260-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ad3d61d7-d777-4115-92c7-e4e3125c5260" (UID: "ad3d61d7-d777-4115-92c7-e4e3125c5260"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.928108 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad3d61d7-d777-4115-92c7-e4e3125c5260-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad3d61d7-d777-4115-92c7-e4e3125c5260" (UID: "ad3d61d7-d777-4115-92c7-e4e3125c5260"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.952937 4898 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad3d61d7-d777-4115-92c7-e4e3125c5260-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.952976 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjv6n\" (UniqueName: \"kubernetes.io/projected/ad3d61d7-d777-4115-92c7-e4e3125c5260-kube-api-access-bjv6n\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:16 crc kubenswrapper[4898]: I0313 14:23:16.952991 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad3d61d7-d777-4115-92c7-e4e3125c5260-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.007157 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.033914 4898 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad3d61d7-d777-4115-92c7-e4e3125c5260-config-data" (OuterVolumeSpecName: "config-data") pod "ad3d61d7-d777-4115-92c7-e4e3125c5260" (UID: "ad3d61d7-d777-4115-92c7-e4e3125c5260"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.055570 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad3d61d7-d777-4115-92c7-e4e3125c5260-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.110051 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="401e6738-93d7-40d4-867e-8c68437cbad3" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.253:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.110288 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="401e6738-93d7-40d4-867e-8c68437cbad3" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.253:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.159297 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.271330 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zd2mk\" (UniqueName: \"kubernetes.io/projected/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-kube-api-access-zd2mk\") pod \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") " Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.271382 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-config\") pod \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") " Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.271495 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-ovsdbserver-sb\") pod \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") " Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.271546 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-dns-svc\") pod \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") " Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.271589 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-ovsdbserver-nb\") pod \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") " Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.271624 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-dns-swift-storage-0\") pod \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\" (UID: \"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6\") " Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.286105 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-kube-api-access-zd2mk" (OuterVolumeSpecName: "kube-api-access-zd2mk") pod "08964a7d-6cae-4d8d-8dc7-8828bb55c6b6" (UID: "08964a7d-6cae-4d8d-8dc7-8828bb55c6b6"). InnerVolumeSpecName "kube-api-access-zd2mk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.292970 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.389611 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f31df8cc-85c8-4626-ab54-1a93d291f02d","Type":"ContainerStarted","Data":"113e1d3b19705c253801494357ca106e72aa1f2de77f8992627258135742aa53"} Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.391688 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-9b6c99f6d-7zgm5" event={"ID":"ad3d61d7-d777-4115-92c7-e4e3125c5260","Type":"ContainerDied","Data":"9c7395afa41324a0f82874b3c28b6ce2289ed61f6812ec39a8b7176eb1dd6a99"} Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.391754 4898 scope.go:117] "RemoveContainer" containerID="954199ed87793fe013823cc99558f408a84d0a4c4745073ecc940b8eedc022cd" Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.391943 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-9b6c99f6d-7zgm5" Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.404242 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zd2mk\" (UniqueName: \"kubernetes.io/projected/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-kube-api-access-zd2mk\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.471527 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6","Type":"ContainerDied","Data":"5815a6d36d69a2209375bea643fadce77792eface8bf7ea1c4a47338f6cf2f29"} Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.471682 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.493468 4898 generic.go:334] "Generic (PLEG): container finished" podID="08964a7d-6cae-4d8d-8dc7-8828bb55c6b6" containerID="12cbc4c8fa51ad14fc2097d92198ad03764ae0a7349dafd16f0ac9e66b1aa21a" exitCode=0 Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.494617 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.495359 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" event={"ID":"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6","Type":"ContainerDied","Data":"12cbc4c8fa51ad14fc2097d92198ad03764ae0a7349dafd16f0ac9e66b1aa21a"} Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.495391 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-xntfr" event={"ID":"08964a7d-6cae-4d8d-8dc7-8828bb55c6b6","Type":"ContainerDied","Data":"c39986f2a0c08da0dd84aef4031cbde75ea3fecd89eca753618403b386b96b49"} Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.495457 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-759d64ffd4-kzp67" Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.510957 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-combined-ca-bundle\") pod \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.511083 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-log-httpd\") pod \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.511144 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-sg-core-conf-yaml\") pod \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 
14:23:17.511593 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" (UID: "e179d9f8-0775-4bde-9ac9-f5a4f6919fd6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.513128 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-run-httpd\") pod \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.513163 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-scripts\") pod \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.513187 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsfjl\" (UniqueName: \"kubernetes.io/projected/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-kube-api-access-bsfjl\") pod \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.513262 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-config-data\") pod \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\" (UID: \"e179d9f8-0775-4bde-9ac9-f5a4f6919fd6\") " Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.513646 4898 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-log-httpd\") 
on node \"crc\" DevicePath \"\"" Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.514213 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" (UID: "e179d9f8-0775-4bde-9ac9-f5a4f6919fd6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.585555 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-scripts" (OuterVolumeSpecName: "scripts") pod "e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" (UID: "e179d9f8-0775-4bde-9ac9-f5a4f6919fd6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.618639 4898 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.618683 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.619370 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-kube-api-access-bsfjl" (OuterVolumeSpecName: "kube-api-access-bsfjl") pod "e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" (UID: "e179d9f8-0775-4bde-9ac9-f5a4f6919fd6"). InnerVolumeSpecName "kube-api-access-bsfjl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.726473 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsfjl\" (UniqueName: \"kubernetes.io/projected/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-kube-api-access-bsfjl\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.774569 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "08964a7d-6cae-4d8d-8dc7-8828bb55c6b6" (UID: "08964a7d-6cae-4d8d-8dc7-8828bb55c6b6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.793104 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "08964a7d-6cae-4d8d-8dc7-8828bb55c6b6" (UID: "08964a7d-6cae-4d8d-8dc7-8828bb55c6b6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.818918 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "08964a7d-6cae-4d8d-8dc7-8828bb55c6b6" (UID: "08964a7d-6cae-4d8d-8dc7-8828bb55c6b6"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.831858 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.831884 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.832035 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.856090 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" (UID: "e179d9f8-0775-4bde-9ac9-f5a4f6919fd6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.875297 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-config" (OuterVolumeSpecName: "config") pod "08964a7d-6cae-4d8d-8dc7-8828bb55c6b6" (UID: "08964a7d-6cae-4d8d-8dc7-8828bb55c6b6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.939187 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.939229 4898 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:17 crc kubenswrapper[4898]: I0313 14:23:17.940559 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "08964a7d-6cae-4d8d-8dc7-8828bb55c6b6" (UID: "08964a7d-6cae-4d8d-8dc7-8828bb55c6b6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.001820 4898 scope.go:117] "RemoveContainer" containerID="e1d215889c17e15d0414497f1d056c6cbb6a9ad029cf99e79cd2f78d36843f9a" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.021052 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" (UID: "e179d9f8-0775-4bde-9ac9-f5a4f6919fd6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.031999 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-9b6c99f6d-7zgm5"] Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.046092 4898 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.046123 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.049090 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-9b6c99f6d-7zgm5"] Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.058072 4898 scope.go:117] "RemoveContainer" containerID="78de73e116b8c904046e3b2e47974a2196e68f801e439ea9db14e62605f5dfde" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.071000 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-759d64ffd4-kzp67"] Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.079834 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-759d64ffd4-kzp67"] Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.088091 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-config-data" (OuterVolumeSpecName: "config-data") pod "e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" (UID: "e179d9f8-0775-4bde-9ac9-f5a4f6919fd6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.124053 4898 scope.go:117] "RemoveContainer" containerID="e7eb04ad73d23fba07486739b6ad6f87229d6c54e9915ba0f9d02034ddc99c49" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.147881 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.155357 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-xntfr"] Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.165878 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-xntfr"] Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.175710 4898 scope.go:117] "RemoveContainer" containerID="4bb36715efc02b13f9a2428d735fb39dbbe5a666eac1302af323de3923739e47" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.266650 4898 scope.go:117] "RemoveContainer" containerID="12cbc4c8fa51ad14fc2097d92198ad03764ae0a7349dafd16f0ac9e66b1aa21a" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.298136 4898 scope.go:117] "RemoveContainer" containerID="1580c7298d78acec59a4cdc1605e81e7b25fff7c075f4562e2ffb1fcb708ba61" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.366196 4898 scope.go:117] "RemoveContainer" containerID="12cbc4c8fa51ad14fc2097d92198ad03764ae0a7349dafd16f0ac9e66b1aa21a" Mar 13 14:23:18 crc kubenswrapper[4898]: E0313 14:23:18.366753 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12cbc4c8fa51ad14fc2097d92198ad03764ae0a7349dafd16f0ac9e66b1aa21a\": container with ID starting with 12cbc4c8fa51ad14fc2097d92198ad03764ae0a7349dafd16f0ac9e66b1aa21a not found: ID does not exist" containerID="12cbc4c8fa51ad14fc2097d92198ad03764ae0a7349dafd16f0ac9e66b1aa21a" 
Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.366830 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12cbc4c8fa51ad14fc2097d92198ad03764ae0a7349dafd16f0ac9e66b1aa21a"} err="failed to get container status \"12cbc4c8fa51ad14fc2097d92198ad03764ae0a7349dafd16f0ac9e66b1aa21a\": rpc error: code = NotFound desc = could not find container \"12cbc4c8fa51ad14fc2097d92198ad03764ae0a7349dafd16f0ac9e66b1aa21a\": container with ID starting with 12cbc4c8fa51ad14fc2097d92198ad03764ae0a7349dafd16f0ac9e66b1aa21a not found: ID does not exist" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.366865 4898 scope.go:117] "RemoveContainer" containerID="1580c7298d78acec59a4cdc1605e81e7b25fff7c075f4562e2ffb1fcb708ba61" Mar 13 14:23:18 crc kubenswrapper[4898]: E0313 14:23:18.367546 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1580c7298d78acec59a4cdc1605e81e7b25fff7c075f4562e2ffb1fcb708ba61\": container with ID starting with 1580c7298d78acec59a4cdc1605e81e7b25fff7c075f4562e2ffb1fcb708ba61 not found: ID does not exist" containerID="1580c7298d78acec59a4cdc1605e81e7b25fff7c075f4562e2ffb1fcb708ba61" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.367575 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1580c7298d78acec59a4cdc1605e81e7b25fff7c075f4562e2ffb1fcb708ba61"} err="failed to get container status \"1580c7298d78acec59a4cdc1605e81e7b25fff7c075f4562e2ffb1fcb708ba61\": rpc error: code = NotFound desc = could not find container \"1580c7298d78acec59a4cdc1605e81e7b25fff7c075f4562e2ffb1fcb708ba61\": container with ID starting with 1580c7298d78acec59a4cdc1605e81e7b25fff7c075f4562e2ffb1fcb708ba61 not found: ID does not exist" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.447826 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:23:18 crc 
kubenswrapper[4898]: I0313 14:23:18.466585 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.485866 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:23:18 crc kubenswrapper[4898]: E0313 14:23:18.486474 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" containerName="ceilometer-central-agent" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.486490 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" containerName="ceilometer-central-agent" Mar 13 14:23:18 crc kubenswrapper[4898]: E0313 14:23:18.486512 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e53d1b61-e0c8-4c10-85bf-1c0f67009a24" containerName="heat-cfnapi" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.486519 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e53d1b61-e0c8-4c10-85bf-1c0f67009a24" containerName="heat-cfnapi" Mar 13 14:23:18 crc kubenswrapper[4898]: E0313 14:23:18.486535 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08964a7d-6cae-4d8d-8dc7-8828bb55c6b6" containerName="dnsmasq-dns" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.486540 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="08964a7d-6cae-4d8d-8dc7-8828bb55c6b6" containerName="dnsmasq-dns" Mar 13 14:23:18 crc kubenswrapper[4898]: E0313 14:23:18.486552 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08964a7d-6cae-4d8d-8dc7-8828bb55c6b6" containerName="init" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.486558 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="08964a7d-6cae-4d8d-8dc7-8828bb55c6b6" containerName="init" Mar 13 14:23:18 crc kubenswrapper[4898]: E0313 14:23:18.486566 4898 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" containerName="proxy-httpd" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.486573 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" containerName="proxy-httpd" Mar 13 14:23:18 crc kubenswrapper[4898]: E0313 14:23:18.486603 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" containerName="ceilometer-notification-agent" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.486609 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" containerName="ceilometer-notification-agent" Mar 13 14:23:18 crc kubenswrapper[4898]: E0313 14:23:18.486616 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad3d61d7-d777-4115-92c7-e4e3125c5260" containerName="heat-api" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.486622 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad3d61d7-d777-4115-92c7-e4e3125c5260" containerName="heat-api" Mar 13 14:23:18 crc kubenswrapper[4898]: E0313 14:23:18.486631 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" containerName="sg-core" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.486638 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" containerName="sg-core" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.486833 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" containerName="ceilometer-notification-agent" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.486856 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" containerName="proxy-httpd" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.486884 4898 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e53d1b61-e0c8-4c10-85bf-1c0f67009a24" containerName="heat-cfnapi" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.486916 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" containerName="ceilometer-central-agent" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.486929 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad3d61d7-d777-4115-92c7-e4e3125c5260" containerName="heat-api" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.486944 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="08964a7d-6cae-4d8d-8dc7-8828bb55c6b6" containerName="dnsmasq-dns" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.486967 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" containerName="sg-core" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.489285 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.493196 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.493451 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.501582 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.534499 4898 generic.go:334] "Generic (PLEG): container finished" podID="bff908e4-09f4-490b-9b9c-ef65c6224eeb" containerID="7e9f2307a91699c726a3f93d044663fb844450acd6da5dd38c51549451b97bc8" exitCode=0 Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.534606 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-llbn5" 
event={"ID":"bff908e4-09f4-490b-9b9c-ef65c6224eeb","Type":"ContainerDied","Data":"7e9f2307a91699c726a3f93d044663fb844450acd6da5dd38c51549451b97bc8"} Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.554295 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f31df8cc-85c8-4626-ab54-1a93d291f02d","Type":"ContainerStarted","Data":"ab8a9704929696234d2b66d5a8c2c31485aa1055e802133753ca009e70702f0a"} Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.554358 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f31df8cc-85c8-4626-ab54-1a93d291f02d","Type":"ContainerStarted","Data":"ebc50c2130d8c479a6a82fdc2a6a26b1d7c6228aebf42d75caeb8e90cffc4865"} Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.557584 4898 generic.go:334] "Generic (PLEG): container finished" podID="04183e35-79b0-4c76-b538-b5b71299cd92" containerID="74e75aa197a91664f89edd48f01ff5c813660e7743cf922315f4b6a18e19c506" exitCode=0 Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.558141 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-76lrv" event={"ID":"04183e35-79b0-4c76-b538-b5b71299cd92","Type":"ContainerDied","Data":"74e75aa197a91664f89edd48f01ff5c813660e7743cf922315f4b6a18e19c506"} Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.560640 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-log-httpd\") pod \"ceilometer-0\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " pod="openstack/ceilometer-0" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.560733 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pznzq\" (UniqueName: \"kubernetes.io/projected/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-kube-api-access-pznzq\") pod \"ceilometer-0\" 
(UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " pod="openstack/ceilometer-0" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.560777 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " pod="openstack/ceilometer-0" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.560927 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-run-httpd\") pod \"ceilometer-0\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " pod="openstack/ceilometer-0" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.560990 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-config-data\") pod \"ceilometer-0\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " pod="openstack/ceilometer-0" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.561050 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " pod="openstack/ceilometer-0" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.561115 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-scripts\") pod \"ceilometer-0\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " pod="openstack/ceilometer-0" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.578036 4898 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.578013359 podStartE2EDuration="3.578013359s" podCreationTimestamp="2026-03-13 14:23:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:23:18.57384107 +0000 UTC m=+1633.575429309" watchObservedRunningTime="2026-03-13 14:23:18.578013359 +0000 UTC m=+1633.579601588" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.664528 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-config-data\") pod \"ceilometer-0\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " pod="openstack/ceilometer-0" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.667199 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " pod="openstack/ceilometer-0" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.667355 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-scripts\") pod \"ceilometer-0\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " pod="openstack/ceilometer-0" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.667639 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-log-httpd\") pod \"ceilometer-0\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " pod="openstack/ceilometer-0" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.667714 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pznzq\" (UniqueName: \"kubernetes.io/projected/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-kube-api-access-pznzq\") pod \"ceilometer-0\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " pod="openstack/ceilometer-0" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.667765 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " pod="openstack/ceilometer-0" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.668010 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-run-httpd\") pod \"ceilometer-0\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " pod="openstack/ceilometer-0" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.668705 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-run-httpd\") pod \"ceilometer-0\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " pod="openstack/ceilometer-0" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.671862 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-log-httpd\") pod \"ceilometer-0\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " pod="openstack/ceilometer-0" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.672991 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-config-data\") pod \"ceilometer-0\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " 
pod="openstack/ceilometer-0" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.674282 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-scripts\") pod \"ceilometer-0\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " pod="openstack/ceilometer-0" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.675170 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " pod="openstack/ceilometer-0" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.688478 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " pod="openstack/ceilometer-0" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.688758 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pznzq\" (UniqueName: \"kubernetes.io/projected/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-kube-api-access-pznzq\") pod \"ceilometer-0\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " pod="openstack/ceilometer-0" Mar 13 14:23:18 crc kubenswrapper[4898]: I0313 14:23:18.808723 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:23:19 crc kubenswrapper[4898]: I0313 14:23:19.424707 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:23:19 crc kubenswrapper[4898]: I0313 14:23:19.568609 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5","Type":"ContainerStarted","Data":"1e1a0e67a834a696d965eab6f5a0ade85145365bbc3c3bccceb0280541a0e5ed"} Mar 13 14:23:19 crc kubenswrapper[4898]: I0313 14:23:19.761555 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08964a7d-6cae-4d8d-8dc7-8828bb55c6b6" path="/var/lib/kubelet/pods/08964a7d-6cae-4d8d-8dc7-8828bb55c6b6/volumes" Mar 13 14:23:19 crc kubenswrapper[4898]: I0313 14:23:19.762611 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad3d61d7-d777-4115-92c7-e4e3125c5260" path="/var/lib/kubelet/pods/ad3d61d7-d777-4115-92c7-e4e3125c5260/volumes" Mar 13 14:23:19 crc kubenswrapper[4898]: I0313 14:23:19.763254 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e179d9f8-0775-4bde-9ac9-f5a4f6919fd6" path="/var/lib/kubelet/pods/e179d9f8-0775-4bde-9ac9-f5a4f6919fd6/volumes" Mar 13 14:23:19 crc kubenswrapper[4898]: I0313 14:23:19.764526 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e53d1b61-e0c8-4c10-85bf-1c0f67009a24" path="/var/lib/kubelet/pods/e53d1b61-e0c8-4c10-85bf-1c0f67009a24/volumes" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.159161 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-llbn5" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.239399 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bff908e4-09f4-490b-9b9c-ef65c6224eeb-combined-ca-bundle\") pod \"bff908e4-09f4-490b-9b9c-ef65c6224eeb\" (UID: \"bff908e4-09f4-490b-9b9c-ef65c6224eeb\") " Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.239663 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfnmd\" (UniqueName: \"kubernetes.io/projected/bff908e4-09f4-490b-9b9c-ef65c6224eeb-kube-api-access-nfnmd\") pod \"bff908e4-09f4-490b-9b9c-ef65c6224eeb\" (UID: \"bff908e4-09f4-490b-9b9c-ef65c6224eeb\") " Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.239859 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bff908e4-09f4-490b-9b9c-ef65c6224eeb-scripts\") pod \"bff908e4-09f4-490b-9b9c-ef65c6224eeb\" (UID: \"bff908e4-09f4-490b-9b9c-ef65c6224eeb\") " Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.240078 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bff908e4-09f4-490b-9b9c-ef65c6224eeb-config-data\") pod \"bff908e4-09f4-490b-9b9c-ef65c6224eeb\" (UID: \"bff908e4-09f4-490b-9b9c-ef65c6224eeb\") " Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.269277 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bff908e4-09f4-490b-9b9c-ef65c6224eeb-kube-api-access-nfnmd" (OuterVolumeSpecName: "kube-api-access-nfnmd") pod "bff908e4-09f4-490b-9b9c-ef65c6224eeb" (UID: "bff908e4-09f4-490b-9b9c-ef65c6224eeb"). InnerVolumeSpecName "kube-api-access-nfnmd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.277362 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bff908e4-09f4-490b-9b9c-ef65c6224eeb-scripts" (OuterVolumeSpecName: "scripts") pod "bff908e4-09f4-490b-9b9c-ef65c6224eeb" (UID: "bff908e4-09f4-490b-9b9c-ef65c6224eeb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.343412 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfnmd\" (UniqueName: \"kubernetes.io/projected/bff908e4-09f4-490b-9b9c-ef65c6224eeb-kube-api-access-nfnmd\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.343624 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bff908e4-09f4-490b-9b9c-ef65c6224eeb-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.395457 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bff908e4-09f4-490b-9b9c-ef65c6224eeb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bff908e4-09f4-490b-9b9c-ef65c6224eeb" (UID: "bff908e4-09f4-490b-9b9c-ef65c6224eeb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.396219 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bff908e4-09f4-490b-9b9c-ef65c6224eeb-config-data" (OuterVolumeSpecName: "config-data") pod "bff908e4-09f4-490b-9b9c-ef65c6224eeb" (UID: "bff908e4-09f4-490b-9b9c-ef65c6224eeb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.445765 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bff908e4-09f4-490b-9b9c-ef65c6224eeb-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.446778 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bff908e4-09f4-490b-9b9c-ef65c6224eeb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.517129 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-76lrv" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.549275 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92r4m\" (UniqueName: \"kubernetes.io/projected/04183e35-79b0-4c76-b538-b5b71299cd92-kube-api-access-92r4m\") pod \"04183e35-79b0-4c76-b538-b5b71299cd92\" (UID: \"04183e35-79b0-4c76-b538-b5b71299cd92\") " Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.549536 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04183e35-79b0-4c76-b538-b5b71299cd92-config-data\") pod \"04183e35-79b0-4c76-b538-b5b71299cd92\" (UID: \"04183e35-79b0-4c76-b538-b5b71299cd92\") " Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.549712 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04183e35-79b0-4c76-b538-b5b71299cd92-combined-ca-bundle\") pod \"04183e35-79b0-4c76-b538-b5b71299cd92\" (UID: \"04183e35-79b0-4c76-b538-b5b71299cd92\") " Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.549967 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/04183e35-79b0-4c76-b538-b5b71299cd92-scripts\") pod \"04183e35-79b0-4c76-b538-b5b71299cd92\" (UID: \"04183e35-79b0-4c76-b538-b5b71299cd92\") " Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.554492 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04183e35-79b0-4c76-b538-b5b71299cd92-scripts" (OuterVolumeSpecName: "scripts") pod "04183e35-79b0-4c76-b538-b5b71299cd92" (UID: "04183e35-79b0-4c76-b538-b5b71299cd92"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.558190 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04183e35-79b0-4c76-b538-b5b71299cd92-kube-api-access-92r4m" (OuterVolumeSpecName: "kube-api-access-92r4m") pod "04183e35-79b0-4c76-b538-b5b71299cd92" (UID: "04183e35-79b0-4c76-b538-b5b71299cd92"). InnerVolumeSpecName "kube-api-access-92r4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.614082 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04183e35-79b0-4c76-b538-b5b71299cd92-config-data" (OuterVolumeSpecName: "config-data") pod "04183e35-79b0-4c76-b538-b5b71299cd92" (UID: "04183e35-79b0-4c76-b538-b5b71299cd92"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.614673 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04183e35-79b0-4c76-b538-b5b71299cd92-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04183e35-79b0-4c76-b538-b5b71299cd92" (UID: "04183e35-79b0-4c76-b538-b5b71299cd92"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.617724 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-76lrv" event={"ID":"04183e35-79b0-4c76-b538-b5b71299cd92","Type":"ContainerDied","Data":"8a577eea2ae5147af3f287388f18da3286a949f40aa4ca2fbd11e7f3d483f618"} Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.617960 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a577eea2ae5147af3f287388f18da3286a949f40aa4ca2fbd11e7f3d483f618" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.618134 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-76lrv" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.632970 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-llbn5" event={"ID":"bff908e4-09f4-490b-9b9c-ef65c6224eeb","Type":"ContainerDied","Data":"bf25202e858a5aee252a5d091c6efe401cd4b61d92f156b22eb7c6a08833446f"} Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.633043 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf25202e858a5aee252a5d091c6efe401cd4b61d92f156b22eb7c6a08833446f" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.633153 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-llbn5" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.644205 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 13 14:23:20 crc kubenswrapper[4898]: E0313 14:23:20.644948 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04183e35-79b0-4c76-b538-b5b71299cd92" containerName="nova-manage" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.645039 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="04183e35-79b0-4c76-b538-b5b71299cd92" containerName="nova-manage" Mar 13 14:23:20 crc kubenswrapper[4898]: E0313 14:23:20.645177 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bff908e4-09f4-490b-9b9c-ef65c6224eeb" containerName="nova-cell1-conductor-db-sync" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.645253 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="bff908e4-09f4-490b-9b9c-ef65c6224eeb" containerName="nova-cell1-conductor-db-sync" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.645558 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="04183e35-79b0-4c76-b538-b5b71299cd92" containerName="nova-manage" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.645640 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="bff908e4-09f4-490b-9b9c-ef65c6224eeb" containerName="nova-cell1-conductor-db-sync" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.646545 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.647236 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5","Type":"ContainerStarted","Data":"a52bb77ac792d2e28bb8d969def76aa20c8d38d8dddae8c183a41016f4efeb7f"} Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.654501 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50cbae0e-4bf9-41b0-8c87-b551f782aecf-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"50cbae0e-4bf9-41b0-8c87-b551f782aecf\") " pod="openstack/nova-cell1-conductor-0" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.654569 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7ppd\" (UniqueName: \"kubernetes.io/projected/50cbae0e-4bf9-41b0-8c87-b551f782aecf-kube-api-access-q7ppd\") pod \"nova-cell1-conductor-0\" (UID: \"50cbae0e-4bf9-41b0-8c87-b551f782aecf\") " pod="openstack/nova-cell1-conductor-0" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.654592 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50cbae0e-4bf9-41b0-8c87-b551f782aecf-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"50cbae0e-4bf9-41b0-8c87-b551f782aecf\") " pod="openstack/nova-cell1-conductor-0" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.654960 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04183e35-79b0-4c76-b538-b5b71299cd92-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.655041 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92r4m\" (UniqueName: 
\"kubernetes.io/projected/04183e35-79b0-4c76-b538-b5b71299cd92-kube-api-access-92r4m\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.655131 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04183e35-79b0-4c76-b538-b5b71299cd92-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.655212 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04183e35-79b0-4c76-b538-b5b71299cd92-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.696005 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.760820 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50cbae0e-4bf9-41b0-8c87-b551f782aecf-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"50cbae0e-4bf9-41b0-8c87-b551f782aecf\") " pod="openstack/nova-cell1-conductor-0" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.761180 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7ppd\" (UniqueName: \"kubernetes.io/projected/50cbae0e-4bf9-41b0-8c87-b551f782aecf-kube-api-access-q7ppd\") pod \"nova-cell1-conductor-0\" (UID: \"50cbae0e-4bf9-41b0-8c87-b551f782aecf\") " pod="openstack/nova-cell1-conductor-0" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.762005 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50cbae0e-4bf9-41b0-8c87-b551f782aecf-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"50cbae0e-4bf9-41b0-8c87-b551f782aecf\") " pod="openstack/nova-cell1-conductor-0" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 
14:23:20.766569 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50cbae0e-4bf9-41b0-8c87-b551f782aecf-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"50cbae0e-4bf9-41b0-8c87-b551f782aecf\") " pod="openstack/nova-cell1-conductor-0" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.767709 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50cbae0e-4bf9-41b0-8c87-b551f782aecf-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"50cbae0e-4bf9-41b0-8c87-b551f782aecf\") " pod="openstack/nova-cell1-conductor-0" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.777097 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" Mar 13 14:23:20 crc kubenswrapper[4898]: E0313 14:23:20.777757 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.800293 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7ppd\" (UniqueName: \"kubernetes.io/projected/50cbae0e-4bf9-41b0-8c87-b551f782aecf-kube-api-access-q7ppd\") pod \"nova-cell1-conductor-0\" (UID: \"50cbae0e-4bf9-41b0-8c87-b551f782aecf\") " pod="openstack/nova-cell1-conductor-0" Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.862032 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.862474 4898 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/nova-api-0" podUID="401e6738-93d7-40d4-867e-8c68437cbad3" containerName="nova-api-log" containerID="cri-o://4172774e3d35f2447bfbdab8c2932d0b6b6936e1f5596cbc2f41a9105e3cedbe" gracePeriod=30 Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.862973 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="401e6738-93d7-40d4-867e-8c68437cbad3" containerName="nova-api-api" containerID="cri-o://6a3356dec1914d7176270b5cf94ad8c6d3be50dc70db5f963643ba9ceaa22838" gracePeriod=30 Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.882179 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.882370 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="0769b03d-29b4-4519-abc7-408431328276" containerName="nova-scheduler-scheduler" containerID="cri-o://a0be402bfe00c68e23ab47a73d2d201566aad9d451ecaff23e8fc2d99923064b" gracePeriod=30 Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.903226 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.903415 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f31df8cc-85c8-4626-ab54-1a93d291f02d" containerName="nova-metadata-log" containerID="cri-o://ebc50c2130d8c479a6a82fdc2a6a26b1d7c6228aebf42d75caeb8e90cffc4865" gracePeriod=30 Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 14:23:20.903978 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f31df8cc-85c8-4626-ab54-1a93d291f02d" containerName="nova-metadata-metadata" containerID="cri-o://ab8a9704929696234d2b66d5a8c2c31485aa1055e802133753ca009e70702f0a" gracePeriod=30 Mar 13 14:23:20 crc kubenswrapper[4898]: I0313 
14:23:20.948825 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 13 14:23:21 crc kubenswrapper[4898]: W0313 14:23:21.513832 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50cbae0e_4bf9_41b0_8c87_b551f782aecf.slice/crio-cdecfdf75d36df2ca849b21d053f771e4d9b2a154e706d876c8e32ef8ad578be WatchSource:0}: Error finding container cdecfdf75d36df2ca849b21d053f771e4d9b2a154e706d876c8e32ef8ad578be: Status 404 returned error can't find the container with id cdecfdf75d36df2ca849b21d053f771e4d9b2a154e706d876c8e32ef8ad578be Mar 13 14:23:21 crc kubenswrapper[4898]: I0313 14:23:21.514798 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 13 14:23:21 crc kubenswrapper[4898]: I0313 14:23:21.664833 4898 generic.go:334] "Generic (PLEG): container finished" podID="401e6738-93d7-40d4-867e-8c68437cbad3" containerID="4172774e3d35f2447bfbdab8c2932d0b6b6936e1f5596cbc2f41a9105e3cedbe" exitCode=143 Mar 13 14:23:21 crc kubenswrapper[4898]: I0313 14:23:21.664912 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"401e6738-93d7-40d4-867e-8c68437cbad3","Type":"ContainerDied","Data":"4172774e3d35f2447bfbdab8c2932d0b6b6936e1f5596cbc2f41a9105e3cedbe"} Mar 13 14:23:21 crc kubenswrapper[4898]: I0313 14:23:21.666483 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5","Type":"ContainerStarted","Data":"74f0d5a974f39e0ea6ad714deeedb29ad931cd8927cfe42cce60108477a37a86"} Mar 13 14:23:21 crc kubenswrapper[4898]: I0313 14:23:21.668416 4898 generic.go:334] "Generic (PLEG): container finished" podID="f31df8cc-85c8-4626-ab54-1a93d291f02d" containerID="ab8a9704929696234d2b66d5a8c2c31485aa1055e802133753ca009e70702f0a" exitCode=0 Mar 13 14:23:21 crc kubenswrapper[4898]: I0313 14:23:21.668457 
4898 generic.go:334] "Generic (PLEG): container finished" podID="f31df8cc-85c8-4626-ab54-1a93d291f02d" containerID="ebc50c2130d8c479a6a82fdc2a6a26b1d7c6228aebf42d75caeb8e90cffc4865" exitCode=143 Mar 13 14:23:21 crc kubenswrapper[4898]: I0313 14:23:21.668492 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f31df8cc-85c8-4626-ab54-1a93d291f02d","Type":"ContainerDied","Data":"ab8a9704929696234d2b66d5a8c2c31485aa1055e802133753ca009e70702f0a"} Mar 13 14:23:21 crc kubenswrapper[4898]: I0313 14:23:21.668512 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f31df8cc-85c8-4626-ab54-1a93d291f02d","Type":"ContainerDied","Data":"ebc50c2130d8c479a6a82fdc2a6a26b1d7c6228aebf42d75caeb8e90cffc4865"} Mar 13 14:23:21 crc kubenswrapper[4898]: I0313 14:23:21.677868 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"50cbae0e-4bf9-41b0-8c87-b551f782aecf","Type":"ContainerStarted","Data":"cdecfdf75d36df2ca849b21d053f771e4d9b2a154e706d876c8e32ef8ad578be"} Mar 13 14:23:21 crc kubenswrapper[4898]: I0313 14:23:21.954243 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.005056 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f31df8cc-85c8-4626-ab54-1a93d291f02d-combined-ca-bundle\") pod \"f31df8cc-85c8-4626-ab54-1a93d291f02d\" (UID: \"f31df8cc-85c8-4626-ab54-1a93d291f02d\") " Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.005165 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f31df8cc-85c8-4626-ab54-1a93d291f02d-logs\") pod \"f31df8cc-85c8-4626-ab54-1a93d291f02d\" (UID: \"f31df8cc-85c8-4626-ab54-1a93d291f02d\") " Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.005359 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f31df8cc-85c8-4626-ab54-1a93d291f02d-nova-metadata-tls-certs\") pod \"f31df8cc-85c8-4626-ab54-1a93d291f02d\" (UID: \"f31df8cc-85c8-4626-ab54-1a93d291f02d\") " Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.005423 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f31df8cc-85c8-4626-ab54-1a93d291f02d-config-data\") pod \"f31df8cc-85c8-4626-ab54-1a93d291f02d\" (UID: \"f31df8cc-85c8-4626-ab54-1a93d291f02d\") " Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.005499 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvb8h\" (UniqueName: \"kubernetes.io/projected/f31df8cc-85c8-4626-ab54-1a93d291f02d-kube-api-access-pvb8h\") pod \"f31df8cc-85c8-4626-ab54-1a93d291f02d\" (UID: \"f31df8cc-85c8-4626-ab54-1a93d291f02d\") " Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.005610 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/f31df8cc-85c8-4626-ab54-1a93d291f02d-logs" (OuterVolumeSpecName: "logs") pod "f31df8cc-85c8-4626-ab54-1a93d291f02d" (UID: "f31df8cc-85c8-4626-ab54-1a93d291f02d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.012581 4898 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f31df8cc-85c8-4626-ab54-1a93d291f02d-logs\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.017273 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f31df8cc-85c8-4626-ab54-1a93d291f02d-kube-api-access-pvb8h" (OuterVolumeSpecName: "kube-api-access-pvb8h") pod "f31df8cc-85c8-4626-ab54-1a93d291f02d" (UID: "f31df8cc-85c8-4626-ab54-1a93d291f02d"). InnerVolumeSpecName "kube-api-access-pvb8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.079273 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f31df8cc-85c8-4626-ab54-1a93d291f02d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f31df8cc-85c8-4626-ab54-1a93d291f02d" (UID: "f31df8cc-85c8-4626-ab54-1a93d291f02d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.093501 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f31df8cc-85c8-4626-ab54-1a93d291f02d-config-data" (OuterVolumeSpecName: "config-data") pod "f31df8cc-85c8-4626-ab54-1a93d291f02d" (UID: "f31df8cc-85c8-4626-ab54-1a93d291f02d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.102010 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f31df8cc-85c8-4626-ab54-1a93d291f02d-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "f31df8cc-85c8-4626-ab54-1a93d291f02d" (UID: "f31df8cc-85c8-4626-ab54-1a93d291f02d"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.115145 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f31df8cc-85c8-4626-ab54-1a93d291f02d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.115191 4898 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f31df8cc-85c8-4626-ab54-1a93d291f02d-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.115204 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f31df8cc-85c8-4626-ab54-1a93d291f02d-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.115212 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvb8h\" (UniqueName: \"kubernetes.io/projected/f31df8cc-85c8-4626-ab54-1a93d291f02d-kube-api-access-pvb8h\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.595328 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.692451 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"f31df8cc-85c8-4626-ab54-1a93d291f02d","Type":"ContainerDied","Data":"113e1d3b19705c253801494357ca106e72aa1f2de77f8992627258135742aa53"} Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.692512 4898 scope.go:117] "RemoveContainer" containerID="ab8a9704929696234d2b66d5a8c2c31485aa1055e802133753ca009e70702f0a" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.692515 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.700356 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"50cbae0e-4bf9-41b0-8c87-b551f782aecf","Type":"ContainerStarted","Data":"1d14da9968bc6e97bcfbe55c9caf15ce609e48be0d9dbf281507a3e38bdfb77b"} Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.703855 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5","Type":"ContainerStarted","Data":"3bed3d1e809327f9581d0f282bdba9c8f4826f7960e98f95fb41d7bf1c602dc9"} Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.724325 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.724298754 podStartE2EDuration="2.724298754s" podCreationTimestamp="2026-03-13 14:23:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:23:22.714270184 +0000 UTC m=+1637.715858423" watchObservedRunningTime="2026-03-13 14:23:22.724298754 +0000 UTC m=+1637.725886993" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.751238 4898 scope.go:117] "RemoveContainer" containerID="ebc50c2130d8c479a6a82fdc2a6a26b1d7c6228aebf42d75caeb8e90cffc4865" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.774811 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-metadata-0"] Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.800682 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.816114 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 13 14:23:22 crc kubenswrapper[4898]: E0313 14:23:22.817085 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f31df8cc-85c8-4626-ab54-1a93d291f02d" containerName="nova-metadata-log" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.817105 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f31df8cc-85c8-4626-ab54-1a93d291f02d" containerName="nova-metadata-log" Mar 13 14:23:22 crc kubenswrapper[4898]: E0313 14:23:22.817128 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f31df8cc-85c8-4626-ab54-1a93d291f02d" containerName="nova-metadata-metadata" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.817138 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f31df8cc-85c8-4626-ab54-1a93d291f02d" containerName="nova-metadata-metadata" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.817498 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f31df8cc-85c8-4626-ab54-1a93d291f02d" containerName="nova-metadata-metadata" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.817528 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f31df8cc-85c8-4626-ab54-1a93d291f02d" containerName="nova-metadata-log" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.822319 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.825840 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.826214 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.841881 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.935996 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8db736b-00b7-4251-a667-3b2138c6c928-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a8db736b-00b7-4251-a667-3b2138c6c928\") " pod="openstack/nova-metadata-0" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.936677 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8db736b-00b7-4251-a667-3b2138c6c928-logs\") pod \"nova-metadata-0\" (UID: \"a8db736b-00b7-4251-a667-3b2138c6c928\") " pod="openstack/nova-metadata-0" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.936818 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8db736b-00b7-4251-a667-3b2138c6c928-config-data\") pod \"nova-metadata-0\" (UID: \"a8db736b-00b7-4251-a667-3b2138c6c928\") " pod="openstack/nova-metadata-0" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.936875 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg4w5\" (UniqueName: \"kubernetes.io/projected/a8db736b-00b7-4251-a667-3b2138c6c928-kube-api-access-tg4w5\") pod \"nova-metadata-0\" (UID: 
\"a8db736b-00b7-4251-a667-3b2138c6c928\") " pod="openstack/nova-metadata-0" Mar 13 14:23:22 crc kubenswrapper[4898]: I0313 14:23:22.936920 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8db736b-00b7-4251-a667-3b2138c6c928-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a8db736b-00b7-4251-a667-3b2138c6c928\") " pod="openstack/nova-metadata-0" Mar 13 14:23:23 crc kubenswrapper[4898]: I0313 14:23:23.039757 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tg4w5\" (UniqueName: \"kubernetes.io/projected/a8db736b-00b7-4251-a667-3b2138c6c928-kube-api-access-tg4w5\") pod \"nova-metadata-0\" (UID: \"a8db736b-00b7-4251-a667-3b2138c6c928\") " pod="openstack/nova-metadata-0" Mar 13 14:23:23 crc kubenswrapper[4898]: I0313 14:23:23.039804 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8db736b-00b7-4251-a667-3b2138c6c928-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a8db736b-00b7-4251-a667-3b2138c6c928\") " pod="openstack/nova-metadata-0" Mar 13 14:23:23 crc kubenswrapper[4898]: I0313 14:23:23.039993 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8db736b-00b7-4251-a667-3b2138c6c928-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a8db736b-00b7-4251-a667-3b2138c6c928\") " pod="openstack/nova-metadata-0" Mar 13 14:23:23 crc kubenswrapper[4898]: I0313 14:23:23.040041 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8db736b-00b7-4251-a667-3b2138c6c928-logs\") pod \"nova-metadata-0\" (UID: \"a8db736b-00b7-4251-a667-3b2138c6c928\") " pod="openstack/nova-metadata-0" Mar 13 14:23:23 crc kubenswrapper[4898]: I0313 
14:23:23.040074 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8db736b-00b7-4251-a667-3b2138c6c928-config-data\") pod \"nova-metadata-0\" (UID: \"a8db736b-00b7-4251-a667-3b2138c6c928\") " pod="openstack/nova-metadata-0" Mar 13 14:23:23 crc kubenswrapper[4898]: I0313 14:23:23.040944 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8db736b-00b7-4251-a667-3b2138c6c928-logs\") pod \"nova-metadata-0\" (UID: \"a8db736b-00b7-4251-a667-3b2138c6c928\") " pod="openstack/nova-metadata-0" Mar 13 14:23:23 crc kubenswrapper[4898]: I0313 14:23:23.060104 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8db736b-00b7-4251-a667-3b2138c6c928-config-data\") pod \"nova-metadata-0\" (UID: \"a8db736b-00b7-4251-a667-3b2138c6c928\") " pod="openstack/nova-metadata-0" Mar 13 14:23:23 crc kubenswrapper[4898]: I0313 14:23:23.060160 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8db736b-00b7-4251-a667-3b2138c6c928-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a8db736b-00b7-4251-a667-3b2138c6c928\") " pod="openstack/nova-metadata-0" Mar 13 14:23:23 crc kubenswrapper[4898]: I0313 14:23:23.060427 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8db736b-00b7-4251-a667-3b2138c6c928-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a8db736b-00b7-4251-a667-3b2138c6c928\") " pod="openstack/nova-metadata-0" Mar 13 14:23:23 crc kubenswrapper[4898]: I0313 14:23:23.062455 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg4w5\" (UniqueName: \"kubernetes.io/projected/a8db736b-00b7-4251-a667-3b2138c6c928-kube-api-access-tg4w5\") pod 
\"nova-metadata-0\" (UID: \"a8db736b-00b7-4251-a667-3b2138c6c928\") " pod="openstack/nova-metadata-0" Mar 13 14:23:23 crc kubenswrapper[4898]: I0313 14:23:23.139999 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 14:23:23 crc kubenswrapper[4898]: I0313 14:23:23.682310 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 14:23:23 crc kubenswrapper[4898]: W0313 14:23:23.685207 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8db736b_00b7_4251_a667_3b2138c6c928.slice/crio-2f2f835653ef4ac86c3f5f419dc3f2bdfd6ae25a7d99a1468342cb1b296536ca WatchSource:0}: Error finding container 2f2f835653ef4ac86c3f5f419dc3f2bdfd6ae25a7d99a1468342cb1b296536ca: Status 404 returned error can't find the container with id 2f2f835653ef4ac86c3f5f419dc3f2bdfd6ae25a7d99a1468342cb1b296536ca Mar 13 14:23:23 crc kubenswrapper[4898]: I0313 14:23:23.720408 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a8db736b-00b7-4251-a667-3b2138c6c928","Type":"ContainerStarted","Data":"2f2f835653ef4ac86c3f5f419dc3f2bdfd6ae25a7d99a1468342cb1b296536ca"} Mar 13 14:23:23 crc kubenswrapper[4898]: I0313 14:23:23.722021 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 13 14:23:23 crc kubenswrapper[4898]: I0313 14:23:23.758626 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f31df8cc-85c8-4626-ab54-1a93d291f02d" path="/var/lib/kubelet/pods/f31df8cc-85c8-4626-ab54-1a93d291f02d/volumes" Mar 13 14:23:24 crc kubenswrapper[4898]: I0313 14:23:24.025365 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 14:23:24 crc kubenswrapper[4898]: I0313 14:23:24.025414 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-api-0" Mar 13 14:23:24 crc kubenswrapper[4898]: I0313 14:23:24.745329 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a8db736b-00b7-4251-a667-3b2138c6c928","Type":"ContainerStarted","Data":"9bd9f3f02e15571b11f72778527812906d168be88196cc4314aa88f5c276ac6c"} Mar 13 14:23:24 crc kubenswrapper[4898]: I0313 14:23:24.745877 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a8db736b-00b7-4251-a667-3b2138c6c928","Type":"ContainerStarted","Data":"c0fcc6916c9c7951ac6f57b54e64b861fe8be03a65443f0a0008c4f458405d78"} Mar 13 14:23:24 crc kubenswrapper[4898]: I0313 14:23:24.751565 4898 generic.go:334] "Generic (PLEG): container finished" podID="401e6738-93d7-40d4-867e-8c68437cbad3" containerID="6a3356dec1914d7176270b5cf94ad8c6d3be50dc70db5f963643ba9ceaa22838" exitCode=0 Mar 13 14:23:24 crc kubenswrapper[4898]: I0313 14:23:24.751644 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"401e6738-93d7-40d4-867e-8c68437cbad3","Type":"ContainerDied","Data":"6a3356dec1914d7176270b5cf94ad8c6d3be50dc70db5f963643ba9ceaa22838"} Mar 13 14:23:24 crc kubenswrapper[4898]: I0313 14:23:24.755865 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" containerName="ceilometer-central-agent" containerID="cri-o://a52bb77ac792d2e28bb8d969def76aa20c8d38d8dddae8c183a41016f4efeb7f" gracePeriod=30 Mar 13 14:23:24 crc kubenswrapper[4898]: I0313 14:23:24.756040 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" containerName="proxy-httpd" containerID="cri-o://21fd8f0e04d5275c81dea1442286d2ffac0118f2bf21754e17a48e3332d618e6" gracePeriod=30 Mar 13 14:23:24 crc kubenswrapper[4898]: I0313 14:23:24.756098 4898 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/ceilometer-0" podUID="f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" containerName="sg-core" containerID="cri-o://3bed3d1e809327f9581d0f282bdba9c8f4826f7960e98f95fb41d7bf1c602dc9" gracePeriod=30 Mar 13 14:23:24 crc kubenswrapper[4898]: I0313 14:23:24.756146 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" containerName="ceilometer-notification-agent" containerID="cri-o://74f0d5a974f39e0ea6ad714deeedb29ad931cd8927cfe42cce60108477a37a86" gracePeriod=30 Mar 13 14:23:24 crc kubenswrapper[4898]: I0313 14:23:24.756938 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5","Type":"ContainerStarted","Data":"21fd8f0e04d5275c81dea1442286d2ffac0118f2bf21754e17a48e3332d618e6"} Mar 13 14:23:24 crc kubenswrapper[4898]: I0313 14:23:24.756987 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 14:23:24 crc kubenswrapper[4898]: I0313 14:23:24.822398 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.82237305 podStartE2EDuration="2.82237305s" podCreationTimestamp="2026-03-13 14:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:23:24.781381816 +0000 UTC m=+1639.782970065" watchObservedRunningTime="2026-03-13 14:23:24.82237305 +0000 UTC m=+1639.823961289" Mar 13 14:23:24 crc kubenswrapper[4898]: I0313 14:23:24.850008 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.75603343 podStartE2EDuration="6.849988827s" podCreationTimestamp="2026-03-13 14:23:18 +0000 UTC" firstStartedPulling="2026-03-13 14:23:19.422283516 +0000 UTC m=+1634.423871755" lastFinishedPulling="2026-03-13 
14:23:23.516238913 +0000 UTC m=+1638.517827152" observedRunningTime="2026-03-13 14:23:24.801702803 +0000 UTC m=+1639.803291042" watchObservedRunningTime="2026-03-13 14:23:24.849988827 +0000 UTC m=+1639.851577056" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.007986 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.133540 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/401e6738-93d7-40d4-867e-8c68437cbad3-combined-ca-bundle\") pod \"401e6738-93d7-40d4-867e-8c68437cbad3\" (UID: \"401e6738-93d7-40d4-867e-8c68437cbad3\") " Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.133602 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/401e6738-93d7-40d4-867e-8c68437cbad3-logs\") pod \"401e6738-93d7-40d4-867e-8c68437cbad3\" (UID: \"401e6738-93d7-40d4-867e-8c68437cbad3\") " Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.133674 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2z75l\" (UniqueName: \"kubernetes.io/projected/401e6738-93d7-40d4-867e-8c68437cbad3-kube-api-access-2z75l\") pod \"401e6738-93d7-40d4-867e-8c68437cbad3\" (UID: \"401e6738-93d7-40d4-867e-8c68437cbad3\") " Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.133806 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/401e6738-93d7-40d4-867e-8c68437cbad3-config-data\") pod \"401e6738-93d7-40d4-867e-8c68437cbad3\" (UID: \"401e6738-93d7-40d4-867e-8c68437cbad3\") " Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.134612 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/401e6738-93d7-40d4-867e-8c68437cbad3-logs" 
(OuterVolumeSpecName: "logs") pod "401e6738-93d7-40d4-867e-8c68437cbad3" (UID: "401e6738-93d7-40d4-867e-8c68437cbad3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.148118 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/401e6738-93d7-40d4-867e-8c68437cbad3-kube-api-access-2z75l" (OuterVolumeSpecName: "kube-api-access-2z75l") pod "401e6738-93d7-40d4-867e-8c68437cbad3" (UID: "401e6738-93d7-40d4-867e-8c68437cbad3"). InnerVolumeSpecName "kube-api-access-2z75l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.168111 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/401e6738-93d7-40d4-867e-8c68437cbad3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "401e6738-93d7-40d4-867e-8c68437cbad3" (UID: "401e6738-93d7-40d4-867e-8c68437cbad3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.191940 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/401e6738-93d7-40d4-867e-8c68437cbad3-config-data" (OuterVolumeSpecName: "config-data") pod "401e6738-93d7-40d4-867e-8c68437cbad3" (UID: "401e6738-93d7-40d4-867e-8c68437cbad3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.238564 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/401e6738-93d7-40d4-867e-8c68437cbad3-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.238601 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/401e6738-93d7-40d4-867e-8c68437cbad3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.238614 4898 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/401e6738-93d7-40d4-867e-8c68437cbad3-logs\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.238623 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2z75l\" (UniqueName: \"kubernetes.io/projected/401e6738-93d7-40d4-867e-8c68437cbad3-kube-api-access-2z75l\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.775044 4898 generic.go:334] "Generic (PLEG): container finished" podID="f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" containerID="21fd8f0e04d5275c81dea1442286d2ffac0118f2bf21754e17a48e3332d618e6" exitCode=0 Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.775083 4898 generic.go:334] "Generic (PLEG): container finished" podID="f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" containerID="3bed3d1e809327f9581d0f282bdba9c8f4826f7960e98f95fb41d7bf1c602dc9" exitCode=2 Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.775092 4898 generic.go:334] "Generic (PLEG): container finished" podID="f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" containerID="74f0d5a974f39e0ea6ad714deeedb29ad931cd8927cfe42cce60108477a37a86" exitCode=0 Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.775144 4898 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ceilometer-0" event={"ID":"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5","Type":"ContainerDied","Data":"21fd8f0e04d5275c81dea1442286d2ffac0118f2bf21754e17a48e3332d618e6"} Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.775181 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5","Type":"ContainerDied","Data":"3bed3d1e809327f9581d0f282bdba9c8f4826f7960e98f95fb41d7bf1c602dc9"} Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.775193 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5","Type":"ContainerDied","Data":"74f0d5a974f39e0ea6ad714deeedb29ad931cd8927cfe42cce60108477a37a86"} Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.780042 4898 generic.go:334] "Generic (PLEG): container finished" podID="0769b03d-29b4-4519-abc7-408431328276" containerID="a0be402bfe00c68e23ab47a73d2d201566aad9d451ecaff23e8fc2d99923064b" exitCode=0 Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.780115 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0769b03d-29b4-4519-abc7-408431328276","Type":"ContainerDied","Data":"a0be402bfe00c68e23ab47a73d2d201566aad9d451ecaff23e8fc2d99923064b"} Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.780683 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0769b03d-29b4-4519-abc7-408431328276","Type":"ContainerDied","Data":"8afa6be1221f1010388ebd2f7d552a490a868c166c3d8a6eee8fcc24b9095e1d"} Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.780778 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8afa6be1221f1010388ebd2f7d552a490a868c166c3d8a6eee8fcc24b9095e1d" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.784138 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.785746 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"401e6738-93d7-40d4-867e-8c68437cbad3","Type":"ContainerDied","Data":"33c80887acf46993b60950d3ec27771c17d7d31947de5c0902c67049c9927696"} Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.785916 4898 scope.go:117] "RemoveContainer" containerID="6a3356dec1914d7176270b5cf94ad8c6d3be50dc70db5f963643ba9ceaa22838" Mar 13 14:23:25 crc kubenswrapper[4898]: E0313 14:23:25.802983 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a0be402bfe00c68e23ab47a73d2d201566aad9d451ecaff23e8fc2d99923064b is running failed: container process not found" containerID="a0be402bfe00c68e23ab47a73d2d201566aad9d451ecaff23e8fc2d99923064b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 14:23:25 crc kubenswrapper[4898]: E0313 14:23:25.803371 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a0be402bfe00c68e23ab47a73d2d201566aad9d451ecaff23e8fc2d99923064b is running failed: container process not found" containerID="a0be402bfe00c68e23ab47a73d2d201566aad9d451ecaff23e8fc2d99923064b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 14:23:25 crc kubenswrapper[4898]: E0313 14:23:25.803590 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a0be402bfe00c68e23ab47a73d2d201566aad9d451ecaff23e8fc2d99923064b is running failed: container process not found" containerID="a0be402bfe00c68e23ab47a73d2d201566aad9d451ecaff23e8fc2d99923064b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 14:23:25 crc kubenswrapper[4898]: E0313 14:23:25.803618 4898 prober.go:104] "Probe 
errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a0be402bfe00c68e23ab47a73d2d201566aad9d451ecaff23e8fc2d99923064b is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="0769b03d-29b4-4519-abc7-408431328276" containerName="nova-scheduler-scheduler" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.882938 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.886506 4898 scope.go:117] "RemoveContainer" containerID="4172774e3d35f2447bfbdab8c2932d0b6b6936e1f5596cbc2f41a9105e3cedbe" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.909734 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.937839 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.956155 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 13 14:23:25 crc kubenswrapper[4898]: E0313 14:23:25.956865 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="401e6738-93d7-40d4-867e-8c68437cbad3" containerName="nova-api-log" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.956883 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="401e6738-93d7-40d4-867e-8c68437cbad3" containerName="nova-api-log" Mar 13 14:23:25 crc kubenswrapper[4898]: E0313 14:23:25.956944 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0769b03d-29b4-4519-abc7-408431328276" containerName="nova-scheduler-scheduler" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.956950 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0769b03d-29b4-4519-abc7-408431328276" containerName="nova-scheduler-scheduler" Mar 13 14:23:25 crc kubenswrapper[4898]: 
E0313 14:23:25.956969 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="401e6738-93d7-40d4-867e-8c68437cbad3" containerName="nova-api-api" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.956975 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="401e6738-93d7-40d4-867e-8c68437cbad3" containerName="nova-api-api" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.957283 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="401e6738-93d7-40d4-867e-8c68437cbad3" containerName="nova-api-api" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.957310 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="401e6738-93d7-40d4-867e-8c68437cbad3" containerName="nova-api-log" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.957329 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0769b03d-29b4-4519-abc7-408431328276" containerName="nova-scheduler-scheduler" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.959547 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 14:23:25 crc kubenswrapper[4898]: I0313 14:23:25.962312 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.003940 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.065527 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0769b03d-29b4-4519-abc7-408431328276-combined-ca-bundle\") pod \"0769b03d-29b4-4519-abc7-408431328276\" (UID: \"0769b03d-29b4-4519-abc7-408431328276\") " Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.065665 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0769b03d-29b4-4519-abc7-408431328276-config-data\") pod \"0769b03d-29b4-4519-abc7-408431328276\" (UID: \"0769b03d-29b4-4519-abc7-408431328276\") " Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.065878 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2dlp\" (UniqueName: \"kubernetes.io/projected/0769b03d-29b4-4519-abc7-408431328276-kube-api-access-k2dlp\") pod \"0769b03d-29b4-4519-abc7-408431328276\" (UID: \"0769b03d-29b4-4519-abc7-408431328276\") " Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.066195 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91d28474-f268-4ecf-96b7-5a5007e715c3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"91d28474-f268-4ecf-96b7-5a5007e715c3\") " pod="openstack/nova-api-0" Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.066303 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/91d28474-f268-4ecf-96b7-5a5007e715c3-config-data\") pod \"nova-api-0\" (UID: \"91d28474-f268-4ecf-96b7-5a5007e715c3\") " pod="openstack/nova-api-0" Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.066396 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91d28474-f268-4ecf-96b7-5a5007e715c3-logs\") pod \"nova-api-0\" (UID: \"91d28474-f268-4ecf-96b7-5a5007e715c3\") " pod="openstack/nova-api-0" Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.066532 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chxlk\" (UniqueName: \"kubernetes.io/projected/91d28474-f268-4ecf-96b7-5a5007e715c3-kube-api-access-chxlk\") pod \"nova-api-0\" (UID: \"91d28474-f268-4ecf-96b7-5a5007e715c3\") " pod="openstack/nova-api-0" Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.072614 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0769b03d-29b4-4519-abc7-408431328276-kube-api-access-k2dlp" (OuterVolumeSpecName: "kube-api-access-k2dlp") pod "0769b03d-29b4-4519-abc7-408431328276" (UID: "0769b03d-29b4-4519-abc7-408431328276"). InnerVolumeSpecName "kube-api-access-k2dlp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.108735 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0769b03d-29b4-4519-abc7-408431328276-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0769b03d-29b4-4519-abc7-408431328276" (UID: "0769b03d-29b4-4519-abc7-408431328276"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.146413 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0769b03d-29b4-4519-abc7-408431328276-config-data" (OuterVolumeSpecName: "config-data") pod "0769b03d-29b4-4519-abc7-408431328276" (UID: "0769b03d-29b4-4519-abc7-408431328276"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.168357 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91d28474-f268-4ecf-96b7-5a5007e715c3-logs\") pod \"nova-api-0\" (UID: \"91d28474-f268-4ecf-96b7-5a5007e715c3\") " pod="openstack/nova-api-0" Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.168739 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chxlk\" (UniqueName: \"kubernetes.io/projected/91d28474-f268-4ecf-96b7-5a5007e715c3-kube-api-access-chxlk\") pod \"nova-api-0\" (UID: \"91d28474-f268-4ecf-96b7-5a5007e715c3\") " pod="openstack/nova-api-0" Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.168765 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91d28474-f268-4ecf-96b7-5a5007e715c3-logs\") pod \"nova-api-0\" (UID: \"91d28474-f268-4ecf-96b7-5a5007e715c3\") " pod="openstack/nova-api-0" Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.168804 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91d28474-f268-4ecf-96b7-5a5007e715c3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"91d28474-f268-4ecf-96b7-5a5007e715c3\") " pod="openstack/nova-api-0" Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.168864 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91d28474-f268-4ecf-96b7-5a5007e715c3-config-data\") pod \"nova-api-0\" (UID: \"91d28474-f268-4ecf-96b7-5a5007e715c3\") " pod="openstack/nova-api-0" Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.168978 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2dlp\" (UniqueName: \"kubernetes.io/projected/0769b03d-29b4-4519-abc7-408431328276-kube-api-access-k2dlp\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.169009 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0769b03d-29b4-4519-abc7-408431328276-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.169019 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0769b03d-29b4-4519-abc7-408431328276-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.172360 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91d28474-f268-4ecf-96b7-5a5007e715c3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"91d28474-f268-4ecf-96b7-5a5007e715c3\") " pod="openstack/nova-api-0" Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.174509 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91d28474-f268-4ecf-96b7-5a5007e715c3-config-data\") pod \"nova-api-0\" (UID: \"91d28474-f268-4ecf-96b7-5a5007e715c3\") " pod="openstack/nova-api-0" Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.185604 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chxlk\" (UniqueName: \"kubernetes.io/projected/91d28474-f268-4ecf-96b7-5a5007e715c3-kube-api-access-chxlk\") pod \"nova-api-0\" (UID: 
\"91d28474-f268-4ecf-96b7-5a5007e715c3\") " pod="openstack/nova-api-0" Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.285921 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.800708 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.806084 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.964989 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.983499 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.997985 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 14:23:26 crc kubenswrapper[4898]: I0313 14:23:26.999872 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 14:23:27 crc kubenswrapper[4898]: I0313 14:23:27.002470 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 13 14:23:27 crc kubenswrapper[4898]: I0313 14:23:27.019602 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 14:23:27 crc kubenswrapper[4898]: I0313 14:23:27.091710 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26jcq\" (UniqueName: \"kubernetes.io/projected/9cfb3db3-7d46-4ab5-aecc-00ddd738d359-kube-api-access-26jcq\") pod \"nova-scheduler-0\" (UID: \"9cfb3db3-7d46-4ab5-aecc-00ddd738d359\") " pod="openstack/nova-scheduler-0" Mar 13 14:23:27 crc kubenswrapper[4898]: I0313 14:23:27.091978 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cfb3db3-7d46-4ab5-aecc-00ddd738d359-config-data\") pod \"nova-scheduler-0\" (UID: \"9cfb3db3-7d46-4ab5-aecc-00ddd738d359\") " pod="openstack/nova-scheduler-0" Mar 13 14:23:27 crc kubenswrapper[4898]: I0313 14:23:27.092083 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cfb3db3-7d46-4ab5-aecc-00ddd738d359-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9cfb3db3-7d46-4ab5-aecc-00ddd738d359\") " pod="openstack/nova-scheduler-0" Mar 13 14:23:27 crc kubenswrapper[4898]: I0313 14:23:27.194404 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26jcq\" (UniqueName: \"kubernetes.io/projected/9cfb3db3-7d46-4ab5-aecc-00ddd738d359-kube-api-access-26jcq\") pod \"nova-scheduler-0\" (UID: \"9cfb3db3-7d46-4ab5-aecc-00ddd738d359\") " pod="openstack/nova-scheduler-0" Mar 13 14:23:27 crc kubenswrapper[4898]: I0313 14:23:27.194527 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cfb3db3-7d46-4ab5-aecc-00ddd738d359-config-data\") pod \"nova-scheduler-0\" (UID: \"9cfb3db3-7d46-4ab5-aecc-00ddd738d359\") " pod="openstack/nova-scheduler-0" Mar 13 14:23:27 crc kubenswrapper[4898]: I0313 14:23:27.194567 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cfb3db3-7d46-4ab5-aecc-00ddd738d359-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9cfb3db3-7d46-4ab5-aecc-00ddd738d359\") " pod="openstack/nova-scheduler-0" Mar 13 14:23:27 crc kubenswrapper[4898]: I0313 14:23:27.198360 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cfb3db3-7d46-4ab5-aecc-00ddd738d359-config-data\") pod \"nova-scheduler-0\" (UID: \"9cfb3db3-7d46-4ab5-aecc-00ddd738d359\") " pod="openstack/nova-scheduler-0" Mar 13 14:23:27 crc kubenswrapper[4898]: I0313 14:23:27.198440 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cfb3db3-7d46-4ab5-aecc-00ddd738d359-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9cfb3db3-7d46-4ab5-aecc-00ddd738d359\") " pod="openstack/nova-scheduler-0" Mar 13 14:23:27 crc kubenswrapper[4898]: I0313 14:23:27.211668 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26jcq\" (UniqueName: \"kubernetes.io/projected/9cfb3db3-7d46-4ab5-aecc-00ddd738d359-kube-api-access-26jcq\") pod \"nova-scheduler-0\" (UID: \"9cfb3db3-7d46-4ab5-aecc-00ddd738d359\") " pod="openstack/nova-scheduler-0" Mar 13 14:23:27 crc kubenswrapper[4898]: I0313 14:23:27.331005 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 14:23:27 crc kubenswrapper[4898]: I0313 14:23:27.752858 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0769b03d-29b4-4519-abc7-408431328276" path="/var/lib/kubelet/pods/0769b03d-29b4-4519-abc7-408431328276/volumes" Mar 13 14:23:27 crc kubenswrapper[4898]: I0313 14:23:27.753853 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="401e6738-93d7-40d4-867e-8c68437cbad3" path="/var/lib/kubelet/pods/401e6738-93d7-40d4-867e-8c68437cbad3/volumes" Mar 13 14:23:27 crc kubenswrapper[4898]: I0313 14:23:27.821433 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"91d28474-f268-4ecf-96b7-5a5007e715c3","Type":"ContainerStarted","Data":"1ac26ed6febdb8a025711b7fb70ecf717c1b07973758b6080092505756042b5d"} Mar 13 14:23:27 crc kubenswrapper[4898]: I0313 14:23:27.821485 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"91d28474-f268-4ecf-96b7-5a5007e715c3","Type":"ContainerStarted","Data":"534def4acabc310f4b2d065b3a04508595fe368f3532f51136e25ae88c3877e4"} Mar 13 14:23:27 crc kubenswrapper[4898]: I0313 14:23:27.821499 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"91d28474-f268-4ecf-96b7-5a5007e715c3","Type":"ContainerStarted","Data":"d9d2e156d0dffe3d5895df3f9ced875368d41389028a4139a826d6dfa5cc7fb1"} Mar 13 14:23:27 crc kubenswrapper[4898]: I0313 14:23:27.845769 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.8457381550000003 podStartE2EDuration="2.845738155s" podCreationTimestamp="2026-03-13 14:23:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:23:27.845233112 +0000 UTC m=+1642.846821381" watchObservedRunningTime="2026-03-13 14:23:27.845738155 +0000 UTC 
m=+1642.847326394" Mar 13 14:23:27 crc kubenswrapper[4898]: I0313 14:23:27.889682 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 14:23:28 crc kubenswrapper[4898]: I0313 14:23:28.836146 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9cfb3db3-7d46-4ab5-aecc-00ddd738d359","Type":"ContainerStarted","Data":"996fc46edbed8602f9eda3a09dc63ac36038779496a63c3ff3ba77a3b3a9e5b0"} Mar 13 14:23:28 crc kubenswrapper[4898]: I0313 14:23:28.836484 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9cfb3db3-7d46-4ab5-aecc-00ddd738d359","Type":"ContainerStarted","Data":"4fb43fc1071513c1a034ec9b0dda28c7b02d6ddc884f93f9664159fa9c0ff74c"} Mar 13 14:23:28 crc kubenswrapper[4898]: I0313 14:23:28.860442 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.860420735 podStartE2EDuration="2.860420735s" podCreationTimestamp="2026-03-13 14:23:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:23:28.856542494 +0000 UTC m=+1643.858130753" watchObservedRunningTime="2026-03-13 14:23:28.860420735 +0000 UTC m=+1643.862008984" Mar 13 14:23:30 crc kubenswrapper[4898]: I0313 14:23:30.855780 4898 generic.go:334] "Generic (PLEG): container finished" podID="f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" containerID="a52bb77ac792d2e28bb8d969def76aa20c8d38d8dddae8c183a41016f4efeb7f" exitCode=0 Mar 13 14:23:30 crc kubenswrapper[4898]: I0313 14:23:30.855859 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5","Type":"ContainerDied","Data":"a52bb77ac792d2e28bb8d969def76aa20c8d38d8dddae8c183a41016f4efeb7f"} Mar 13 14:23:30 crc kubenswrapper[4898]: I0313 14:23:30.986529 4898 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.470989 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.522021 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-run-httpd\") pod \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.522141 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pznzq\" (UniqueName: \"kubernetes.io/projected/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-kube-api-access-pznzq\") pod \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.522209 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-log-httpd\") pod \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.522258 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-combined-ca-bundle\") pod \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.522402 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-scripts\") pod \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " 
Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.522566 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-config-data\") pod \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.522628 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-sg-core-conf-yaml\") pod \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\" (UID: \"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5\") " Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.525331 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" (UID: "f08c3de0-9b21-4e4a-8811-b40fdb3b63c5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.525667 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" (UID: "f08c3de0-9b21-4e4a-8811-b40fdb3b63c5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.530137 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-kube-api-access-pznzq" (OuterVolumeSpecName: "kube-api-access-pznzq") pod "f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" (UID: "f08c3de0-9b21-4e4a-8811-b40fdb3b63c5"). InnerVolumeSpecName "kube-api-access-pznzq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.532189 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-scripts" (OuterVolumeSpecName: "scripts") pod "f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" (UID: "f08c3de0-9b21-4e4a-8811-b40fdb3b63c5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.565437 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" (UID: "f08c3de0-9b21-4e4a-8811-b40fdb3b63c5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.626708 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.626750 4898 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.626819 4898 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.626832 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pznzq\" (UniqueName: \"kubernetes.io/projected/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-kube-api-access-pznzq\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:31 crc 
kubenswrapper[4898]: I0313 14:23:31.626843 4898 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.641144 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" (UID: "f08c3de0-9b21-4e4a-8811-b40fdb3b63c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.657472 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-config-data" (OuterVolumeSpecName: "config-data") pod "f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" (UID: "f08c3de0-9b21-4e4a-8811-b40fdb3b63c5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.729176 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.729364 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.870124 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f08c3de0-9b21-4e4a-8811-b40fdb3b63c5","Type":"ContainerDied","Data":"1e1a0e67a834a696d965eab6f5a0ade85145365bbc3c3bccceb0280541a0e5ed"} Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.870243 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.870259 4898 scope.go:117] "RemoveContainer" containerID="21fd8f0e04d5275c81dea1442286d2ffac0118f2bf21754e17a48e3332d618e6" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.899313 4898 scope.go:117] "RemoveContainer" containerID="3bed3d1e809327f9581d0f282bdba9c8f4826f7960e98f95fb41d7bf1c602dc9" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.909101 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.924276 4898 scope.go:117] "RemoveContainer" containerID="74f0d5a974f39e0ea6ad714deeedb29ad931cd8927cfe42cce60108477a37a86" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.932876 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.952296 4898 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/ceilometer-0"] Mar 13 14:23:31 crc kubenswrapper[4898]: E0313 14:23:31.953736 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" containerName="ceilometer-notification-agent" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.953770 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" containerName="ceilometer-notification-agent" Mar 13 14:23:31 crc kubenswrapper[4898]: E0313 14:23:31.953796 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" containerName="proxy-httpd" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.953804 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" containerName="proxy-httpd" Mar 13 14:23:31 crc kubenswrapper[4898]: E0313 14:23:31.953821 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" containerName="ceilometer-central-agent" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.953828 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" containerName="ceilometer-central-agent" Mar 13 14:23:31 crc kubenswrapper[4898]: E0313 14:23:31.953843 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" containerName="sg-core" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.953851 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" containerName="sg-core" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.954216 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" containerName="proxy-httpd" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.954259 4898 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" containerName="ceilometer-central-agent" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.954275 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" containerName="sg-core" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.954295 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" containerName="ceilometer-notification-agent" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.958342 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.961204 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.963287 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.969991 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:23:31 crc kubenswrapper[4898]: I0313 14:23:31.971738 4898 scope.go:117] "RemoveContainer" containerID="a52bb77ac792d2e28bb8d969def76aa20c8d38d8dddae8c183a41016f4efeb7f" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.038466 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9lvm\" (UniqueName: \"kubernetes.io/projected/fee0030e-ceb6-41ff-97b2-6302e2bed961-kube-api-access-k9lvm\") pod \"ceilometer-0\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.039023 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fee0030e-ceb6-41ff-97b2-6302e2bed961-log-httpd\") pod 
\"ceilometer-0\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.039149 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-config-data\") pod \"ceilometer-0\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.039757 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fee0030e-ceb6-41ff-97b2-6302e2bed961-run-httpd\") pod \"ceilometer-0\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.039884 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-scripts\") pod \"ceilometer-0\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.039963 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.039996 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 
14:23:32.142553 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fee0030e-ceb6-41ff-97b2-6302e2bed961-run-httpd\") pod \"ceilometer-0\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.142632 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-scripts\") pod \"ceilometer-0\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.142662 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.142685 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.142719 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9lvm\" (UniqueName: \"kubernetes.io/projected/fee0030e-ceb6-41ff-97b2-6302e2bed961-kube-api-access-k9lvm\") pod \"ceilometer-0\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.142838 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fee0030e-ceb6-41ff-97b2-6302e2bed961-log-httpd\") pod 
\"ceilometer-0\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.142866 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-config-data\") pod \"ceilometer-0\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.143294 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fee0030e-ceb6-41ff-97b2-6302e2bed961-run-httpd\") pod \"ceilometer-0\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.144565 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fee0030e-ceb6-41ff-97b2-6302e2bed961-log-httpd\") pod \"ceilometer-0\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.150427 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-scripts\") pod \"ceilometer-0\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.151495 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.152242 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-config-data\") pod \"ceilometer-0\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.152598 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.163750 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9lvm\" (UniqueName: \"kubernetes.io/projected/fee0030e-ceb6-41ff-97b2-6302e2bed961-kube-api-access-k9lvm\") pod \"ceilometer-0\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.288638 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.310081 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-crnr4"] Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.311867 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-crnr4" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.321626 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-fc6a-account-create-update-cfl2q"] Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.323142 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-fc6a-account-create-update-cfl2q" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.331306 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.331567 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.344377 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-fc6a-account-create-update-cfl2q"] Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.348464 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d29378e-424d-4831-baf4-b59a75072097-operator-scripts\") pod \"aodh-fc6a-account-create-update-cfl2q\" (UID: \"4d29378e-424d-4831-baf4-b59a75072097\") " pod="openstack/aodh-fc6a-account-create-update-cfl2q" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.348592 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e50eec10-99ce-4611-8cf4-8f4999146339-operator-scripts\") pod \"aodh-db-create-crnr4\" (UID: \"e50eec10-99ce-4611-8cf4-8f4999146339\") " pod="openstack/aodh-db-create-crnr4" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.348672 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2srt\" (UniqueName: \"kubernetes.io/projected/e50eec10-99ce-4611-8cf4-8f4999146339-kube-api-access-r2srt\") pod \"aodh-db-create-crnr4\" (UID: \"e50eec10-99ce-4611-8cf4-8f4999146339\") " pod="openstack/aodh-db-create-crnr4" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.348730 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5n26\" 
(UniqueName: \"kubernetes.io/projected/4d29378e-424d-4831-baf4-b59a75072097-kube-api-access-x5n26\") pod \"aodh-fc6a-account-create-update-cfl2q\" (UID: \"4d29378e-424d-4831-baf4-b59a75072097\") " pod="openstack/aodh-fc6a-account-create-update-cfl2q" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.363068 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-crnr4"] Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.452140 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d29378e-424d-4831-baf4-b59a75072097-operator-scripts\") pod \"aodh-fc6a-account-create-update-cfl2q\" (UID: \"4d29378e-424d-4831-baf4-b59a75072097\") " pod="openstack/aodh-fc6a-account-create-update-cfl2q" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.452564 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e50eec10-99ce-4611-8cf4-8f4999146339-operator-scripts\") pod \"aodh-db-create-crnr4\" (UID: \"e50eec10-99ce-4611-8cf4-8f4999146339\") " pod="openstack/aodh-db-create-crnr4" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.453828 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2srt\" (UniqueName: \"kubernetes.io/projected/e50eec10-99ce-4611-8cf4-8f4999146339-kube-api-access-r2srt\") pod \"aodh-db-create-crnr4\" (UID: \"e50eec10-99ce-4611-8cf4-8f4999146339\") " pod="openstack/aodh-db-create-crnr4" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.453883 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5n26\" (UniqueName: \"kubernetes.io/projected/4d29378e-424d-4831-baf4-b59a75072097-kube-api-access-x5n26\") pod \"aodh-fc6a-account-create-update-cfl2q\" (UID: \"4d29378e-424d-4831-baf4-b59a75072097\") " 
pod="openstack/aodh-fc6a-account-create-update-cfl2q" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.455433 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d29378e-424d-4831-baf4-b59a75072097-operator-scripts\") pod \"aodh-fc6a-account-create-update-cfl2q\" (UID: \"4d29378e-424d-4831-baf4-b59a75072097\") " pod="openstack/aodh-fc6a-account-create-update-cfl2q" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.455875 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e50eec10-99ce-4611-8cf4-8f4999146339-operator-scripts\") pod \"aodh-db-create-crnr4\" (UID: \"e50eec10-99ce-4611-8cf4-8f4999146339\") " pod="openstack/aodh-db-create-crnr4" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.481430 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2srt\" (UniqueName: \"kubernetes.io/projected/e50eec10-99ce-4611-8cf4-8f4999146339-kube-api-access-r2srt\") pod \"aodh-db-create-crnr4\" (UID: \"e50eec10-99ce-4611-8cf4-8f4999146339\") " pod="openstack/aodh-db-create-crnr4" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.489866 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5n26\" (UniqueName: \"kubernetes.io/projected/4d29378e-424d-4831-baf4-b59a75072097-kube-api-access-x5n26\") pod \"aodh-fc6a-account-create-update-cfl2q\" (UID: \"4d29378e-424d-4831-baf4-b59a75072097\") " pod="openstack/aodh-fc6a-account-create-update-cfl2q" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.648488 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-crnr4" Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.662115 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-fc6a-account-create-update-cfl2q" Mar 13 14:23:32 crc kubenswrapper[4898]: W0313 14:23:32.862244 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfee0030e_ceb6_41ff_97b2_6302e2bed961.slice/crio-7938adbffcaeda1984e7ee26b38983054f8dc341e6b57108f491823714875f79 WatchSource:0}: Error finding container 7938adbffcaeda1984e7ee26b38983054f8dc341e6b57108f491823714875f79: Status 404 returned error can't find the container with id 7938adbffcaeda1984e7ee26b38983054f8dc341e6b57108f491823714875f79 Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.862506 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:23:32 crc kubenswrapper[4898]: I0313 14:23:32.911180 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fee0030e-ceb6-41ff-97b2-6302e2bed961","Type":"ContainerStarted","Data":"7938adbffcaeda1984e7ee26b38983054f8dc341e6b57108f491823714875f79"} Mar 13 14:23:33 crc kubenswrapper[4898]: I0313 14:23:33.140954 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 13 14:23:33 crc kubenswrapper[4898]: I0313 14:23:33.141005 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 13 14:23:33 crc kubenswrapper[4898]: I0313 14:23:33.181712 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-crnr4"] Mar 13 14:23:33 crc kubenswrapper[4898]: I0313 14:23:33.360841 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-fc6a-account-create-update-cfl2q"] Mar 13 14:23:33 crc kubenswrapper[4898]: W0313 14:23:33.379151 4898 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d29378e_424d_4831_baf4_b59a75072097.slice/crio-986f1ec2eac42d15ffcd477dd0aebc1a00f6cb5abbf86cdc061cc766a9aaa72a WatchSource:0}: Error finding container 986f1ec2eac42d15ffcd477dd0aebc1a00f6cb5abbf86cdc061cc766a9aaa72a: Status 404 returned error can't find the container with id 986f1ec2eac42d15ffcd477dd0aebc1a00f6cb5abbf86cdc061cc766a9aaa72a Mar 13 14:23:33 crc kubenswrapper[4898]: I0313 14:23:33.739882 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" Mar 13 14:23:33 crc kubenswrapper[4898]: E0313 14:23:33.740360 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:23:33 crc kubenswrapper[4898]: I0313 14:23:33.752606 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f08c3de0-9b21-4e4a-8811-b40fdb3b63c5" path="/var/lib/kubelet/pods/f08c3de0-9b21-4e4a-8811-b40fdb3b63c5/volumes" Mar 13 14:23:33 crc kubenswrapper[4898]: I0313 14:23:33.923313 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-fc6a-account-create-update-cfl2q" event={"ID":"4d29378e-424d-4831-baf4-b59a75072097","Type":"ContainerStarted","Data":"4c3017aeb6114b905e5e159773ee55011a6eea4d13e889167f127e1524ada66d"} Mar 13 14:23:33 crc kubenswrapper[4898]: I0313 14:23:33.924334 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-fc6a-account-create-update-cfl2q" event={"ID":"4d29378e-424d-4831-baf4-b59a75072097","Type":"ContainerStarted","Data":"986f1ec2eac42d15ffcd477dd0aebc1a00f6cb5abbf86cdc061cc766a9aaa72a"} Mar 13 14:23:33 
crc kubenswrapper[4898]: I0313 14:23:33.927123 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fee0030e-ceb6-41ff-97b2-6302e2bed961","Type":"ContainerStarted","Data":"046f62ad233bcc6087ca907e01b7e7dad28f8073971dd32bda95b8c2d2d58d3f"} Mar 13 14:23:33 crc kubenswrapper[4898]: I0313 14:23:33.929545 4898 generic.go:334] "Generic (PLEG): container finished" podID="e50eec10-99ce-4611-8cf4-8f4999146339" containerID="903eebacfbe4709488e6b56c6ba47deec7c9d806d35c4763b770e46f79ef165a" exitCode=0 Mar 13 14:23:33 crc kubenswrapper[4898]: I0313 14:23:33.929642 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-crnr4" event={"ID":"e50eec10-99ce-4611-8cf4-8f4999146339","Type":"ContainerDied","Data":"903eebacfbe4709488e6b56c6ba47deec7c9d806d35c4763b770e46f79ef165a"} Mar 13 14:23:33 crc kubenswrapper[4898]: I0313 14:23:33.929676 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-crnr4" event={"ID":"e50eec10-99ce-4611-8cf4-8f4999146339","Type":"ContainerStarted","Data":"bee3105929f09bbd68f86f9a82c8813e95620fc80d9be672ee7f18dca5bd32d2"} Mar 13 14:23:33 crc kubenswrapper[4898]: I0313 14:23:33.944010 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-fc6a-account-create-update-cfl2q" podStartSLOduration=1.943992083 podStartE2EDuration="1.943992083s" podCreationTimestamp="2026-03-13 14:23:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:23:33.939738942 +0000 UTC m=+1648.941327211" watchObservedRunningTime="2026-03-13 14:23:33.943992083 +0000 UTC m=+1648.945580332" Mar 13 14:23:34 crc kubenswrapper[4898]: I0313 14:23:34.165191 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a8db736b-00b7-4251-a667-3b2138c6c928" containerName="nova-metadata-metadata" probeResult="failure" output="Get 
\"https://10.217.1.2:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 14:23:34 crc kubenswrapper[4898]: I0313 14:23:34.165494 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a8db736b-00b7-4251-a667-3b2138c6c928" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.2:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 14:23:34 crc kubenswrapper[4898]: I0313 14:23:34.944411 4898 generic.go:334] "Generic (PLEG): container finished" podID="4d29378e-424d-4831-baf4-b59a75072097" containerID="4c3017aeb6114b905e5e159773ee55011a6eea4d13e889167f127e1524ada66d" exitCode=0 Mar 13 14:23:34 crc kubenswrapper[4898]: I0313 14:23:34.944550 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-fc6a-account-create-update-cfl2q" event={"ID":"4d29378e-424d-4831-baf4-b59a75072097","Type":"ContainerDied","Data":"4c3017aeb6114b905e5e159773ee55011a6eea4d13e889167f127e1524ada66d"} Mar 13 14:23:35 crc kubenswrapper[4898]: I0313 14:23:35.481150 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-crnr4" Mar 13 14:23:35 crc kubenswrapper[4898]: I0313 14:23:35.594767 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e50eec10-99ce-4611-8cf4-8f4999146339-operator-scripts\") pod \"e50eec10-99ce-4611-8cf4-8f4999146339\" (UID: \"e50eec10-99ce-4611-8cf4-8f4999146339\") " Mar 13 14:23:35 crc kubenswrapper[4898]: I0313 14:23:35.595121 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2srt\" (UniqueName: \"kubernetes.io/projected/e50eec10-99ce-4611-8cf4-8f4999146339-kube-api-access-r2srt\") pod \"e50eec10-99ce-4611-8cf4-8f4999146339\" (UID: \"e50eec10-99ce-4611-8cf4-8f4999146339\") " Mar 13 14:23:35 crc kubenswrapper[4898]: I0313 14:23:35.596793 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e50eec10-99ce-4611-8cf4-8f4999146339-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e50eec10-99ce-4611-8cf4-8f4999146339" (UID: "e50eec10-99ce-4611-8cf4-8f4999146339"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:23:35 crc kubenswrapper[4898]: I0313 14:23:35.611110 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e50eec10-99ce-4611-8cf4-8f4999146339-kube-api-access-r2srt" (OuterVolumeSpecName: "kube-api-access-r2srt") pod "e50eec10-99ce-4611-8cf4-8f4999146339" (UID: "e50eec10-99ce-4611-8cf4-8f4999146339"). InnerVolumeSpecName "kube-api-access-r2srt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:23:35 crc kubenswrapper[4898]: I0313 14:23:35.698707 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e50eec10-99ce-4611-8cf4-8f4999146339-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:35 crc kubenswrapper[4898]: I0313 14:23:35.699032 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2srt\" (UniqueName: \"kubernetes.io/projected/e50eec10-99ce-4611-8cf4-8f4999146339-kube-api-access-r2srt\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:35 crc kubenswrapper[4898]: I0313 14:23:35.963195 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fee0030e-ceb6-41ff-97b2-6302e2bed961","Type":"ContainerStarted","Data":"f1024bead8638fdc2067a700f701c8aec937fe95f344f02a58a6153377872c8e"} Mar 13 14:23:35 crc kubenswrapper[4898]: I0313 14:23:35.967391 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-crnr4" Mar 13 14:23:35 crc kubenswrapper[4898]: I0313 14:23:35.967781 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-crnr4" event={"ID":"e50eec10-99ce-4611-8cf4-8f4999146339","Type":"ContainerDied","Data":"bee3105929f09bbd68f86f9a82c8813e95620fc80d9be672ee7f18dca5bd32d2"} Mar 13 14:23:35 crc kubenswrapper[4898]: I0313 14:23:35.967819 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bee3105929f09bbd68f86f9a82c8813e95620fc80d9be672ee7f18dca5bd32d2" Mar 13 14:23:36 crc kubenswrapper[4898]: I0313 14:23:36.286617 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 14:23:36 crc kubenswrapper[4898]: I0313 14:23:36.287045 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 14:23:36 crc kubenswrapper[4898]: I0313 14:23:36.538424 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-fc6a-account-create-update-cfl2q" Mar 13 14:23:36 crc kubenswrapper[4898]: I0313 14:23:36.628116 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d29378e-424d-4831-baf4-b59a75072097-operator-scripts\") pod \"4d29378e-424d-4831-baf4-b59a75072097\" (UID: \"4d29378e-424d-4831-baf4-b59a75072097\") " Mar 13 14:23:36 crc kubenswrapper[4898]: I0313 14:23:36.628216 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5n26\" (UniqueName: \"kubernetes.io/projected/4d29378e-424d-4831-baf4-b59a75072097-kube-api-access-x5n26\") pod \"4d29378e-424d-4831-baf4-b59a75072097\" (UID: \"4d29378e-424d-4831-baf4-b59a75072097\") " Mar 13 14:23:36 crc kubenswrapper[4898]: I0313 14:23:36.628753 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d29378e-424d-4831-baf4-b59a75072097-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4d29378e-424d-4831-baf4-b59a75072097" (UID: "4d29378e-424d-4831-baf4-b59a75072097"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:23:36 crc kubenswrapper[4898]: I0313 14:23:36.641673 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d29378e-424d-4831-baf4-b59a75072097-kube-api-access-x5n26" (OuterVolumeSpecName: "kube-api-access-x5n26") pod "4d29378e-424d-4831-baf4-b59a75072097" (UID: "4d29378e-424d-4831-baf4-b59a75072097"). InnerVolumeSpecName "kube-api-access-x5n26". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:23:36 crc kubenswrapper[4898]: I0313 14:23:36.730543 4898 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d29378e-424d-4831-baf4-b59a75072097-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:36 crc kubenswrapper[4898]: I0313 14:23:36.730577 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5n26\" (UniqueName: \"kubernetes.io/projected/4d29378e-424d-4831-baf4-b59a75072097-kube-api-access-x5n26\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:36 crc kubenswrapper[4898]: I0313 14:23:36.983413 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fee0030e-ceb6-41ff-97b2-6302e2bed961","Type":"ContainerStarted","Data":"ff10f66943efcd1011007ca068c3ebdda08f08d6f697d108469f59c034484772"} Mar 13 14:23:36 crc kubenswrapper[4898]: I0313 14:23:36.986732 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-fc6a-account-create-update-cfl2q" event={"ID":"4d29378e-424d-4831-baf4-b59a75072097","Type":"ContainerDied","Data":"986f1ec2eac42d15ffcd477dd0aebc1a00f6cb5abbf86cdc061cc766a9aaa72a"} Mar 13 14:23:36 crc kubenswrapper[4898]: I0313 14:23:36.987229 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="986f1ec2eac42d15ffcd477dd0aebc1a00f6cb5abbf86cdc061cc766a9aaa72a" Mar 13 14:23:36 crc kubenswrapper[4898]: I0313 14:23:36.987553 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-fc6a-account-create-update-cfl2q" Mar 13 14:23:37 crc kubenswrapper[4898]: I0313 14:23:37.331664 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 13 14:23:37 crc kubenswrapper[4898]: I0313 14:23:37.369266 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="91d28474-f268-4ecf-96b7-5a5007e715c3" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.3:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 14:23:37 crc kubenswrapper[4898]: I0313 14:23:37.369267 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="91d28474-f268-4ecf-96b7-5a5007e715c3" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.3:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 14:23:37 crc kubenswrapper[4898]: I0313 14:23:37.379589 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 13 14:23:38 crc kubenswrapper[4898]: I0313 14:23:38.041138 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 13 14:23:39 crc kubenswrapper[4898]: I0313 14:23:39.011164 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fee0030e-ceb6-41ff-97b2-6302e2bed961","Type":"ContainerStarted","Data":"861e63df6a1b41636fdd186717e471c70d712870d053d2698de97221e567ca61"} Mar 13 14:23:39 crc kubenswrapper[4898]: I0313 14:23:39.011776 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 14:23:39 crc kubenswrapper[4898]: I0313 14:23:39.042363 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.997324747 
podStartE2EDuration="8.042329424s" podCreationTimestamp="2026-03-13 14:23:31 +0000 UTC" firstStartedPulling="2026-03-13 14:23:32.869195151 +0000 UTC m=+1647.870783400" lastFinishedPulling="2026-03-13 14:23:37.914199818 +0000 UTC m=+1652.915788077" observedRunningTime="2026-03-13 14:23:39.036531143 +0000 UTC m=+1654.038119382" watchObservedRunningTime="2026-03-13 14:23:39.042329424 +0000 UTC m=+1654.043917683" Mar 13 14:23:41 crc kubenswrapper[4898]: I0313 14:23:41.140153 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 13 14:23:41 crc kubenswrapper[4898]: I0313 14:23:41.140477 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.643088 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-rr9kw"] Mar 13 14:23:42 crc kubenswrapper[4898]: E0313 14:23:42.644732 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d29378e-424d-4831-baf4-b59a75072097" containerName="mariadb-account-create-update" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.644840 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d29378e-424d-4831-baf4-b59a75072097" containerName="mariadb-account-create-update" Mar 13 14:23:42 crc kubenswrapper[4898]: E0313 14:23:42.648472 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e50eec10-99ce-4611-8cf4-8f4999146339" containerName="mariadb-database-create" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.648672 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e50eec10-99ce-4611-8cf4-8f4999146339" containerName="mariadb-database-create" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.649261 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e50eec10-99ce-4611-8cf4-8f4999146339" containerName="mariadb-database-create" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 
14:23:42.649359 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d29378e-424d-4831-baf4-b59a75072097" containerName="mariadb-account-create-update" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.650523 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-rr9kw" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.655962 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.657095 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-tnpwg" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.657596 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.657667 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.662166 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-rr9kw"] Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.795248 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tdv8\" (UniqueName: \"kubernetes.io/projected/6ed6478d-e1a3-4587-813f-222e6c4e54d7-kube-api-access-9tdv8\") pod \"aodh-db-sync-rr9kw\" (UID: \"6ed6478d-e1a3-4587-813f-222e6c4e54d7\") " pod="openstack/aodh-db-sync-rr9kw" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.795322 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ed6478d-e1a3-4587-813f-222e6c4e54d7-combined-ca-bundle\") pod \"aodh-db-sync-rr9kw\" (UID: \"6ed6478d-e1a3-4587-813f-222e6c4e54d7\") " pod="openstack/aodh-db-sync-rr9kw" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 
14:23:42.795377 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ed6478d-e1a3-4587-813f-222e6c4e54d7-scripts\") pod \"aodh-db-sync-rr9kw\" (UID: \"6ed6478d-e1a3-4587-813f-222e6c4e54d7\") " pod="openstack/aodh-db-sync-rr9kw" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.796317 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ed6478d-e1a3-4587-813f-222e6c4e54d7-config-data\") pod \"aodh-db-sync-rr9kw\" (UID: \"6ed6478d-e1a3-4587-813f-222e6c4e54d7\") " pod="openstack/aodh-db-sync-rr9kw" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.898682 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ed6478d-e1a3-4587-813f-222e6c4e54d7-config-data\") pod \"aodh-db-sync-rr9kw\" (UID: \"6ed6478d-e1a3-4587-813f-222e6c4e54d7\") " pod="openstack/aodh-db-sync-rr9kw" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.898752 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tdv8\" (UniqueName: \"kubernetes.io/projected/6ed6478d-e1a3-4587-813f-222e6c4e54d7-kube-api-access-9tdv8\") pod \"aodh-db-sync-rr9kw\" (UID: \"6ed6478d-e1a3-4587-813f-222e6c4e54d7\") " pod="openstack/aodh-db-sync-rr9kw" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.898805 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ed6478d-e1a3-4587-813f-222e6c4e54d7-combined-ca-bundle\") pod \"aodh-db-sync-rr9kw\" (UID: \"6ed6478d-e1a3-4587-813f-222e6c4e54d7\") " pod="openstack/aodh-db-sync-rr9kw" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.898874 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6ed6478d-e1a3-4587-813f-222e6c4e54d7-scripts\") pod \"aodh-db-sync-rr9kw\" (UID: \"6ed6478d-e1a3-4587-813f-222e6c4e54d7\") " pod="openstack/aodh-db-sync-rr9kw" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.905434 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ed6478d-e1a3-4587-813f-222e6c4e54d7-scripts\") pod \"aodh-db-sync-rr9kw\" (UID: \"6ed6478d-e1a3-4587-813f-222e6c4e54d7\") " pod="openstack/aodh-db-sync-rr9kw" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.905963 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ed6478d-e1a3-4587-813f-222e6c4e54d7-config-data\") pod \"aodh-db-sync-rr9kw\" (UID: \"6ed6478d-e1a3-4587-813f-222e6c4e54d7\") " pod="openstack/aodh-db-sync-rr9kw" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.910674 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ed6478d-e1a3-4587-813f-222e6c4e54d7-combined-ca-bundle\") pod \"aodh-db-sync-rr9kw\" (UID: \"6ed6478d-e1a3-4587-813f-222e6c4e54d7\") " pod="openstack/aodh-db-sync-rr9kw" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.918846 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tdv8\" (UniqueName: \"kubernetes.io/projected/6ed6478d-e1a3-4587-813f-222e6c4e54d7-kube-api-access-9tdv8\") pod \"aodh-db-sync-rr9kw\" (UID: \"6ed6478d-e1a3-4587-813f-222e6c4e54d7\") " pod="openstack/aodh-db-sync-rr9kw" Mar 13 14:23:42 crc kubenswrapper[4898]: I0313 14:23:42.975987 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-rr9kw" Mar 13 14:23:43 crc kubenswrapper[4898]: I0313 14:23:43.146276 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 13 14:23:43 crc kubenswrapper[4898]: I0313 14:23:43.146966 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 13 14:23:43 crc kubenswrapper[4898]: I0313 14:23:43.155618 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 13 14:23:43 crc kubenswrapper[4898]: I0313 14:23:43.516055 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-rr9kw"] Mar 13 14:23:43 crc kubenswrapper[4898]: I0313 14:23:43.691323 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:43 crc kubenswrapper[4898]: I0313 14:23:43.821383 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5edbf12d-a655-4822-98da-9719c131fa14-config-data\") pod \"5edbf12d-a655-4822-98da-9719c131fa14\" (UID: \"5edbf12d-a655-4822-98da-9719c131fa14\") " Mar 13 14:23:43 crc kubenswrapper[4898]: I0313 14:23:43.821491 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4l9h\" (UniqueName: \"kubernetes.io/projected/5edbf12d-a655-4822-98da-9719c131fa14-kube-api-access-p4l9h\") pod \"5edbf12d-a655-4822-98da-9719c131fa14\" (UID: \"5edbf12d-a655-4822-98da-9719c131fa14\") " Mar 13 14:23:43 crc kubenswrapper[4898]: I0313 14:23:43.821612 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5edbf12d-a655-4822-98da-9719c131fa14-combined-ca-bundle\") pod \"5edbf12d-a655-4822-98da-9719c131fa14\" (UID: \"5edbf12d-a655-4822-98da-9719c131fa14\") " Mar 13 14:23:43 crc 
kubenswrapper[4898]: I0313 14:23:43.826745 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5edbf12d-a655-4822-98da-9719c131fa14-kube-api-access-p4l9h" (OuterVolumeSpecName: "kube-api-access-p4l9h") pod "5edbf12d-a655-4822-98da-9719c131fa14" (UID: "5edbf12d-a655-4822-98da-9719c131fa14"). InnerVolumeSpecName "kube-api-access-p4l9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:23:43 crc kubenswrapper[4898]: I0313 14:23:43.851946 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5edbf12d-a655-4822-98da-9719c131fa14-config-data" (OuterVolumeSpecName: "config-data") pod "5edbf12d-a655-4822-98da-9719c131fa14" (UID: "5edbf12d-a655-4822-98da-9719c131fa14"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:43 crc kubenswrapper[4898]: I0313 14:23:43.853648 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5edbf12d-a655-4822-98da-9719c131fa14-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5edbf12d-a655-4822-98da-9719c131fa14" (UID: "5edbf12d-a655-4822-98da-9719c131fa14"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:43 crc kubenswrapper[4898]: I0313 14:23:43.924716 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5edbf12d-a655-4822-98da-9719c131fa14-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:43 crc kubenswrapper[4898]: I0313 14:23:43.924789 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4l9h\" (UniqueName: \"kubernetes.io/projected/5edbf12d-a655-4822-98da-9719c131fa14-kube-api-access-p4l9h\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:43 crc kubenswrapper[4898]: I0313 14:23:43.924804 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5edbf12d-a655-4822-98da-9719c131fa14-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.083453 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-rr9kw" event={"ID":"6ed6478d-e1a3-4587-813f-222e6c4e54d7","Type":"ContainerStarted","Data":"28400347fb0bff37d2a3d6816574065a1398407aac258da24130e69501fc45b8"} Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.085490 4898 generic.go:334] "Generic (PLEG): container finished" podID="5edbf12d-a655-4822-98da-9719c131fa14" containerID="132b3a0f7e5ae56889214fa156373da26bc4a38bf3f78bbfa0992ef5f518c430" exitCode=137 Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.085535 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5edbf12d-a655-4822-98da-9719c131fa14","Type":"ContainerDied","Data":"132b3a0f7e5ae56889214fa156373da26bc4a38bf3f78bbfa0992ef5f518c430"} Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.085578 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.085602 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"5edbf12d-a655-4822-98da-9719c131fa14","Type":"ContainerDied","Data":"7a4153aaa3cfe90175a3e8e41cb9a73f5056ef4defe81a11d1b496b7bcdcc9b6"} Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.085625 4898 scope.go:117] "RemoveContainer" containerID="132b3a0f7e5ae56889214fa156373da26bc4a38bf3f78bbfa0992ef5f518c430" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.114084 4898 scope.go:117] "RemoveContainer" containerID="132b3a0f7e5ae56889214fa156373da26bc4a38bf3f78bbfa0992ef5f518c430" Mar 13 14:23:44 crc kubenswrapper[4898]: E0313 14:23:44.118091 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"132b3a0f7e5ae56889214fa156373da26bc4a38bf3f78bbfa0992ef5f518c430\": container with ID starting with 132b3a0f7e5ae56889214fa156373da26bc4a38bf3f78bbfa0992ef5f518c430 not found: ID does not exist" containerID="132b3a0f7e5ae56889214fa156373da26bc4a38bf3f78bbfa0992ef5f518c430" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.118146 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"132b3a0f7e5ae56889214fa156373da26bc4a38bf3f78bbfa0992ef5f518c430"} err="failed to get container status \"132b3a0f7e5ae56889214fa156373da26bc4a38bf3f78bbfa0992ef5f518c430\": rpc error: code = NotFound desc = could not find container \"132b3a0f7e5ae56889214fa156373da26bc4a38bf3f78bbfa0992ef5f518c430\": container with ID starting with 132b3a0f7e5ae56889214fa156373da26bc4a38bf3f78bbfa0992ef5f518c430 not found: ID does not exist" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.144962 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 
14:23:44.162922 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.190110 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 14:23:44 crc kubenswrapper[4898]: E0313 14:23:44.190828 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5edbf12d-a655-4822-98da-9719c131fa14" containerName="nova-cell1-novncproxy-novncproxy" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.190846 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="5edbf12d-a655-4822-98da-9719c131fa14" containerName="nova-cell1-novncproxy-novncproxy" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.191197 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="5edbf12d-a655-4822-98da-9719c131fa14" containerName="nova-cell1-novncproxy-novncproxy" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.192311 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.197452 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.197615 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.198219 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.212137 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.232129 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/041221f0-b346-4310-ab8e-a8f2440c6034-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"041221f0-b346-4310-ab8e-a8f2440c6034\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.232430 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pq2h\" (UniqueName: \"kubernetes.io/projected/041221f0-b346-4310-ab8e-a8f2440c6034-kube-api-access-5pq2h\") pod \"nova-cell1-novncproxy-0\" (UID: \"041221f0-b346-4310-ab8e-a8f2440c6034\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.232529 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/041221f0-b346-4310-ab8e-a8f2440c6034-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"041221f0-b346-4310-ab8e-a8f2440c6034\") " pod="openstack/nova-cell1-novncproxy-0" Mar 
13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.232681 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/041221f0-b346-4310-ab8e-a8f2440c6034-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"041221f0-b346-4310-ab8e-a8f2440c6034\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.232844 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/041221f0-b346-4310-ab8e-a8f2440c6034-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"041221f0-b346-4310-ab8e-a8f2440c6034\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.286301 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.286355 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.335941 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/041221f0-b346-4310-ab8e-a8f2440c6034-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"041221f0-b346-4310-ab8e-a8f2440c6034\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.336016 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/041221f0-b346-4310-ab8e-a8f2440c6034-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"041221f0-b346-4310-ab8e-a8f2440c6034\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.336193 4898 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/041221f0-b346-4310-ab8e-a8f2440c6034-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"041221f0-b346-4310-ab8e-a8f2440c6034\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.336280 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/041221f0-b346-4310-ab8e-a8f2440c6034-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"041221f0-b346-4310-ab8e-a8f2440c6034\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.336400 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pq2h\" (UniqueName: \"kubernetes.io/projected/041221f0-b346-4310-ab8e-a8f2440c6034-kube-api-access-5pq2h\") pod \"nova-cell1-novncproxy-0\" (UID: \"041221f0-b346-4310-ab8e-a8f2440c6034\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.341114 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/041221f0-b346-4310-ab8e-a8f2440c6034-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"041221f0-b346-4310-ab8e-a8f2440c6034\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.341221 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/041221f0-b346-4310-ab8e-a8f2440c6034-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"041221f0-b346-4310-ab8e-a8f2440c6034\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.341710 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/041221f0-b346-4310-ab8e-a8f2440c6034-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"041221f0-b346-4310-ab8e-a8f2440c6034\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.344746 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/041221f0-b346-4310-ab8e-a8f2440c6034-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"041221f0-b346-4310-ab8e-a8f2440c6034\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.355491 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pq2h\" (UniqueName: \"kubernetes.io/projected/041221f0-b346-4310-ab8e-a8f2440c6034-kube-api-access-5pq2h\") pod \"nova-cell1-novncproxy-0\" (UID: \"041221f0-b346-4310-ab8e-a8f2440c6034\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.513408 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:44 crc kubenswrapper[4898]: I0313 14:23:44.985049 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 14:23:44 crc kubenswrapper[4898]: W0313 14:23:44.989075 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod041221f0_b346_4310_ab8e_a8f2440c6034.slice/crio-b8e405b492c9d7fd596598cf904fae808606c285c335d07a03fb9f84f06a5476 WatchSource:0}: Error finding container b8e405b492c9d7fd596598cf904fae808606c285c335d07a03fb9f84f06a5476: Status 404 returned error can't find the container with id b8e405b492c9d7fd596598cf904fae808606c285c335d07a03fb9f84f06a5476 Mar 13 14:23:45 crc kubenswrapper[4898]: I0313 14:23:45.102158 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"041221f0-b346-4310-ab8e-a8f2440c6034","Type":"ContainerStarted","Data":"b8e405b492c9d7fd596598cf904fae808606c285c335d07a03fb9f84f06a5476"} Mar 13 14:23:45 crc kubenswrapper[4898]: I0313 14:23:45.106804 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 13 14:23:45 crc kubenswrapper[4898]: I0313 14:23:45.753299 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" Mar 13 14:23:45 crc kubenswrapper[4898]: E0313 14:23:45.753889 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:23:45 crc kubenswrapper[4898]: I0313 14:23:45.755368 4898 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5edbf12d-a655-4822-98da-9719c131fa14" path="/var/lib/kubelet/pods/5edbf12d-a655-4822-98da-9719c131fa14/volumes" Mar 13 14:23:46 crc kubenswrapper[4898]: I0313 14:23:46.116898 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"041221f0-b346-4310-ab8e-a8f2440c6034","Type":"ContainerStarted","Data":"52d82d263f9909335f5d4fa501a1e3625477391dab284f73ecfc7437d9706252"} Mar 13 14:23:46 crc kubenswrapper[4898]: I0313 14:23:46.138984 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.138954519 podStartE2EDuration="2.138954519s" podCreationTimestamp="2026-03-13 14:23:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:23:46.136573267 +0000 UTC m=+1661.138161506" watchObservedRunningTime="2026-03-13 14:23:46.138954519 +0000 UTC m=+1661.140542778" Mar 13 14:23:46 crc kubenswrapper[4898]: I0313 14:23:46.289924 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 13 14:23:46 crc kubenswrapper[4898]: I0313 14:23:46.290854 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 13 14:23:46 crc kubenswrapper[4898]: I0313 14:23:46.312322 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.131083 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.301805 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-bsswg"] Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.304348 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.316217 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-bsswg"] Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.417422 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-config\") pod \"dnsmasq-dns-6b7bbf7cf9-bsswg\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.417487 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-bsswg\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.417560 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-bsswg\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.417636 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-bsswg\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.417677 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-zg4fs\" (UniqueName: \"kubernetes.io/projected/c1d23b78-5402-47e0-8af6-851fcc71be6b-kube-api-access-zg4fs\") pod \"dnsmasq-dns-6b7bbf7cf9-bsswg\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.417749 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-bsswg\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.519621 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-bsswg\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.519743 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-bsswg\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.519826 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-bsswg\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.519874 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zg4fs\" 
(UniqueName: \"kubernetes.io/projected/c1d23b78-5402-47e0-8af6-851fcc71be6b-kube-api-access-zg4fs\") pod \"dnsmasq-dns-6b7bbf7cf9-bsswg\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.519927 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-bsswg\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.522790 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-bsswg\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.523356 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-bsswg\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.524309 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-bsswg\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.525334 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-bsswg\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.525516 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-config\") pod \"dnsmasq-dns-6b7bbf7cf9-bsswg\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.526476 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-config\") pod \"dnsmasq-dns-6b7bbf7cf9-bsswg\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.549380 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zg4fs\" (UniqueName: \"kubernetes.io/projected/c1d23b78-5402-47e0-8af6-851fcc71be6b-kube-api-access-zg4fs\") pod \"dnsmasq-dns-6b7bbf7cf9-bsswg\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:47 crc kubenswrapper[4898]: I0313 14:23:47.632137 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:48 crc kubenswrapper[4898]: W0313 14:23:48.854288 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1d23b78_5402_47e0_8af6_851fcc71be6b.slice/crio-0c1b795a0a1b5e26820a1d2e36ef2d08e255211a4eb9dbe6c50123bc0c988977 WatchSource:0}: Error finding container 0c1b795a0a1b5e26820a1d2e36ef2d08e255211a4eb9dbe6c50123bc0c988977: Status 404 returned error can't find the container with id 0c1b795a0a1b5e26820a1d2e36ef2d08e255211a4eb9dbe6c50123bc0c988977 Mar 13 14:23:48 crc kubenswrapper[4898]: I0313 14:23:48.855522 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-bsswg"] Mar 13 14:23:49 crc kubenswrapper[4898]: I0313 14:23:49.176039 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" event={"ID":"c1d23b78-5402-47e0-8af6-851fcc71be6b","Type":"ContainerStarted","Data":"0c1b795a0a1b5e26820a1d2e36ef2d08e255211a4eb9dbe6c50123bc0c988977"} Mar 13 14:23:49 crc kubenswrapper[4898]: I0313 14:23:49.186975 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-rr9kw" event={"ID":"6ed6478d-e1a3-4587-813f-222e6c4e54d7","Type":"ContainerStarted","Data":"80f19b54532d031429e005ad3c0cf3c32cf378c8ddab795769c6087368772b42"} Mar 13 14:23:49 crc kubenswrapper[4898]: I0313 14:23:49.225826 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-rr9kw" podStartSLOduration=2.357548524 podStartE2EDuration="7.225776961s" podCreationTimestamp="2026-03-13 14:23:42 +0000 UTC" firstStartedPulling="2026-03-13 14:23:43.489451238 +0000 UTC m=+1658.491039477" lastFinishedPulling="2026-03-13 14:23:48.357679675 +0000 UTC m=+1663.359267914" observedRunningTime="2026-03-13 14:23:49.215443143 +0000 UTC m=+1664.217031382" watchObservedRunningTime="2026-03-13 14:23:49.225776961 +0000 UTC 
m=+1664.227365220" Mar 13 14:23:49 crc kubenswrapper[4898]: I0313 14:23:49.515994 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:49 crc kubenswrapper[4898]: I0313 14:23:49.703551 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 14:23:49 crc kubenswrapper[4898]: I0313 14:23:49.899228 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:23:49 crc kubenswrapper[4898]: I0313 14:23:49.899490 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fee0030e-ceb6-41ff-97b2-6302e2bed961" containerName="ceilometer-central-agent" containerID="cri-o://046f62ad233bcc6087ca907e01b7e7dad28f8073971dd32bda95b8c2d2d58d3f" gracePeriod=30 Mar 13 14:23:49 crc kubenswrapper[4898]: I0313 14:23:49.900049 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fee0030e-ceb6-41ff-97b2-6302e2bed961" containerName="proxy-httpd" containerID="cri-o://861e63df6a1b41636fdd186717e471c70d712870d053d2698de97221e567ca61" gracePeriod=30 Mar 13 14:23:49 crc kubenswrapper[4898]: I0313 14:23:49.900309 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fee0030e-ceb6-41ff-97b2-6302e2bed961" containerName="sg-core" containerID="cri-o://ff10f66943efcd1011007ca068c3ebdda08f08d6f697d108469f59c034484772" gracePeriod=30 Mar 13 14:23:49 crc kubenswrapper[4898]: I0313 14:23:49.900417 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fee0030e-ceb6-41ff-97b2-6302e2bed961" containerName="ceilometer-notification-agent" containerID="cri-o://f1024bead8638fdc2067a700f701c8aec937fe95f344f02a58a6153377872c8e" gracePeriod=30 Mar 13 14:23:49 crc kubenswrapper[4898]: I0313 14:23:49.908478 4898 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/ceilometer-0" podUID="fee0030e-ceb6-41ff-97b2-6302e2bed961" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.1.5:3000/\": EOF" Mar 13 14:23:50 crc kubenswrapper[4898]: I0313 14:23:50.206472 4898 generic.go:334] "Generic (PLEG): container finished" podID="fee0030e-ceb6-41ff-97b2-6302e2bed961" containerID="861e63df6a1b41636fdd186717e471c70d712870d053d2698de97221e567ca61" exitCode=0 Mar 13 14:23:50 crc kubenswrapper[4898]: I0313 14:23:50.206736 4898 generic.go:334] "Generic (PLEG): container finished" podID="fee0030e-ceb6-41ff-97b2-6302e2bed961" containerID="ff10f66943efcd1011007ca068c3ebdda08f08d6f697d108469f59c034484772" exitCode=2 Mar 13 14:23:50 crc kubenswrapper[4898]: I0313 14:23:50.206561 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fee0030e-ceb6-41ff-97b2-6302e2bed961","Type":"ContainerDied","Data":"861e63df6a1b41636fdd186717e471c70d712870d053d2698de97221e567ca61"} Mar 13 14:23:50 crc kubenswrapper[4898]: I0313 14:23:50.206784 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fee0030e-ceb6-41ff-97b2-6302e2bed961","Type":"ContainerDied","Data":"ff10f66943efcd1011007ca068c3ebdda08f08d6f697d108469f59c034484772"} Mar 13 14:23:50 crc kubenswrapper[4898]: I0313 14:23:50.210650 4898 generic.go:334] "Generic (PLEG): container finished" podID="c1d23b78-5402-47e0-8af6-851fcc71be6b" containerID="7bda3f3e696b51ddd844a86367f80dfd5aee6675d49dc2996a124d70519d4d20" exitCode=0 Mar 13 14:23:50 crc kubenswrapper[4898]: I0313 14:23:50.210702 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" event={"ID":"c1d23b78-5402-47e0-8af6-851fcc71be6b","Type":"ContainerDied","Data":"7bda3f3e696b51ddd844a86367f80dfd5aee6675d49dc2996a124d70519d4d20"} Mar 13 14:23:50 crc kubenswrapper[4898]: I0313 14:23:50.211014 4898 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/nova-api-0" podUID="91d28474-f268-4ecf-96b7-5a5007e715c3" containerName="nova-api-log" containerID="cri-o://534def4acabc310f4b2d065b3a04508595fe368f3532f51136e25ae88c3877e4" gracePeriod=30 Mar 13 14:23:50 crc kubenswrapper[4898]: I0313 14:23:50.211069 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="91d28474-f268-4ecf-96b7-5a5007e715c3" containerName="nova-api-api" containerID="cri-o://1ac26ed6febdb8a025711b7fb70ecf717c1b07973758b6080092505756042b5d" gracePeriod=30 Mar 13 14:23:50 crc kubenswrapper[4898]: I0313 14:23:50.884077 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:23:50 crc kubenswrapper[4898]: I0313 14:23:50.930408 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fee0030e-ceb6-41ff-97b2-6302e2bed961-run-httpd\") pod \"fee0030e-ceb6-41ff-97b2-6302e2bed961\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " Mar 13 14:23:50 crc kubenswrapper[4898]: I0313 14:23:50.930451 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-config-data\") pod \"fee0030e-ceb6-41ff-97b2-6302e2bed961\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " Mar 13 14:23:50 crc kubenswrapper[4898]: I0313 14:23:50.930561 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-sg-core-conf-yaml\") pod \"fee0030e-ceb6-41ff-97b2-6302e2bed961\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " Mar 13 14:23:50 crc kubenswrapper[4898]: I0313 14:23:50.930670 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9lvm\" (UniqueName: 
\"kubernetes.io/projected/fee0030e-ceb6-41ff-97b2-6302e2bed961-kube-api-access-k9lvm\") pod \"fee0030e-ceb6-41ff-97b2-6302e2bed961\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " Mar 13 14:23:50 crc kubenswrapper[4898]: I0313 14:23:50.930728 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fee0030e-ceb6-41ff-97b2-6302e2bed961-log-httpd\") pod \"fee0030e-ceb6-41ff-97b2-6302e2bed961\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " Mar 13 14:23:50 crc kubenswrapper[4898]: I0313 14:23:50.930774 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-scripts\") pod \"fee0030e-ceb6-41ff-97b2-6302e2bed961\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " Mar 13 14:23:50 crc kubenswrapper[4898]: I0313 14:23:50.930798 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-combined-ca-bundle\") pod \"fee0030e-ceb6-41ff-97b2-6302e2bed961\" (UID: \"fee0030e-ceb6-41ff-97b2-6302e2bed961\") " Mar 13 14:23:50 crc kubenswrapper[4898]: I0313 14:23:50.936510 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fee0030e-ceb6-41ff-97b2-6302e2bed961-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fee0030e-ceb6-41ff-97b2-6302e2bed961" (UID: "fee0030e-ceb6-41ff-97b2-6302e2bed961"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:23:50 crc kubenswrapper[4898]: I0313 14:23:50.938150 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fee0030e-ceb6-41ff-97b2-6302e2bed961-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fee0030e-ceb6-41ff-97b2-6302e2bed961" (UID: "fee0030e-ceb6-41ff-97b2-6302e2bed961"). 
InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:23:50 crc kubenswrapper[4898]: I0313 14:23:50.941384 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fee0030e-ceb6-41ff-97b2-6302e2bed961-kube-api-access-k9lvm" (OuterVolumeSpecName: "kube-api-access-k9lvm") pod "fee0030e-ceb6-41ff-97b2-6302e2bed961" (UID: "fee0030e-ceb6-41ff-97b2-6302e2bed961"). InnerVolumeSpecName "kube-api-access-k9lvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:23:50 crc kubenswrapper[4898]: I0313 14:23:50.947090 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-scripts" (OuterVolumeSpecName: "scripts") pod "fee0030e-ceb6-41ff-97b2-6302e2bed961" (UID: "fee0030e-ceb6-41ff-97b2-6302e2bed961"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:50 crc kubenswrapper[4898]: I0313 14:23:50.980524 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fee0030e-ceb6-41ff-97b2-6302e2bed961" (UID: "fee0030e-ceb6-41ff-97b2-6302e2bed961"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.033826 4898 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fee0030e-ceb6-41ff-97b2-6302e2bed961-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.034100 4898 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.034170 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9lvm\" (UniqueName: \"kubernetes.io/projected/fee0030e-ceb6-41ff-97b2-6302e2bed961-kube-api-access-k9lvm\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.034228 4898 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fee0030e-ceb6-41ff-97b2-6302e2bed961-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.034619 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.070839 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fee0030e-ceb6-41ff-97b2-6302e2bed961" (UID: "fee0030e-ceb6-41ff-97b2-6302e2bed961"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.124602 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-config-data" (OuterVolumeSpecName: "config-data") pod "fee0030e-ceb6-41ff-97b2-6302e2bed961" (UID: "fee0030e-ceb6-41ff-97b2-6302e2bed961"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.136975 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.137016 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee0030e-ceb6-41ff-97b2-6302e2bed961-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.224332 4898 generic.go:334] "Generic (PLEG): container finished" podID="6ed6478d-e1a3-4587-813f-222e6c4e54d7" containerID="80f19b54532d031429e005ad3c0cf3c32cf378c8ddab795769c6087368772b42" exitCode=0 Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.224403 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-rr9kw" event={"ID":"6ed6478d-e1a3-4587-813f-222e6c4e54d7","Type":"ContainerDied","Data":"80f19b54532d031429e005ad3c0cf3c32cf378c8ddab795769c6087368772b42"} Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.226977 4898 generic.go:334] "Generic (PLEG): container finished" podID="91d28474-f268-4ecf-96b7-5a5007e715c3" containerID="534def4acabc310f4b2d065b3a04508595fe368f3532f51136e25ae88c3877e4" exitCode=143 Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.227030 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"91d28474-f268-4ecf-96b7-5a5007e715c3","Type":"ContainerDied","Data":"534def4acabc310f4b2d065b3a04508595fe368f3532f51136e25ae88c3877e4"} Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.230257 4898 generic.go:334] "Generic (PLEG): container finished" podID="fee0030e-ceb6-41ff-97b2-6302e2bed961" containerID="f1024bead8638fdc2067a700f701c8aec937fe95f344f02a58a6153377872c8e" exitCode=0 Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.230472 4898 generic.go:334] "Generic (PLEG): container finished" podID="fee0030e-ceb6-41ff-97b2-6302e2bed961" containerID="046f62ad233bcc6087ca907e01b7e7dad28f8073971dd32bda95b8c2d2d58d3f" exitCode=0 Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.230349 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.230377 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fee0030e-ceb6-41ff-97b2-6302e2bed961","Type":"ContainerDied","Data":"f1024bead8638fdc2067a700f701c8aec937fe95f344f02a58a6153377872c8e"} Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.230666 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fee0030e-ceb6-41ff-97b2-6302e2bed961","Type":"ContainerDied","Data":"046f62ad233bcc6087ca907e01b7e7dad28f8073971dd32bda95b8c2d2d58d3f"} Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.230685 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fee0030e-ceb6-41ff-97b2-6302e2bed961","Type":"ContainerDied","Data":"7938adbffcaeda1984e7ee26b38983054f8dc341e6b57108f491823714875f79"} Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.230705 4898 scope.go:117] "RemoveContainer" containerID="861e63df6a1b41636fdd186717e471c70d712870d053d2698de97221e567ca61" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.233815 4898 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" event={"ID":"c1d23b78-5402-47e0-8af6-851fcc71be6b","Type":"ContainerStarted","Data":"4e87cc2a9eb5ac3a94e1731b9ead6b0d6cdba2ef55c6b43916f80ca58fe1a32b"} Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.234052 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.282802 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" podStartSLOduration=4.282784049 podStartE2EDuration="4.282784049s" podCreationTimestamp="2026-03-13 14:23:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:23:51.271503277 +0000 UTC m=+1666.273091536" watchObservedRunningTime="2026-03-13 14:23:51.282784049 +0000 UTC m=+1666.284372288" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.332532 4898 scope.go:117] "RemoveContainer" containerID="ff10f66943efcd1011007ca068c3ebdda08f08d6f697d108469f59c034484772" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.342336 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.360284 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.369286 4898 scope.go:117] "RemoveContainer" containerID="f1024bead8638fdc2067a700f701c8aec937fe95f344f02a58a6153377872c8e" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.378619 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:23:51 crc kubenswrapper[4898]: E0313 14:23:51.379236 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fee0030e-ceb6-41ff-97b2-6302e2bed961" containerName="proxy-httpd" Mar 13 14:23:51 crc 
kubenswrapper[4898]: I0313 14:23:51.379253 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="fee0030e-ceb6-41ff-97b2-6302e2bed961" containerName="proxy-httpd" Mar 13 14:23:51 crc kubenswrapper[4898]: E0313 14:23:51.379261 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fee0030e-ceb6-41ff-97b2-6302e2bed961" containerName="sg-core" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.379267 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="fee0030e-ceb6-41ff-97b2-6302e2bed961" containerName="sg-core" Mar 13 14:23:51 crc kubenswrapper[4898]: E0313 14:23:51.379281 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fee0030e-ceb6-41ff-97b2-6302e2bed961" containerName="ceilometer-central-agent" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.379286 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="fee0030e-ceb6-41ff-97b2-6302e2bed961" containerName="ceilometer-central-agent" Mar 13 14:23:51 crc kubenswrapper[4898]: E0313 14:23:51.379323 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fee0030e-ceb6-41ff-97b2-6302e2bed961" containerName="ceilometer-notification-agent" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.379330 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="fee0030e-ceb6-41ff-97b2-6302e2bed961" containerName="ceilometer-notification-agent" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.379538 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="fee0030e-ceb6-41ff-97b2-6302e2bed961" containerName="proxy-httpd" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.379568 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="fee0030e-ceb6-41ff-97b2-6302e2bed961" containerName="ceilometer-central-agent" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.379577 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="fee0030e-ceb6-41ff-97b2-6302e2bed961" containerName="sg-core" Mar 
13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.379591 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="fee0030e-ceb6-41ff-97b2-6302e2bed961" containerName="ceilometer-notification-agent" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.381753 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.385105 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.391395 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.404626 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.410608 4898 scope.go:117] "RemoveContainer" containerID="046f62ad233bcc6087ca907e01b7e7dad28f8073971dd32bda95b8c2d2d58d3f" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.437636 4898 scope.go:117] "RemoveContainer" containerID="861e63df6a1b41636fdd186717e471c70d712870d053d2698de97221e567ca61" Mar 13 14:23:51 crc kubenswrapper[4898]: E0313 14:23:51.438144 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"861e63df6a1b41636fdd186717e471c70d712870d053d2698de97221e567ca61\": container with ID starting with 861e63df6a1b41636fdd186717e471c70d712870d053d2698de97221e567ca61 not found: ID does not exist" containerID="861e63df6a1b41636fdd186717e471c70d712870d053d2698de97221e567ca61" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.438174 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"861e63df6a1b41636fdd186717e471c70d712870d053d2698de97221e567ca61"} err="failed to get container status 
\"861e63df6a1b41636fdd186717e471c70d712870d053d2698de97221e567ca61\": rpc error: code = NotFound desc = could not find container \"861e63df6a1b41636fdd186717e471c70d712870d053d2698de97221e567ca61\": container with ID starting with 861e63df6a1b41636fdd186717e471c70d712870d053d2698de97221e567ca61 not found: ID does not exist" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.438193 4898 scope.go:117] "RemoveContainer" containerID="ff10f66943efcd1011007ca068c3ebdda08f08d6f697d108469f59c034484772" Mar 13 14:23:51 crc kubenswrapper[4898]: E0313 14:23:51.438487 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff10f66943efcd1011007ca068c3ebdda08f08d6f697d108469f59c034484772\": container with ID starting with ff10f66943efcd1011007ca068c3ebdda08f08d6f697d108469f59c034484772 not found: ID does not exist" containerID="ff10f66943efcd1011007ca068c3ebdda08f08d6f697d108469f59c034484772" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.438614 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff10f66943efcd1011007ca068c3ebdda08f08d6f697d108469f59c034484772"} err="failed to get container status \"ff10f66943efcd1011007ca068c3ebdda08f08d6f697d108469f59c034484772\": rpc error: code = NotFound desc = could not find container \"ff10f66943efcd1011007ca068c3ebdda08f08d6f697d108469f59c034484772\": container with ID starting with ff10f66943efcd1011007ca068c3ebdda08f08d6f697d108469f59c034484772 not found: ID does not exist" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.438726 4898 scope.go:117] "RemoveContainer" containerID="f1024bead8638fdc2067a700f701c8aec937fe95f344f02a58a6153377872c8e" Mar 13 14:23:51 crc kubenswrapper[4898]: E0313 14:23:51.439096 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f1024bead8638fdc2067a700f701c8aec937fe95f344f02a58a6153377872c8e\": container with ID starting with f1024bead8638fdc2067a700f701c8aec937fe95f344f02a58a6153377872c8e not found: ID does not exist" containerID="f1024bead8638fdc2067a700f701c8aec937fe95f344f02a58a6153377872c8e" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.439120 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1024bead8638fdc2067a700f701c8aec937fe95f344f02a58a6153377872c8e"} err="failed to get container status \"f1024bead8638fdc2067a700f701c8aec937fe95f344f02a58a6153377872c8e\": rpc error: code = NotFound desc = could not find container \"f1024bead8638fdc2067a700f701c8aec937fe95f344f02a58a6153377872c8e\": container with ID starting with f1024bead8638fdc2067a700f701c8aec937fe95f344f02a58a6153377872c8e not found: ID does not exist" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.439133 4898 scope.go:117] "RemoveContainer" containerID="046f62ad233bcc6087ca907e01b7e7dad28f8073971dd32bda95b8c2d2d58d3f" Mar 13 14:23:51 crc kubenswrapper[4898]: E0313 14:23:51.439385 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"046f62ad233bcc6087ca907e01b7e7dad28f8073971dd32bda95b8c2d2d58d3f\": container with ID starting with 046f62ad233bcc6087ca907e01b7e7dad28f8073971dd32bda95b8c2d2d58d3f not found: ID does not exist" containerID="046f62ad233bcc6087ca907e01b7e7dad28f8073971dd32bda95b8c2d2d58d3f" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.439408 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"046f62ad233bcc6087ca907e01b7e7dad28f8073971dd32bda95b8c2d2d58d3f"} err="failed to get container status \"046f62ad233bcc6087ca907e01b7e7dad28f8073971dd32bda95b8c2d2d58d3f\": rpc error: code = NotFound desc = could not find container \"046f62ad233bcc6087ca907e01b7e7dad28f8073971dd32bda95b8c2d2d58d3f\": container with ID 
starting with 046f62ad233bcc6087ca907e01b7e7dad28f8073971dd32bda95b8c2d2d58d3f not found: ID does not exist" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.439423 4898 scope.go:117] "RemoveContainer" containerID="861e63df6a1b41636fdd186717e471c70d712870d053d2698de97221e567ca61" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.439581 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"861e63df6a1b41636fdd186717e471c70d712870d053d2698de97221e567ca61"} err="failed to get container status \"861e63df6a1b41636fdd186717e471c70d712870d053d2698de97221e567ca61\": rpc error: code = NotFound desc = could not find container \"861e63df6a1b41636fdd186717e471c70d712870d053d2698de97221e567ca61\": container with ID starting with 861e63df6a1b41636fdd186717e471c70d712870d053d2698de97221e567ca61 not found: ID does not exist" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.439598 4898 scope.go:117] "RemoveContainer" containerID="ff10f66943efcd1011007ca068c3ebdda08f08d6f697d108469f59c034484772" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.439739 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff10f66943efcd1011007ca068c3ebdda08f08d6f697d108469f59c034484772"} err="failed to get container status \"ff10f66943efcd1011007ca068c3ebdda08f08d6f697d108469f59c034484772\": rpc error: code = NotFound desc = could not find container \"ff10f66943efcd1011007ca068c3ebdda08f08d6f697d108469f59c034484772\": container with ID starting with ff10f66943efcd1011007ca068c3ebdda08f08d6f697d108469f59c034484772 not found: ID does not exist" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.439753 4898 scope.go:117] "RemoveContainer" containerID="f1024bead8638fdc2067a700f701c8aec937fe95f344f02a58a6153377872c8e" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.439889 4898 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f1024bead8638fdc2067a700f701c8aec937fe95f344f02a58a6153377872c8e"} err="failed to get container status \"f1024bead8638fdc2067a700f701c8aec937fe95f344f02a58a6153377872c8e\": rpc error: code = NotFound desc = could not find container \"f1024bead8638fdc2067a700f701c8aec937fe95f344f02a58a6153377872c8e\": container with ID starting with f1024bead8638fdc2067a700f701c8aec937fe95f344f02a58a6153377872c8e not found: ID does not exist" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.439920 4898 scope.go:117] "RemoveContainer" containerID="046f62ad233bcc6087ca907e01b7e7dad28f8073971dd32bda95b8c2d2d58d3f" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.440047 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"046f62ad233bcc6087ca907e01b7e7dad28f8073971dd32bda95b8c2d2d58d3f"} err="failed to get container status \"046f62ad233bcc6087ca907e01b7e7dad28f8073971dd32bda95b8c2d2d58d3f\": rpc error: code = NotFound desc = could not find container \"046f62ad233bcc6087ca907e01b7e7dad28f8073971dd32bda95b8c2d2d58d3f\": container with ID starting with 046f62ad233bcc6087ca907e01b7e7dad28f8073971dd32bda95b8c2d2d58d3f not found: ID does not exist" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.442431 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15a748d6-879c-48aa-99d0-b4a02dcfb640-run-httpd\") pod \"ceilometer-0\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.442473 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-scripts\") pod \"ceilometer-0\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " pod="openstack/ceilometer-0" Mar 13 14:23:51 crc 
kubenswrapper[4898]: I0313 14:23:51.442513 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15a748d6-879c-48aa-99d0-b4a02dcfb640-log-httpd\") pod \"ceilometer-0\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.442572 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.442592 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzkqk\" (UniqueName: \"kubernetes.io/projected/15a748d6-879c-48aa-99d0-b4a02dcfb640-kube-api-access-xzkqk\") pod \"ceilometer-0\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.442679 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.442710 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-config-data\") pod \"ceilometer-0\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.545410 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-xzkqk\" (UniqueName: \"kubernetes.io/projected/15a748d6-879c-48aa-99d0-b4a02dcfb640-kube-api-access-xzkqk\") pod \"ceilometer-0\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.545926 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.546094 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-config-data\") pod \"ceilometer-0\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.546263 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15a748d6-879c-48aa-99d0-b4a02dcfb640-run-httpd\") pod \"ceilometer-0\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.546369 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-scripts\") pod \"ceilometer-0\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.546510 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15a748d6-879c-48aa-99d0-b4a02dcfb640-log-httpd\") pod \"ceilometer-0\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " pod="openstack/ceilometer-0" Mar 13 14:23:51 crc 
kubenswrapper[4898]: I0313 14:23:51.546683 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.547500 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15a748d6-879c-48aa-99d0-b4a02dcfb640-run-httpd\") pod \"ceilometer-0\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.548518 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15a748d6-879c-48aa-99d0-b4a02dcfb640-log-httpd\") pod \"ceilometer-0\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.551935 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.552518 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-scripts\") pod \"ceilometer-0\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.553005 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-config-data\") pod \"ceilometer-0\" (UID: 
\"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.553281 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.570518 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzkqk\" (UniqueName: \"kubernetes.io/projected/15a748d6-879c-48aa-99d0-b4a02dcfb640-kube-api-access-xzkqk\") pod \"ceilometer-0\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") " pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.713119 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:23:51 crc kubenswrapper[4898]: I0313 14:23:51.753122 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fee0030e-ceb6-41ff-97b2-6302e2bed961" path="/var/lib/kubelet/pods/fee0030e-ceb6-41ff-97b2-6302e2bed961/volumes" Mar 13 14:23:52 crc kubenswrapper[4898]: I0313 14:23:52.232819 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:23:52 crc kubenswrapper[4898]: I0313 14:23:52.267149 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15a748d6-879c-48aa-99d0-b4a02dcfb640","Type":"ContainerStarted","Data":"3c338bdcaa4d57b4416d9ffc2b822eddf6cb4b810f647a605061d6a63e5367e7"} Mar 13 14:23:52 crc kubenswrapper[4898]: I0313 14:23:52.334623 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:23:52 crc kubenswrapper[4898]: I0313 14:23:52.791337 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-rr9kw" Mar 13 14:23:52 crc kubenswrapper[4898]: I0313 14:23:52.986138 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ed6478d-e1a3-4587-813f-222e6c4e54d7-scripts\") pod \"6ed6478d-e1a3-4587-813f-222e6c4e54d7\" (UID: \"6ed6478d-e1a3-4587-813f-222e6c4e54d7\") " Mar 13 14:23:52 crc kubenswrapper[4898]: I0313 14:23:52.986891 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ed6478d-e1a3-4587-813f-222e6c4e54d7-config-data\") pod \"6ed6478d-e1a3-4587-813f-222e6c4e54d7\" (UID: \"6ed6478d-e1a3-4587-813f-222e6c4e54d7\") " Mar 13 14:23:52 crc kubenswrapper[4898]: I0313 14:23:52.987160 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ed6478d-e1a3-4587-813f-222e6c4e54d7-combined-ca-bundle\") pod \"6ed6478d-e1a3-4587-813f-222e6c4e54d7\" (UID: \"6ed6478d-e1a3-4587-813f-222e6c4e54d7\") " Mar 13 14:23:52 crc kubenswrapper[4898]: I0313 14:23:52.987226 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tdv8\" (UniqueName: \"kubernetes.io/projected/6ed6478d-e1a3-4587-813f-222e6c4e54d7-kube-api-access-9tdv8\") pod \"6ed6478d-e1a3-4587-813f-222e6c4e54d7\" (UID: \"6ed6478d-e1a3-4587-813f-222e6c4e54d7\") " Mar 13 14:23:52 crc kubenswrapper[4898]: I0313 14:23:52.991772 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ed6478d-e1a3-4587-813f-222e6c4e54d7-scripts" (OuterVolumeSpecName: "scripts") pod "6ed6478d-e1a3-4587-813f-222e6c4e54d7" (UID: "6ed6478d-e1a3-4587-813f-222e6c4e54d7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:52 crc kubenswrapper[4898]: I0313 14:23:52.992084 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ed6478d-e1a3-4587-813f-222e6c4e54d7-kube-api-access-9tdv8" (OuterVolumeSpecName: "kube-api-access-9tdv8") pod "6ed6478d-e1a3-4587-813f-222e6c4e54d7" (UID: "6ed6478d-e1a3-4587-813f-222e6c4e54d7"). InnerVolumeSpecName "kube-api-access-9tdv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:23:53 crc kubenswrapper[4898]: I0313 14:23:53.020252 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ed6478d-e1a3-4587-813f-222e6c4e54d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ed6478d-e1a3-4587-813f-222e6c4e54d7" (UID: "6ed6478d-e1a3-4587-813f-222e6c4e54d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:53 crc kubenswrapper[4898]: I0313 14:23:53.036673 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ed6478d-e1a3-4587-813f-222e6c4e54d7-config-data" (OuterVolumeSpecName: "config-data") pod "6ed6478d-e1a3-4587-813f-222e6c4e54d7" (UID: "6ed6478d-e1a3-4587-813f-222e6c4e54d7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:53 crc kubenswrapper[4898]: I0313 14:23:53.090423 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ed6478d-e1a3-4587-813f-222e6c4e54d7-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:53 crc kubenswrapper[4898]: I0313 14:23:53.090460 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ed6478d-e1a3-4587-813f-222e6c4e54d7-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:53 crc kubenswrapper[4898]: I0313 14:23:53.090471 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ed6478d-e1a3-4587-813f-222e6c4e54d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:53 crc kubenswrapper[4898]: I0313 14:23:53.090482 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tdv8\" (UniqueName: \"kubernetes.io/projected/6ed6478d-e1a3-4587-813f-222e6c4e54d7-kube-api-access-9tdv8\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:53 crc kubenswrapper[4898]: I0313 14:23:53.281247 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15a748d6-879c-48aa-99d0-b4a02dcfb640","Type":"ContainerStarted","Data":"8580340e4db9b90d9c91bc7b25a0fe1542d88b2764af7feeb78e17278f5ad813"} Mar 13 14:23:53 crc kubenswrapper[4898]: I0313 14:23:53.283118 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-rr9kw" event={"ID":"6ed6478d-e1a3-4587-813f-222e6c4e54d7","Type":"ContainerDied","Data":"28400347fb0bff37d2a3d6816574065a1398407aac258da24130e69501fc45b8"} Mar 13 14:23:53 crc kubenswrapper[4898]: I0313 14:23:53.283161 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28400347fb0bff37d2a3d6816574065a1398407aac258da24130e69501fc45b8" Mar 13 14:23:53 crc kubenswrapper[4898]: I0313 14:23:53.283166 
4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-rr9kw" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.212060 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.296133 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15a748d6-879c-48aa-99d0-b4a02dcfb640","Type":"ContainerStarted","Data":"d3f13a9a6e980be00fa109701dade3412bbacdebaa6ebaabd90e68e043d3f2de"} Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.298260 4898 generic.go:334] "Generic (PLEG): container finished" podID="91d28474-f268-4ecf-96b7-5a5007e715c3" containerID="1ac26ed6febdb8a025711b7fb70ecf717c1b07973758b6080092505756042b5d" exitCode=0 Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.298304 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"91d28474-f268-4ecf-96b7-5a5007e715c3","Type":"ContainerDied","Data":"1ac26ed6febdb8a025711b7fb70ecf717c1b07973758b6080092505756042b5d"} Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.298332 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"91d28474-f268-4ecf-96b7-5a5007e715c3","Type":"ContainerDied","Data":"d9d2e156d0dffe3d5895df3f9ced875368d41389028a4139a826d6dfa5cc7fb1"} Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.298348 4898 scope.go:117] "RemoveContainer" containerID="1ac26ed6febdb8a025711b7fb70ecf717c1b07973758b6080092505756042b5d" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.298506 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.318709 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91d28474-f268-4ecf-96b7-5a5007e715c3-config-data\") pod \"91d28474-f268-4ecf-96b7-5a5007e715c3\" (UID: \"91d28474-f268-4ecf-96b7-5a5007e715c3\") " Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.319027 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chxlk\" (UniqueName: \"kubernetes.io/projected/91d28474-f268-4ecf-96b7-5a5007e715c3-kube-api-access-chxlk\") pod \"91d28474-f268-4ecf-96b7-5a5007e715c3\" (UID: \"91d28474-f268-4ecf-96b7-5a5007e715c3\") " Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.319068 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91d28474-f268-4ecf-96b7-5a5007e715c3-logs\") pod \"91d28474-f268-4ecf-96b7-5a5007e715c3\" (UID: \"91d28474-f268-4ecf-96b7-5a5007e715c3\") " Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.319188 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91d28474-f268-4ecf-96b7-5a5007e715c3-combined-ca-bundle\") pod \"91d28474-f268-4ecf-96b7-5a5007e715c3\" (UID: \"91d28474-f268-4ecf-96b7-5a5007e715c3\") " Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.320290 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91d28474-f268-4ecf-96b7-5a5007e715c3-logs" (OuterVolumeSpecName: "logs") pod "91d28474-f268-4ecf-96b7-5a5007e715c3" (UID: "91d28474-f268-4ecf-96b7-5a5007e715c3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.325933 4898 scope.go:117] "RemoveContainer" containerID="534def4acabc310f4b2d065b3a04508595fe368f3532f51136e25ae88c3877e4" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.336797 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91d28474-f268-4ecf-96b7-5a5007e715c3-kube-api-access-chxlk" (OuterVolumeSpecName: "kube-api-access-chxlk") pod "91d28474-f268-4ecf-96b7-5a5007e715c3" (UID: "91d28474-f268-4ecf-96b7-5a5007e715c3"). InnerVolumeSpecName "kube-api-access-chxlk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.356090 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91d28474-f268-4ecf-96b7-5a5007e715c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91d28474-f268-4ecf-96b7-5a5007e715c3" (UID: "91d28474-f268-4ecf-96b7-5a5007e715c3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.371874 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91d28474-f268-4ecf-96b7-5a5007e715c3-config-data" (OuterVolumeSpecName: "config-data") pod "91d28474-f268-4ecf-96b7-5a5007e715c3" (UID: "91d28474-f268-4ecf-96b7-5a5007e715c3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.423832 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91d28474-f268-4ecf-96b7-5a5007e715c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.423866 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91d28474-f268-4ecf-96b7-5a5007e715c3-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.423879 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chxlk\" (UniqueName: \"kubernetes.io/projected/91d28474-f268-4ecf-96b7-5a5007e715c3-kube-api-access-chxlk\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.423891 4898 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91d28474-f268-4ecf-96b7-5a5007e715c3-logs\") on node \"crc\" DevicePath \"\"" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.466937 4898 scope.go:117] "RemoveContainer" containerID="1ac26ed6febdb8a025711b7fb70ecf717c1b07973758b6080092505756042b5d" Mar 13 14:23:54 crc kubenswrapper[4898]: E0313 14:23:54.467411 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ac26ed6febdb8a025711b7fb70ecf717c1b07973758b6080092505756042b5d\": container with ID starting with 1ac26ed6febdb8a025711b7fb70ecf717c1b07973758b6080092505756042b5d not found: ID does not exist" containerID="1ac26ed6febdb8a025711b7fb70ecf717c1b07973758b6080092505756042b5d" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.467449 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ac26ed6febdb8a025711b7fb70ecf717c1b07973758b6080092505756042b5d"} 
err="failed to get container status \"1ac26ed6febdb8a025711b7fb70ecf717c1b07973758b6080092505756042b5d\": rpc error: code = NotFound desc = could not find container \"1ac26ed6febdb8a025711b7fb70ecf717c1b07973758b6080092505756042b5d\": container with ID starting with 1ac26ed6febdb8a025711b7fb70ecf717c1b07973758b6080092505756042b5d not found: ID does not exist" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.467472 4898 scope.go:117] "RemoveContainer" containerID="534def4acabc310f4b2d065b3a04508595fe368f3532f51136e25ae88c3877e4" Mar 13 14:23:54 crc kubenswrapper[4898]: E0313 14:23:54.467825 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"534def4acabc310f4b2d065b3a04508595fe368f3532f51136e25ae88c3877e4\": container with ID starting with 534def4acabc310f4b2d065b3a04508595fe368f3532f51136e25ae88c3877e4 not found: ID does not exist" containerID="534def4acabc310f4b2d065b3a04508595fe368f3532f51136e25ae88c3877e4" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.467874 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"534def4acabc310f4b2d065b3a04508595fe368f3532f51136e25ae88c3877e4"} err="failed to get container status \"534def4acabc310f4b2d065b3a04508595fe368f3532f51136e25ae88c3877e4\": rpc error: code = NotFound desc = could not find container \"534def4acabc310f4b2d065b3a04508595fe368f3532f51136e25ae88c3877e4\": container with ID starting with 534def4acabc310f4b2d065b3a04508595fe368f3532f51136e25ae88c3877e4 not found: ID does not exist" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.518272 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.558127 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:54 crc kubenswrapper[4898]: 
I0313 14:23:54.649100 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.678331 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.687027 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 13 14:23:54 crc kubenswrapper[4898]: E0313 14:23:54.687576 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91d28474-f268-4ecf-96b7-5a5007e715c3" containerName="nova-api-api" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.687613 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="91d28474-f268-4ecf-96b7-5a5007e715c3" containerName="nova-api-api" Mar 13 14:23:54 crc kubenswrapper[4898]: E0313 14:23:54.687645 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ed6478d-e1a3-4587-813f-222e6c4e54d7" containerName="aodh-db-sync" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.687651 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ed6478d-e1a3-4587-813f-222e6c4e54d7" containerName="aodh-db-sync" Mar 13 14:23:54 crc kubenswrapper[4898]: E0313 14:23:54.687661 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91d28474-f268-4ecf-96b7-5a5007e715c3" containerName="nova-api-log" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.687669 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="91d28474-f268-4ecf-96b7-5a5007e715c3" containerName="nova-api-log" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.688042 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="91d28474-f268-4ecf-96b7-5a5007e715c3" containerName="nova-api-log" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.688078 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="91d28474-f268-4ecf-96b7-5a5007e715c3" containerName="nova-api-api" Mar 13 14:23:54 crc 
kubenswrapper[4898]: I0313 14:23:54.688092 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ed6478d-e1a3-4587-813f-222e6c4e54d7" containerName="aodh-db-sync" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.689452 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.694066 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.694220 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.697409 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.697621 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.737608 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " pod="openstack/nova-api-0" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.737707 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " pod="openstack/nova-api-0" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.737734 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thsgw\" (UniqueName: 
\"kubernetes.io/projected/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-kube-api-access-thsgw\") pod \"nova-api-0\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " pod="openstack/nova-api-0" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.737784 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-logs\") pod \"nova-api-0\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " pod="openstack/nova-api-0" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.737822 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-config-data\") pod \"nova-api-0\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " pod="openstack/nova-api-0" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.738114 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-public-tls-certs\") pod \"nova-api-0\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " pod="openstack/nova-api-0" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.840723 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " pod="openstack/nova-api-0" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.840831 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " pod="openstack/nova-api-0" Mar 13 
14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.840854 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thsgw\" (UniqueName: \"kubernetes.io/projected/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-kube-api-access-thsgw\") pod \"nova-api-0\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " pod="openstack/nova-api-0" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.840965 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-logs\") pod \"nova-api-0\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " pod="openstack/nova-api-0" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.841004 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-config-data\") pod \"nova-api-0\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " pod="openstack/nova-api-0" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.841069 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-public-tls-certs\") pod \"nova-api-0\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " pod="openstack/nova-api-0" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.841751 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-logs\") pod \"nova-api-0\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " pod="openstack/nova-api-0" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.846835 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-config-data\") pod \"nova-api-0\" (UID: 
\"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " pod="openstack/nova-api-0" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.847664 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " pod="openstack/nova-api-0" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.851133 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " pod="openstack/nova-api-0" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.853351 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-public-tls-certs\") pod \"nova-api-0\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " pod="openstack/nova-api-0" Mar 13 14:23:54 crc kubenswrapper[4898]: I0313 14:23:54.864115 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thsgw\" (UniqueName: \"kubernetes.io/projected/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-kube-api-access-thsgw\") pod \"nova-api-0\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " pod="openstack/nova-api-0" Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.014404 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.312054 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15a748d6-879c-48aa-99d0-b4a02dcfb640","Type":"ContainerStarted","Data":"ebded2a2ffeeed33539cde9ae30e2e99c622582f6a03504795768afaee07448b"} Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.331235 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.572028 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.615857 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-tqzkv"] Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.627540 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-tqzkv" Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.629779 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.630618 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.656198 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-tqzkv"] Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.668022 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phtst\" (UniqueName: \"kubernetes.io/projected/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-kube-api-access-phtst\") pod \"nova-cell1-cell-mapping-tqzkv\" (UID: \"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53\") " pod="openstack/nova-cell1-cell-mapping-tqzkv" Mar 13 14:23:55 crc kubenswrapper[4898]: 
I0313 14:23:55.668391 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-config-data\") pod \"nova-cell1-cell-mapping-tqzkv\" (UID: \"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53\") " pod="openstack/nova-cell1-cell-mapping-tqzkv" Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.668558 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-scripts\") pod \"nova-cell1-cell-mapping-tqzkv\" (UID: \"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53\") " pod="openstack/nova-cell1-cell-mapping-tqzkv" Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.668655 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-tqzkv\" (UID: \"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53\") " pod="openstack/nova-cell1-cell-mapping-tqzkv" Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.761189 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91d28474-f268-4ecf-96b7-5a5007e715c3" path="/var/lib/kubelet/pods/91d28474-f268-4ecf-96b7-5a5007e715c3/volumes" Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.770648 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-config-data\") pod \"nova-cell1-cell-mapping-tqzkv\" (UID: \"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53\") " pod="openstack/nova-cell1-cell-mapping-tqzkv" Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.770758 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-scripts\") pod \"nova-cell1-cell-mapping-tqzkv\" (UID: \"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53\") " pod="openstack/nova-cell1-cell-mapping-tqzkv" Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.770791 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-tqzkv\" (UID: \"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53\") " pod="openstack/nova-cell1-cell-mapping-tqzkv" Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.770979 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phtst\" (UniqueName: \"kubernetes.io/projected/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-kube-api-access-phtst\") pod \"nova-cell1-cell-mapping-tqzkv\" (UID: \"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53\") " pod="openstack/nova-cell1-cell-mapping-tqzkv" Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.780015 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-tqzkv\" (UID: \"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53\") " pod="openstack/nova-cell1-cell-mapping-tqzkv" Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.783662 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-config-data\") pod \"nova-cell1-cell-mapping-tqzkv\" (UID: \"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53\") " pod="openstack/nova-cell1-cell-mapping-tqzkv" Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.783748 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-scripts\") 
pod \"nova-cell1-cell-mapping-tqzkv\" (UID: \"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53\") " pod="openstack/nova-cell1-cell-mapping-tqzkv" Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.793540 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phtst\" (UniqueName: \"kubernetes.io/projected/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-kube-api-access-phtst\") pod \"nova-cell1-cell-mapping-tqzkv\" (UID: \"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53\") " pod="openstack/nova-cell1-cell-mapping-tqzkv" Mar 13 14:23:55 crc kubenswrapper[4898]: I0313 14:23:55.953516 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-tqzkv" Mar 13 14:23:56 crc kubenswrapper[4898]: I0313 14:23:56.328469 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee","Type":"ContainerStarted","Data":"7f380ed1e03850f9a5390c93de71ece12783543d722324b0cebc7392d71d1cf4"} Mar 13 14:23:56 crc kubenswrapper[4898]: I0313 14:23:56.328701 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee","Type":"ContainerStarted","Data":"61b5d14b4c28a00305c933cc1ba3f69720704610d85741eff661575a32229c94"} Mar 13 14:23:56 crc kubenswrapper[4898]: I0313 14:23:56.328712 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee","Type":"ContainerStarted","Data":"e928948e8aff5ca3e9c4f8ba788b16341b26756ed4665c742a244420ba53b0dc"} Mar 13 14:23:56 crc kubenswrapper[4898]: I0313 14:23:56.359107 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.359089339 podStartE2EDuration="2.359089339s" podCreationTimestamp="2026-03-13 14:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-13 14:23:56.353550155 +0000 UTC m=+1671.355138394" watchObservedRunningTime="2026-03-13 14:23:56.359089339 +0000 UTC m=+1671.360677578"
Mar 13 14:23:56 crc kubenswrapper[4898]: W0313 14:23:56.482338 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbb3eb7a_2c0d_42d3_9d61_b3ae21863f53.slice/crio-0634c5ce6c0d169237c2c0e4d4768a8bd12dada932f968be1cb2685e61f49409 WatchSource:0}: Error finding container 0634c5ce6c0d169237c2c0e4d4768a8bd12dada932f968be1cb2685e61f49409: Status 404 returned error can't find the container with id 0634c5ce6c0d169237c2c0e4d4768a8bd12dada932f968be1cb2685e61f49409
Mar 13 14:23:56 crc kubenswrapper[4898]: I0313 14:23:56.489130 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-tqzkv"]
Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.362934 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15a748d6-879c-48aa-99d0-b4a02dcfb640","Type":"ContainerStarted","Data":"b50ed3e3e658824fabe90b9ab722bb9a4f5508339b4f447ce2f6f6c4d2c6d70b"}
Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.363482 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="15a748d6-879c-48aa-99d0-b4a02dcfb640" containerName="ceilometer-central-agent" containerID="cri-o://8580340e4db9b90d9c91bc7b25a0fe1542d88b2764af7feeb78e17278f5ad813" gracePeriod=30
Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.363647 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.363782 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="15a748d6-879c-48aa-99d0-b4a02dcfb640" containerName="proxy-httpd" containerID="cri-o://b50ed3e3e658824fabe90b9ab722bb9a4f5508339b4f447ce2f6f6c4d2c6d70b" gracePeriod=30
Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.363853 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="15a748d6-879c-48aa-99d0-b4a02dcfb640" containerName="sg-core" containerID="cri-o://ebded2a2ffeeed33539cde9ae30e2e99c622582f6a03504795768afaee07448b" gracePeriod=30
Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.363921 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="15a748d6-879c-48aa-99d0-b4a02dcfb640" containerName="ceilometer-notification-agent" containerID="cri-o://d3f13a9a6e980be00fa109701dade3412bbacdebaa6ebaabd90e68e043d3f2de" gracePeriod=30
Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.367198 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"]
Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.370871 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.374514 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-tnpwg"
Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.374849 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts"
Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.378309 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data"
Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.383453 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-tqzkv" event={"ID":"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53","Type":"ContainerStarted","Data":"390b3ed23857cd84f527a8c8b365a228b6c0b2caebb2767f64baba810ca56690"}
Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.384676 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-tqzkv" event={"ID":"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53","Type":"ContainerStarted","Data":"0634c5ce6c0d169237c2c0e4d4768a8bd12dada932f968be1cb2685e61f49409"}
Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.413751 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"]
Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.415742 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a036241-2013-494e-8c1f-7584e9af2bf4-scripts\") pod \"aodh-0\" (UID: \"3a036241-2013-494e-8c1f-7584e9af2bf4\") " pod="openstack/aodh-0"
Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.415838 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a036241-2013-494e-8c1f-7584e9af2bf4-combined-ca-bundle\") pod \"aodh-0\" (UID: \"3a036241-2013-494e-8c1f-7584e9af2bf4\") " pod="openstack/aodh-0"
Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.415873 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a036241-2013-494e-8c1f-7584e9af2bf4-config-data\") pod \"aodh-0\" (UID: \"3a036241-2013-494e-8c1f-7584e9af2bf4\") " pod="openstack/aodh-0"
Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.416213 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw6dd\" (UniqueName: \"kubernetes.io/projected/3a036241-2013-494e-8c1f-7584e9af2bf4-kube-api-access-sw6dd\") pod \"aodh-0\" (UID: \"3a036241-2013-494e-8c1f-7584e9af2bf4\") " pod="openstack/aodh-0"
Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.432209 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.603656377 podStartE2EDuration="6.432184025s" podCreationTimestamp="2026-03-13 14:23:51 +0000 UTC" firstStartedPulling="2026-03-13 14:23:52.251113657 +0000 UTC m=+1667.252701916" lastFinishedPulling="2026-03-13 14:23:56.079641325 +0000 UTC m=+1671.081229564" observedRunningTime="2026-03-13 14:23:57.407281759 +0000 UTC m=+1672.408869998" watchObservedRunningTime="2026-03-13 14:23:57.432184025 +0000 UTC m=+1672.433772264"
Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.507290 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-tqzkv" podStartSLOduration=2.507248014 podStartE2EDuration="2.507248014s" podCreationTimestamp="2026-03-13 14:23:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:23:57.482110511 +0000 UTC m=+1672.483698760" watchObservedRunningTime="2026-03-13 14:23:57.507248014 +0000 UTC m=+1672.508836263"
Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.518674 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw6dd\" (UniqueName: \"kubernetes.io/projected/3a036241-2013-494e-8c1f-7584e9af2bf4-kube-api-access-sw6dd\") pod \"aodh-0\" (UID: \"3a036241-2013-494e-8c1f-7584e9af2bf4\") " pod="openstack/aodh-0"
Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.520454 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a036241-2013-494e-8c1f-7584e9af2bf4-scripts\") pod \"aodh-0\" (UID: \"3a036241-2013-494e-8c1f-7584e9af2bf4\") " pod="openstack/aodh-0"
Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.520586 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a036241-2013-494e-8c1f-7584e9af2bf4-combined-ca-bundle\") pod \"aodh-0\" (UID: \"3a036241-2013-494e-8c1f-7584e9af2bf4\") " pod="openstack/aodh-0"
Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.520669 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a036241-2013-494e-8c1f-7584e9af2bf4-config-data\") pod \"aodh-0\" (UID: \"3a036241-2013-494e-8c1f-7584e9af2bf4\") " pod="openstack/aodh-0"
Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.527937 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a036241-2013-494e-8c1f-7584e9af2bf4-config-data\") pod \"aodh-0\" (UID: \"3a036241-2013-494e-8c1f-7584e9af2bf4\") " pod="openstack/aodh-0"
Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.531404 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a036241-2013-494e-8c1f-7584e9af2bf4-scripts\") pod \"aodh-0\" (UID: \"3a036241-2013-494e-8c1f-7584e9af2bf4\") " pod="openstack/aodh-0"
Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.532341 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a036241-2013-494e-8c1f-7584e9af2bf4-combined-ca-bundle\") pod \"aodh-0\" (UID: \"3a036241-2013-494e-8c1f-7584e9af2bf4\") " pod="openstack/aodh-0"
Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.541947 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw6dd\" (UniqueName: \"kubernetes.io/projected/3a036241-2013-494e-8c1f-7584e9af2bf4-kube-api-access-sw6dd\") pod \"aodh-0\" (UID: \"3a036241-2013-494e-8c1f-7584e9af2bf4\") " pod="openstack/aodh-0"
Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.647089 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg"
Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.720951 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.832719 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-4rc67"]
Mar 13 14:23:57 crc kubenswrapper[4898]: I0313 14:23:57.832977 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9b86998b5-4rc67" podUID="ee82d4ec-b565-40b8-b878-2574487d7e9d" containerName="dnsmasq-dns" containerID="cri-o://7d83563b664dd524f060763bd5deadd7009b47fedbc88f53844517f4c00a64ea" gracePeriod=10
Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.398218 4898 generic.go:334] "Generic (PLEG): container finished" podID="15a748d6-879c-48aa-99d0-b4a02dcfb640" containerID="b50ed3e3e658824fabe90b9ab722bb9a4f5508339b4f447ce2f6f6c4d2c6d70b" exitCode=0
Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.398698 4898 generic.go:334] "Generic (PLEG): container finished" podID="15a748d6-879c-48aa-99d0-b4a02dcfb640" containerID="ebded2a2ffeeed33539cde9ae30e2e99c622582f6a03504795768afaee07448b" exitCode=2
Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.398708 4898 generic.go:334] "Generic (PLEG): container finished" podID="15a748d6-879c-48aa-99d0-b4a02dcfb640" containerID="d3f13a9a6e980be00fa109701dade3412bbacdebaa6ebaabd90e68e043d3f2de" exitCode=0
Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.398710 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15a748d6-879c-48aa-99d0-b4a02dcfb640","Type":"ContainerDied","Data":"b50ed3e3e658824fabe90b9ab722bb9a4f5508339b4f447ce2f6f6c4d2c6d70b"}
Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.398764 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15a748d6-879c-48aa-99d0-b4a02dcfb640","Type":"ContainerDied","Data":"ebded2a2ffeeed33539cde9ae30e2e99c622582f6a03504795768afaee07448b"}
Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.398779 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15a748d6-879c-48aa-99d0-b4a02dcfb640","Type":"ContainerDied","Data":"d3f13a9a6e980be00fa109701dade3412bbacdebaa6ebaabd90e68e043d3f2de"}
Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.401422 4898 generic.go:334] "Generic (PLEG): container finished" podID="ee82d4ec-b565-40b8-b878-2574487d7e9d" containerID="7d83563b664dd524f060763bd5deadd7009b47fedbc88f53844517f4c00a64ea" exitCode=0
Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.401969 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-4rc67" event={"ID":"ee82d4ec-b565-40b8-b878-2574487d7e9d","Type":"ContainerDied","Data":"7d83563b664dd524f060763bd5deadd7009b47fedbc88f53844517f4c00a64ea"}
Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.401997 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-4rc67" event={"ID":"ee82d4ec-b565-40b8-b878-2574487d7e9d","Type":"ContainerDied","Data":"7a6f36f785b386fbe1a7acf818da5c2bcd85dcd2bb3cbfac1d54dd77d4a14425"}
Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.402011 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a6f36f785b386fbe1a7acf818da5c2bcd85dcd2bb3cbfac1d54dd77d4a14425"
Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.503948 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"]
Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.570165 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-4rc67"
Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.666704 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-config\") pod \"ee82d4ec-b565-40b8-b878-2574487d7e9d\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") "
Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.666801 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqwkf\" (UniqueName: \"kubernetes.io/projected/ee82d4ec-b565-40b8-b878-2574487d7e9d-kube-api-access-rqwkf\") pod \"ee82d4ec-b565-40b8-b878-2574487d7e9d\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") "
Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.667029 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-dns-swift-storage-0\") pod \"ee82d4ec-b565-40b8-b878-2574487d7e9d\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") "
Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.667058 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-ovsdbserver-nb\") pod \"ee82d4ec-b565-40b8-b878-2574487d7e9d\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") "
Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.667219 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-dns-svc\") pod \"ee82d4ec-b565-40b8-b878-2574487d7e9d\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") "
Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.667256 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-ovsdbserver-sb\") pod \"ee82d4ec-b565-40b8-b878-2574487d7e9d\" (UID: \"ee82d4ec-b565-40b8-b878-2574487d7e9d\") "
Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.678942 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee82d4ec-b565-40b8-b878-2574487d7e9d-kube-api-access-rqwkf" (OuterVolumeSpecName: "kube-api-access-rqwkf") pod "ee82d4ec-b565-40b8-b878-2574487d7e9d" (UID: "ee82d4ec-b565-40b8-b878-2574487d7e9d"). InnerVolumeSpecName "kube-api-access-rqwkf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.741747 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc"
Mar 13 14:23:58 crc kubenswrapper[4898]: E0313 14:23:58.742065 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d"
Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.748003 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ee82d4ec-b565-40b8-b878-2574487d7e9d" (UID: "ee82d4ec-b565-40b8-b878-2574487d7e9d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.756196 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ee82d4ec-b565-40b8-b878-2574487d7e9d" (UID: "ee82d4ec-b565-40b8-b878-2574487d7e9d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.770358 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ee82d4ec-b565-40b8-b878-2574487d7e9d" (UID: "ee82d4ec-b565-40b8-b878-2574487d7e9d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.775216 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqwkf\" (UniqueName: \"kubernetes.io/projected/ee82d4ec-b565-40b8-b878-2574487d7e9d-kube-api-access-rqwkf\") on node \"crc\" DevicePath \"\""
Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.775247 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.775257 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.775266 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.792940 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-config" (OuterVolumeSpecName: "config") pod "ee82d4ec-b565-40b8-b878-2574487d7e9d" (UID: "ee82d4ec-b565-40b8-b878-2574487d7e9d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.835475 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ee82d4ec-b565-40b8-b878-2574487d7e9d" (UID: "ee82d4ec-b565-40b8-b878-2574487d7e9d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.878513 4898 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 13 14:23:58 crc kubenswrapper[4898]: I0313 14:23:58.878560 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee82d4ec-b565-40b8-b878-2574487d7e9d-config\") on node \"crc\" DevicePath \"\""
Mar 13 14:23:59 crc kubenswrapper[4898]: I0313 14:23:59.413028 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-4rc67"
Mar 13 14:23:59 crc kubenswrapper[4898]: I0313 14:23:59.413035 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"3a036241-2013-494e-8c1f-7584e9af2bf4","Type":"ContainerStarted","Data":"1e2c142eba973e7412047a391872c9d25e5735c5b05576796ed64cb74c786bb5"}
Mar 13 14:23:59 crc kubenswrapper[4898]: I0313 14:23:59.413482 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"3a036241-2013-494e-8c1f-7584e9af2bf4","Type":"ContainerStarted","Data":"ded3b65c989e6a6e858ee713ff395a11604658cb153b63189d68172abd0b0293"}
Mar 13 14:23:59 crc kubenswrapper[4898]: I0313 14:23:59.459149 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-4rc67"]
Mar 13 14:23:59 crc kubenswrapper[4898]: I0313 14:23:59.478531 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-4rc67"]
Mar 13 14:23:59 crc kubenswrapper[4898]: I0313 14:23:59.754390 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee82d4ec-b565-40b8-b878-2574487d7e9d" path="/var/lib/kubelet/pods/ee82d4ec-b565-40b8-b878-2574487d7e9d/volumes"
Mar 13 14:24:00 crc kubenswrapper[4898]: I0313 14:24:00.134987 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556864-bvpnd"]
Mar 13 14:24:00 crc kubenswrapper[4898]: E0313 14:24:00.136454 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee82d4ec-b565-40b8-b878-2574487d7e9d" containerName="init"
Mar 13 14:24:00 crc kubenswrapper[4898]: I0313 14:24:00.136482 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee82d4ec-b565-40b8-b878-2574487d7e9d" containerName="init"
Mar 13 14:24:00 crc kubenswrapper[4898]: E0313 14:24:00.136514 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee82d4ec-b565-40b8-b878-2574487d7e9d" containerName="dnsmasq-dns"
Mar 13 14:24:00 crc kubenswrapper[4898]: I0313 14:24:00.136522 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee82d4ec-b565-40b8-b878-2574487d7e9d" containerName="dnsmasq-dns"
Mar 13 14:24:00 crc kubenswrapper[4898]: I0313 14:24:00.136811 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee82d4ec-b565-40b8-b878-2574487d7e9d" containerName="dnsmasq-dns"
Mar 13 14:24:00 crc kubenswrapper[4898]: I0313 14:24:00.137724 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556864-bvpnd"
Mar 13 14:24:00 crc kubenswrapper[4898]: I0313 14:24:00.142161 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 14:24:00 crc kubenswrapper[4898]: I0313 14:24:00.142195 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps"
Mar 13 14:24:00 crc kubenswrapper[4898]: I0313 14:24:00.142259 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 14:24:00 crc kubenswrapper[4898]: I0313 14:24:00.154691 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556864-bvpnd"]
Mar 13 14:24:00 crc kubenswrapper[4898]: I0313 14:24:00.208311 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59tc9\" (UniqueName: \"kubernetes.io/projected/4e5f381c-bbd8-40d9-8c76-efee5fb7023a-kube-api-access-59tc9\") pod \"auto-csr-approver-29556864-bvpnd\" (UID: \"4e5f381c-bbd8-40d9-8c76-efee5fb7023a\") " pod="openshift-infra/auto-csr-approver-29556864-bvpnd"
Mar 13 14:24:00 crc kubenswrapper[4898]: I0313 14:24:00.309879 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59tc9\" (UniqueName: \"kubernetes.io/projected/4e5f381c-bbd8-40d9-8c76-efee5fb7023a-kube-api-access-59tc9\") pod \"auto-csr-approver-29556864-bvpnd\" (UID: \"4e5f381c-bbd8-40d9-8c76-efee5fb7023a\") " pod="openshift-infra/auto-csr-approver-29556864-bvpnd"
Mar 13 14:24:00 crc kubenswrapper[4898]: I0313 14:24:00.345203 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59tc9\" (UniqueName: \"kubernetes.io/projected/4e5f381c-bbd8-40d9-8c76-efee5fb7023a-kube-api-access-59tc9\") pod \"auto-csr-approver-29556864-bvpnd\" (UID: \"4e5f381c-bbd8-40d9-8c76-efee5fb7023a\") " pod="openshift-infra/auto-csr-approver-29556864-bvpnd"
Mar 13 14:24:00 crc kubenswrapper[4898]: I0313 14:24:00.426276 4898 generic.go:334] "Generic (PLEG): container finished" podID="15a748d6-879c-48aa-99d0-b4a02dcfb640" containerID="8580340e4db9b90d9c91bc7b25a0fe1542d88b2764af7feeb78e17278f5ad813" exitCode=0
Mar 13 14:24:00 crc kubenswrapper[4898]: I0313 14:24:00.426312 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15a748d6-879c-48aa-99d0-b4a02dcfb640","Type":"ContainerDied","Data":"8580340e4db9b90d9c91bc7b25a0fe1542d88b2764af7feeb78e17278f5ad813"}
Mar 13 14:24:00 crc kubenswrapper[4898]: I0313 14:24:00.473641 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556864-bvpnd"
Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.093986 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.230971 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-combined-ca-bundle\") pod \"15a748d6-879c-48aa-99d0-b4a02dcfb640\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") "
Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.231063 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzkqk\" (UniqueName: \"kubernetes.io/projected/15a748d6-879c-48aa-99d0-b4a02dcfb640-kube-api-access-xzkqk\") pod \"15a748d6-879c-48aa-99d0-b4a02dcfb640\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") "
Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.231103 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-sg-core-conf-yaml\") pod \"15a748d6-879c-48aa-99d0-b4a02dcfb640\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") "
Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.231844 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-config-data\") pod \"15a748d6-879c-48aa-99d0-b4a02dcfb640\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") "
Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.232016 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-scripts\") pod \"15a748d6-879c-48aa-99d0-b4a02dcfb640\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") "
Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.232253 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15a748d6-879c-48aa-99d0-b4a02dcfb640-run-httpd\") pod \"15a748d6-879c-48aa-99d0-b4a02dcfb640\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") "
Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.232421 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15a748d6-879c-48aa-99d0-b4a02dcfb640-log-httpd\") pod \"15a748d6-879c-48aa-99d0-b4a02dcfb640\" (UID: \"15a748d6-879c-48aa-99d0-b4a02dcfb640\") "
Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.233700 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15a748d6-879c-48aa-99d0-b4a02dcfb640-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "15a748d6-879c-48aa-99d0-b4a02dcfb640" (UID: "15a748d6-879c-48aa-99d0-b4a02dcfb640"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.235138 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15a748d6-879c-48aa-99d0-b4a02dcfb640-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "15a748d6-879c-48aa-99d0-b4a02dcfb640" (UID: "15a748d6-879c-48aa-99d0-b4a02dcfb640"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.235292 4898 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15a748d6-879c-48aa-99d0-b4a02dcfb640-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.235313 4898 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/15a748d6-879c-48aa-99d0-b4a02dcfb640-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.245557 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-scripts" (OuterVolumeSpecName: "scripts") pod "15a748d6-879c-48aa-99d0-b4a02dcfb640" (UID: "15a748d6-879c-48aa-99d0-b4a02dcfb640"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.245753 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15a748d6-879c-48aa-99d0-b4a02dcfb640-kube-api-access-xzkqk" (OuterVolumeSpecName: "kube-api-access-xzkqk") pod "15a748d6-879c-48aa-99d0-b4a02dcfb640" (UID: "15a748d6-879c-48aa-99d0-b4a02dcfb640"). InnerVolumeSpecName "kube-api-access-xzkqk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.294132 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "15a748d6-879c-48aa-99d0-b4a02dcfb640" (UID: "15a748d6-879c-48aa-99d0-b4a02dcfb640"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.303603 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556864-bvpnd"]
Mar 13 14:24:01 crc kubenswrapper[4898]: W0313 14:24:01.304361 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e5f381c_bbd8_40d9_8c76_efee5fb7023a.slice/crio-568aace14eba72f5295e81bc2141ff4ab6bf2d47b3914a6e46161ad7ea2b9751 WatchSource:0}: Error finding container 568aace14eba72f5295e81bc2141ff4ab6bf2d47b3914a6e46161ad7ea2b9751: Status 404 returned error can't find the container with id 568aace14eba72f5295e81bc2141ff4ab6bf2d47b3914a6e46161ad7ea2b9751
Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.337920 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzkqk\" (UniqueName: \"kubernetes.io/projected/15a748d6-879c-48aa-99d0-b4a02dcfb640-kube-api-access-xzkqk\") on node \"crc\" DevicePath \"\""
Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.337952 4898 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.337960 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.349999 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15a748d6-879c-48aa-99d0-b4a02dcfb640" (UID: "15a748d6-879c-48aa-99d0-b4a02dcfb640"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.409412 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-config-data" (OuterVolumeSpecName: "config-data") pod "15a748d6-879c-48aa-99d0-b4a02dcfb640" (UID: "15a748d6-879c-48aa-99d0-b4a02dcfb640"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.411115 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-plhgx"]
Mar 13 14:24:01 crc kubenswrapper[4898]: E0313 14:24:01.411719 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15a748d6-879c-48aa-99d0-b4a02dcfb640" containerName="sg-core"
Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.411742 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="15a748d6-879c-48aa-99d0-b4a02dcfb640" containerName="sg-core"
Mar 13 14:24:01 crc kubenswrapper[4898]: E0313 14:24:01.411761 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15a748d6-879c-48aa-99d0-b4a02dcfb640" containerName="proxy-httpd"
Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.411768 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="15a748d6-879c-48aa-99d0-b4a02dcfb640" containerName="proxy-httpd"
Mar 13 14:24:01 crc kubenswrapper[4898]: E0313 14:24:01.411804 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15a748d6-879c-48aa-99d0-b4a02dcfb640" containerName="ceilometer-notification-agent"
Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.411810 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="15a748d6-879c-48aa-99d0-b4a02dcfb640" containerName="ceilometer-notification-agent"
Mar 13 14:24:01 crc kubenswrapper[4898]: E0313 14:24:01.411822 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15a748d6-879c-48aa-99d0-b4a02dcfb640" containerName="ceilometer-central-agent"
Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.411828 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="15a748d6-879c-48aa-99d0-b4a02dcfb640" containerName="ceilometer-central-agent"
Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.417420 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="15a748d6-879c-48aa-99d0-b4a02dcfb640" containerName="proxy-httpd"
Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.417454 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="15a748d6-879c-48aa-99d0-b4a02dcfb640" containerName="ceilometer-central-agent"
Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.417468 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="15a748d6-879c-48aa-99d0-b4a02dcfb640" containerName="sg-core"
Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.417493 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="15a748d6-879c-48aa-99d0-b4a02dcfb640" containerName="ceilometer-notification-agent"
Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.419592 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-plhgx"
Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.423765 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-plhgx"]
Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.444310 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.444347 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a748d6-879c-48aa-99d0-b4a02dcfb640-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.478882 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"3a036241-2013-494e-8c1f-7584e9af2bf4","Type":"ContainerStarted","Data":"9263a833a783d5b9000e728c8a69913160176a9e9c946c16d7fa08425ffcc556"}
Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.495552 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"15a748d6-879c-48aa-99d0-b4a02dcfb640","Type":"ContainerDied","Data":"3c338bdcaa4d57b4416d9ffc2b822eddf6cb4b810f647a605061d6a63e5367e7"}
Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.495616 4898 scope.go:117] "RemoveContainer" containerID="b50ed3e3e658824fabe90b9ab722bb9a4f5508339b4f447ce2f6f6c4d2c6d70b"
Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.495820 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.506423 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556864-bvpnd" event={"ID":"4e5f381c-bbd8-40d9-8c76-efee5fb7023a","Type":"ContainerStarted","Data":"568aace14eba72f5295e81bc2141ff4ab6bf2d47b3914a6e46161ad7ea2b9751"} Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.528165 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.543091 4898 scope.go:117] "RemoveContainer" containerID="ebded2a2ffeeed33539cde9ae30e2e99c622582f6a03504795768afaee07448b" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.546449 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6945l\" (UniqueName: \"kubernetes.io/projected/c75348dc-b6ff-43ff-bd9a-d84c91f23ea8-kube-api-access-6945l\") pod \"certified-operators-plhgx\" (UID: \"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8\") " pod="openshift-marketplace/certified-operators-plhgx" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.546668 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c75348dc-b6ff-43ff-bd9a-d84c91f23ea8-utilities\") pod \"certified-operators-plhgx\" (UID: \"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8\") " pod="openshift-marketplace/certified-operators-plhgx" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.546723 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c75348dc-b6ff-43ff-bd9a-d84c91f23ea8-catalog-content\") pod \"certified-operators-plhgx\" (UID: \"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8\") " pod="openshift-marketplace/certified-operators-plhgx" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 
14:24:01.569131 4898 scope.go:117] "RemoveContainer" containerID="d3f13a9a6e980be00fa109701dade3412bbacdebaa6ebaabd90e68e043d3f2de" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.579965 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.591954 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.604165 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.606518 4898 scope.go:117] "RemoveContainer" containerID="8580340e4db9b90d9c91bc7b25a0fe1542d88b2764af7feeb78e17278f5ad813" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.608117 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.611726 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.612117 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.616324 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.648393 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c75348dc-b6ff-43ff-bd9a-d84c91f23ea8-utilities\") pod \"certified-operators-plhgx\" (UID: \"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8\") " pod="openshift-marketplace/certified-operators-plhgx" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.648463 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/c75348dc-b6ff-43ff-bd9a-d84c91f23ea8-catalog-content\") pod \"certified-operators-plhgx\" (UID: \"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8\") " pod="openshift-marketplace/certified-operators-plhgx" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.648551 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6945l\" (UniqueName: \"kubernetes.io/projected/c75348dc-b6ff-43ff-bd9a-d84c91f23ea8-kube-api-access-6945l\") pod \"certified-operators-plhgx\" (UID: \"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8\") " pod="openshift-marketplace/certified-operators-plhgx" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.649425 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c75348dc-b6ff-43ff-bd9a-d84c91f23ea8-utilities\") pod \"certified-operators-plhgx\" (UID: \"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8\") " pod="openshift-marketplace/certified-operators-plhgx" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.649488 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c75348dc-b6ff-43ff-bd9a-d84c91f23ea8-catalog-content\") pod \"certified-operators-plhgx\" (UID: \"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8\") " pod="openshift-marketplace/certified-operators-plhgx" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.704767 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6945l\" (UniqueName: \"kubernetes.io/projected/c75348dc-b6ff-43ff-bd9a-d84c91f23ea8-kube-api-access-6945l\") pod \"certified-operators-plhgx\" (UID: \"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8\") " pod="openshift-marketplace/certified-operators-plhgx" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.750176 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8cbv\" (UniqueName: 
\"kubernetes.io/projected/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-kube-api-access-r8cbv\") pod \"ceilometer-0\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.750369 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-config-data\") pod \"ceilometer-0\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.750439 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-scripts\") pod \"ceilometer-0\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.750483 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-log-httpd\") pod \"ceilometer-0\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.750646 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-run-httpd\") pod \"ceilometer-0\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.751102 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " 
pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.751285 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.762978 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15a748d6-879c-48aa-99d0-b4a02dcfb640" path="/var/lib/kubelet/pods/15a748d6-879c-48aa-99d0-b4a02dcfb640/volumes" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.798916 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-plhgx" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.852879 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.852953 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.853016 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8cbv\" (UniqueName: \"kubernetes.io/projected/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-kube-api-access-r8cbv\") pod \"ceilometer-0\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 
14:24:01.853078 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-config-data\") pod \"ceilometer-0\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.853112 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-scripts\") pod \"ceilometer-0\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.853138 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-log-httpd\") pod \"ceilometer-0\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.853188 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-run-httpd\") pod \"ceilometer-0\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.854155 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-run-httpd\") pod \"ceilometer-0\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.854173 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-log-httpd\") pod \"ceilometer-0\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " 
pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.857616 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.859686 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-config-data\") pod \"ceilometer-0\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.860790 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.865100 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-scripts\") pod \"ceilometer-0\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.874205 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8cbv\" (UniqueName: \"kubernetes.io/projected/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-kube-api-access-r8cbv\") pod \"ceilometer-0\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " pod="openstack/ceilometer-0" Mar 13 14:24:01 crc kubenswrapper[4898]: I0313 14:24:01.929604 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:24:02 crc kubenswrapper[4898]: I0313 14:24:02.381306 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-plhgx"] Mar 13 14:24:02 crc kubenswrapper[4898]: I0313 14:24:02.484873 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:24:02 crc kubenswrapper[4898]: W0313 14:24:02.642035 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ef06426_d2da_4ad2_8168_1ca91c9ca2a7.slice/crio-14d2db4ac83fec75f96ff83117a0a7f835b4698b6e0a2c106a080e228485900f WatchSource:0}: Error finding container 14d2db4ac83fec75f96ff83117a0a7f835b4698b6e0a2c106a080e228485900f: Status 404 returned error can't find the container with id 14d2db4ac83fec75f96ff83117a0a7f835b4698b6e0a2c106a080e228485900f Mar 13 14:24:02 crc kubenswrapper[4898]: W0313 14:24:02.642574 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc75348dc_b6ff_43ff_bd9a_d84c91f23ea8.slice/crio-357efb5d307770e6ac4560f42dea2e3bc32a6f8f90350563a876139972493b91 WatchSource:0}: Error finding container 357efb5d307770e6ac4560f42dea2e3bc32a6f8f90350563a876139972493b91: Status 404 returned error can't find the container with id 357efb5d307770e6ac4560f42dea2e3bc32a6f8f90350563a876139972493b91 Mar 13 14:24:03 crc kubenswrapper[4898]: I0313 14:24:03.536757 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"3a036241-2013-494e-8c1f-7584e9af2bf4","Type":"ContainerStarted","Data":"e1faed09b83d1750f543cd522c4a6bcbb14737623c5d254b367266c190bdbe2c"} Mar 13 14:24:03 crc kubenswrapper[4898]: I0313 14:24:03.539866 4898 generic.go:334] "Generic (PLEG): container finished" podID="bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53" containerID="390b3ed23857cd84f527a8c8b365a228b6c0b2caebb2767f64baba810ca56690" 
exitCode=0 Mar 13 14:24:03 crc kubenswrapper[4898]: I0313 14:24:03.539934 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-tqzkv" event={"ID":"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53","Type":"ContainerDied","Data":"390b3ed23857cd84f527a8c8b365a228b6c0b2caebb2767f64baba810ca56690"} Mar 13 14:24:03 crc kubenswrapper[4898]: I0313 14:24:03.546168 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7","Type":"ContainerStarted","Data":"66438918ebccde6ac554af32fd8660905c8d96c674c33aa3ef4ebeb984883811"} Mar 13 14:24:03 crc kubenswrapper[4898]: I0313 14:24:03.546217 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7","Type":"ContainerStarted","Data":"14d2db4ac83fec75f96ff83117a0a7f835b4698b6e0a2c106a080e228485900f"} Mar 13 14:24:03 crc kubenswrapper[4898]: I0313 14:24:03.551107 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556864-bvpnd" event={"ID":"4e5f381c-bbd8-40d9-8c76-efee5fb7023a","Type":"ContainerStarted","Data":"4839c26bbb3360becfb71db51ead56738498d1508d530ccce036f032e975f9b4"} Mar 13 14:24:03 crc kubenswrapper[4898]: I0313 14:24:03.556929 4898 generic.go:334] "Generic (PLEG): container finished" podID="c75348dc-b6ff-43ff-bd9a-d84c91f23ea8" containerID="327e55d62108c057fe7c12b2ee047d6a3b7e188b390c1e80ec12227125ce316c" exitCode=0 Mar 13 14:24:03 crc kubenswrapper[4898]: I0313 14:24:03.556972 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-plhgx" event={"ID":"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8","Type":"ContainerDied","Data":"327e55d62108c057fe7c12b2ee047d6a3b7e188b390c1e80ec12227125ce316c"} Mar 13 14:24:03 crc kubenswrapper[4898]: I0313 14:24:03.557018 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-plhgx" 
event={"ID":"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8","Type":"ContainerStarted","Data":"357efb5d307770e6ac4560f42dea2e3bc32a6f8f90350563a876139972493b91"} Mar 13 14:24:03 crc kubenswrapper[4898]: I0313 14:24:03.613674 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556864-bvpnd" podStartSLOduration=2.157100702 podStartE2EDuration="3.613651934s" podCreationTimestamp="2026-03-13 14:24:00 +0000 UTC" firstStartedPulling="2026-03-13 14:24:01.307448645 +0000 UTC m=+1676.309036874" lastFinishedPulling="2026-03-13 14:24:02.763999877 +0000 UTC m=+1677.765588106" observedRunningTime="2026-03-13 14:24:03.605000489 +0000 UTC m=+1678.606588748" watchObservedRunningTime="2026-03-13 14:24:03.613651934 +0000 UTC m=+1678.615240173" Mar 13 14:24:03 crc kubenswrapper[4898]: I0313 14:24:03.702641 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:24:04 crc kubenswrapper[4898]: I0313 14:24:04.579074 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7","Type":"ContainerStarted","Data":"7cfa2af5e418e74781b5abeafa5d792bceacbddbc2bc25e1e25fd94e21acc493"} Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.015524 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.015993 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.222517 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-tqzkv" Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.277793 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phtst\" (UniqueName: \"kubernetes.io/projected/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-kube-api-access-phtst\") pod \"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53\" (UID: \"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53\") " Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.278018 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-combined-ca-bundle\") pod \"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53\" (UID: \"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53\") " Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.278074 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-config-data\") pod \"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53\" (UID: \"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53\") " Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.278095 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-scripts\") pod \"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53\" (UID: \"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53\") " Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.284549 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-scripts" (OuterVolumeSpecName: "scripts") pod "bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53" (UID: "bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.308047 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-kube-api-access-phtst" (OuterVolumeSpecName: "kube-api-access-phtst") pod "bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53" (UID: "bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53"). InnerVolumeSpecName "kube-api-access-phtst". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.378180 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53" (UID: "bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.378231 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-config-data" (OuterVolumeSpecName: "config-data") pod "bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53" (UID: "bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.380995 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phtst\" (UniqueName: \"kubernetes.io/projected/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-kube-api-access-phtst\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.381021 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.381033 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.381042 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.594992 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-tqzkv" event={"ID":"bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53","Type":"ContainerDied","Data":"0634c5ce6c0d169237c2c0e4d4768a8bd12dada932f968be1cb2685e61f49409"} Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.595310 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0634c5ce6c0d169237c2c0e4d4768a8bd12dada932f968be1cb2685e61f49409" Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.595167 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-tqzkv" Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.607037 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7","Type":"ContainerStarted","Data":"1fd92eb6403fb0dbf337c5f7f0d057b61a6f0d6c15c82bc4970d631ddca294d6"} Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.612250 4898 generic.go:334] "Generic (PLEG): container finished" podID="4e5f381c-bbd8-40d9-8c76-efee5fb7023a" containerID="4839c26bbb3360becfb71db51ead56738498d1508d530ccce036f032e975f9b4" exitCode=0 Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.612364 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556864-bvpnd" event={"ID":"4e5f381c-bbd8-40d9-8c76-efee5fb7023a","Type":"ContainerDied","Data":"4839c26bbb3360becfb71db51ead56738498d1508d530ccce036f032e975f9b4"} Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.617273 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-plhgx" event={"ID":"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8","Type":"ContainerStarted","Data":"b4e6b7a97336f9db7d45fac3ccbf9ee16b3f41f5a6729150379834717d5666ca"} Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.621117 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"3a036241-2013-494e-8c1f-7584e9af2bf4","Type":"ContainerStarted","Data":"e2cbd875c5a5e2c192552e4d11fe9c7d29be7ff5b49fb690e14597f7135e2adb"} Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.621417 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="3a036241-2013-494e-8c1f-7584e9af2bf4" containerName="aodh-api" containerID="cri-o://1e2c142eba973e7412047a391872c9d25e5735c5b05576796ed64cb74c786bb5" gracePeriod=30 Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.621592 4898 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/aodh-0" podUID="3a036241-2013-494e-8c1f-7584e9af2bf4" containerName="aodh-listener" containerID="cri-o://e2cbd875c5a5e2c192552e4d11fe9c7d29be7ff5b49fb690e14597f7135e2adb" gracePeriod=30 Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.621735 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="3a036241-2013-494e-8c1f-7584e9af2bf4" containerName="aodh-notifier" containerID="cri-o://e1faed09b83d1750f543cd522c4a6bcbb14737623c5d254b367266c190bdbe2c" gracePeriod=30 Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.622040 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="3a036241-2013-494e-8c1f-7584e9af2bf4" containerName="aodh-evaluator" containerID="cri-o://9263a833a783d5b9000e728c8a69913160176a9e9c946c16d7fa08425ffcc556" gracePeriod=30 Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.683977 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.2397751599999998 podStartE2EDuration="8.683954037s" podCreationTimestamp="2026-03-13 14:23:57 +0000 UTC" firstStartedPulling="2026-03-13 14:23:58.493416325 +0000 UTC m=+1673.495004564" lastFinishedPulling="2026-03-13 14:24:04.937595202 +0000 UTC m=+1679.939183441" observedRunningTime="2026-03-13 14:24:05.671259878 +0000 UTC m=+1680.672848137" watchObservedRunningTime="2026-03-13 14:24:05.683954037 +0000 UTC m=+1680.685542276" Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.875550 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.875773 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee" containerName="nova-api-log" 
containerID="cri-o://61b5d14b4c28a00305c933cc1ba3f69720704610d85741eff661575a32229c94" gracePeriod=30 Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.876244 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee" containerName="nova-api-api" containerID="cri-o://7f380ed1e03850f9a5390c93de71ece12783543d722324b0cebc7392d71d1cf4" gracePeriod=30 Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.907359 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.907795 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="9cfb3db3-7d46-4ab5-aecc-00ddd738d359" containerName="nova-scheduler-scheduler" containerID="cri-o://996fc46edbed8602f9eda3a09dc63ac36038779496a63c3ff3ba77a3b3a9e5b0" gracePeriod=30 Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.961971 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.962599 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a8db736b-00b7-4251-a667-3b2138c6c928" containerName="nova-metadata-log" containerID="cri-o://c0fcc6916c9c7951ac6f57b54e64b861fe8be03a65443f0a0008c4f458405d78" gracePeriod=30 Mar 13 14:24:05 crc kubenswrapper[4898]: I0313 14:24:05.963250 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a8db736b-00b7-4251-a667-3b2138c6c928" containerName="nova-metadata-metadata" containerID="cri-o://9bd9f3f02e15571b11f72778527812906d168be88196cc4314aa88f5c276ac6c" gracePeriod=30 Mar 13 14:24:06 crc kubenswrapper[4898]: I0313 14:24:06.046107 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.12:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 14:24:06 crc kubenswrapper[4898]: I0313 14:24:06.046413 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.12:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 14:24:06 crc kubenswrapper[4898]: I0313 14:24:06.632983 4898 generic.go:334] "Generic (PLEG): container finished" podID="a8db736b-00b7-4251-a667-3b2138c6c928" containerID="c0fcc6916c9c7951ac6f57b54e64b861fe8be03a65443f0a0008c4f458405d78" exitCode=143 Mar 13 14:24:06 crc kubenswrapper[4898]: I0313 14:24:06.633065 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a8db736b-00b7-4251-a667-3b2138c6c928","Type":"ContainerDied","Data":"c0fcc6916c9c7951ac6f57b54e64b861fe8be03a65443f0a0008c4f458405d78"} Mar 13 14:24:06 crc kubenswrapper[4898]: I0313 14:24:06.635609 4898 generic.go:334] "Generic (PLEG): container finished" podID="3a036241-2013-494e-8c1f-7584e9af2bf4" containerID="e1faed09b83d1750f543cd522c4a6bcbb14737623c5d254b367266c190bdbe2c" exitCode=0 Mar 13 14:24:06 crc kubenswrapper[4898]: I0313 14:24:06.635630 4898 generic.go:334] "Generic (PLEG): container finished" podID="3a036241-2013-494e-8c1f-7584e9af2bf4" containerID="9263a833a783d5b9000e728c8a69913160176a9e9c946c16d7fa08425ffcc556" exitCode=0 Mar 13 14:24:06 crc kubenswrapper[4898]: I0313 14:24:06.635642 4898 generic.go:334] "Generic (PLEG): container finished" podID="3a036241-2013-494e-8c1f-7584e9af2bf4" containerID="1e2c142eba973e7412047a391872c9d25e5735c5b05576796ed64cb74c786bb5" exitCode=0 Mar 13 14:24:06 crc kubenswrapper[4898]: I0313 14:24:06.635666 4898 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"3a036241-2013-494e-8c1f-7584e9af2bf4","Type":"ContainerDied","Data":"e1faed09b83d1750f543cd522c4a6bcbb14737623c5d254b367266c190bdbe2c"} Mar 13 14:24:06 crc kubenswrapper[4898]: I0313 14:24:06.635735 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"3a036241-2013-494e-8c1f-7584e9af2bf4","Type":"ContainerDied","Data":"9263a833a783d5b9000e728c8a69913160176a9e9c946c16d7fa08425ffcc556"} Mar 13 14:24:06 crc kubenswrapper[4898]: I0313 14:24:06.635750 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"3a036241-2013-494e-8c1f-7584e9af2bf4","Type":"ContainerDied","Data":"1e2c142eba973e7412047a391872c9d25e5735c5b05576796ed64cb74c786bb5"} Mar 13 14:24:06 crc kubenswrapper[4898]: I0313 14:24:06.640342 4898 generic.go:334] "Generic (PLEG): container finished" podID="9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee" containerID="61b5d14b4c28a00305c933cc1ba3f69720704610d85741eff661575a32229c94" exitCode=143 Mar 13 14:24:06 crc kubenswrapper[4898]: I0313 14:24:06.640638 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee","Type":"ContainerDied","Data":"61b5d14b4c28a00305c933cc1ba3f69720704610d85741eff661575a32229c94"} Mar 13 14:24:07 crc kubenswrapper[4898]: I0313 14:24:07.233159 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556864-bvpnd" Mar 13 14:24:07 crc kubenswrapper[4898]: E0313 14:24:07.334106 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="996fc46edbed8602f9eda3a09dc63ac36038779496a63c3ff3ba77a3b3a9e5b0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 14:24:07 crc kubenswrapper[4898]: E0313 14:24:07.335750 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="996fc46edbed8602f9eda3a09dc63ac36038779496a63c3ff3ba77a3b3a9e5b0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 14:24:07 crc kubenswrapper[4898]: I0313 14:24:07.336613 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59tc9\" (UniqueName: \"kubernetes.io/projected/4e5f381c-bbd8-40d9-8c76-efee5fb7023a-kube-api-access-59tc9\") pod \"4e5f381c-bbd8-40d9-8c76-efee5fb7023a\" (UID: \"4e5f381c-bbd8-40d9-8c76-efee5fb7023a\") " Mar 13 14:24:07 crc kubenswrapper[4898]: E0313 14:24:07.339118 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="996fc46edbed8602f9eda3a09dc63ac36038779496a63c3ff3ba77a3b3a9e5b0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 14:24:07 crc kubenswrapper[4898]: E0313 14:24:07.339159 4898 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="9cfb3db3-7d46-4ab5-aecc-00ddd738d359" 
containerName="nova-scheduler-scheduler" Mar 13 14:24:07 crc kubenswrapper[4898]: I0313 14:24:07.341019 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e5f381c-bbd8-40d9-8c76-efee5fb7023a-kube-api-access-59tc9" (OuterVolumeSpecName: "kube-api-access-59tc9") pod "4e5f381c-bbd8-40d9-8c76-efee5fb7023a" (UID: "4e5f381c-bbd8-40d9-8c76-efee5fb7023a"). InnerVolumeSpecName "kube-api-access-59tc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:24:07 crc kubenswrapper[4898]: I0313 14:24:07.439664 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59tc9\" (UniqueName: \"kubernetes.io/projected/4e5f381c-bbd8-40d9-8c76-efee5fb7023a-kube-api-access-59tc9\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:07 crc kubenswrapper[4898]: I0313 14:24:07.657139 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556864-bvpnd" event={"ID":"4e5f381c-bbd8-40d9-8c76-efee5fb7023a","Type":"ContainerDied","Data":"568aace14eba72f5295e81bc2141ff4ab6bf2d47b3914a6e46161ad7ea2b9751"} Mar 13 14:24:07 crc kubenswrapper[4898]: I0313 14:24:07.657506 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="568aace14eba72f5295e81bc2141ff4ab6bf2d47b3914a6e46161ad7ea2b9751" Mar 13 14:24:07 crc kubenswrapper[4898]: I0313 14:24:07.657388 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556864-bvpnd" Mar 13 14:24:07 crc kubenswrapper[4898]: I0313 14:24:07.660633 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7","Type":"ContainerStarted","Data":"795e7834e563381b54df39b3e37495b74bd94fd75981de1568268bd084ae046d"} Mar 13 14:24:08 crc kubenswrapper[4898]: I0313 14:24:08.242684 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556858-vdbkv"] Mar 13 14:24:08 crc kubenswrapper[4898]: I0313 14:24:08.257144 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556858-vdbkv"] Mar 13 14:24:08 crc kubenswrapper[4898]: I0313 14:24:08.682156 4898 generic.go:334] "Generic (PLEG): container finished" podID="c75348dc-b6ff-43ff-bd9a-d84c91f23ea8" containerID="b4e6b7a97336f9db7d45fac3ccbf9ee16b3f41f5a6729150379834717d5666ca" exitCode=0 Mar 13 14:24:08 crc kubenswrapper[4898]: I0313 14:24:08.682373 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" containerName="ceilometer-central-agent" containerID="cri-o://66438918ebccde6ac554af32fd8660905c8d96c674c33aa3ef4ebeb984883811" gracePeriod=30 Mar 13 14:24:08 crc kubenswrapper[4898]: I0313 14:24:08.683010 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-plhgx" event={"ID":"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8","Type":"ContainerDied","Data":"b4e6b7a97336f9db7d45fac3ccbf9ee16b3f41f5a6729150379834717d5666ca"} Mar 13 14:24:08 crc kubenswrapper[4898]: I0313 14:24:08.683143 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" containerName="proxy-httpd" containerID="cri-o://795e7834e563381b54df39b3e37495b74bd94fd75981de1568268bd084ae046d" 
gracePeriod=30 Mar 13 14:24:08 crc kubenswrapper[4898]: I0313 14:24:08.683327 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" containerName="ceilometer-notification-agent" containerID="cri-o://7cfa2af5e418e74781b5abeafa5d792bceacbddbc2bc25e1e25fd94e21acc493" gracePeriod=30 Mar 13 14:24:08 crc kubenswrapper[4898]: I0313 14:24:08.683673 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 14:24:08 crc kubenswrapper[4898]: I0313 14:24:08.683760 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" containerName="sg-core" containerID="cri-o://1fd92eb6403fb0dbf337c5f7f0d057b61a6f0d6c15c82bc4970d631ddca294d6" gracePeriod=30 Mar 13 14:24:08 crc kubenswrapper[4898]: I0313 14:24:08.745708 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.386722591 podStartE2EDuration="7.745681948s" podCreationTimestamp="2026-03-13 14:24:01 +0000 UTC" firstStartedPulling="2026-03-13 14:24:02.645048969 +0000 UTC m=+1677.646637218" lastFinishedPulling="2026-03-13 14:24:07.004008336 +0000 UTC m=+1682.005596575" observedRunningTime="2026-03-13 14:24:08.731218073 +0000 UTC m=+1683.732806322" watchObservedRunningTime="2026-03-13 14:24:08.745681948 +0000 UTC m=+1683.747270207" Mar 13 14:24:09 crc kubenswrapper[4898]: I0313 14:24:09.699189 4898 generic.go:334] "Generic (PLEG): container finished" podID="6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" containerID="795e7834e563381b54df39b3e37495b74bd94fd75981de1568268bd084ae046d" exitCode=0 Mar 13 14:24:09 crc kubenswrapper[4898]: I0313 14:24:09.699455 4898 generic.go:334] "Generic (PLEG): container finished" podID="6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" containerID="1fd92eb6403fb0dbf337c5f7f0d057b61a6f0d6c15c82bc4970d631ddca294d6" exitCode=2 
Mar 13 14:24:09 crc kubenswrapper[4898]: I0313 14:24:09.699463 4898 generic.go:334] "Generic (PLEG): container finished" podID="6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" containerID="7cfa2af5e418e74781b5abeafa5d792bceacbddbc2bc25e1e25fd94e21acc493" exitCode=0 Mar 13 14:24:09 crc kubenswrapper[4898]: I0313 14:24:09.699257 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7","Type":"ContainerDied","Data":"795e7834e563381b54df39b3e37495b74bd94fd75981de1568268bd084ae046d"} Mar 13 14:24:09 crc kubenswrapper[4898]: I0313 14:24:09.699530 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7","Type":"ContainerDied","Data":"1fd92eb6403fb0dbf337c5f7f0d057b61a6f0d6c15c82bc4970d631ddca294d6"} Mar 13 14:24:09 crc kubenswrapper[4898]: I0313 14:24:09.699544 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7","Type":"ContainerDied","Data":"7cfa2af5e418e74781b5abeafa5d792bceacbddbc2bc25e1e25fd94e21acc493"} Mar 13 14:24:09 crc kubenswrapper[4898]: I0313 14:24:09.703472 4898 generic.go:334] "Generic (PLEG): container finished" podID="a8db736b-00b7-4251-a667-3b2138c6c928" containerID="9bd9f3f02e15571b11f72778527812906d168be88196cc4314aa88f5c276ac6c" exitCode=0 Mar 13 14:24:09 crc kubenswrapper[4898]: I0313 14:24:09.703516 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a8db736b-00b7-4251-a667-3b2138c6c928","Type":"ContainerDied","Data":"9bd9f3f02e15571b11f72778527812906d168be88196cc4314aa88f5c276ac6c"} Mar 13 14:24:09 crc kubenswrapper[4898]: I0313 14:24:09.758656 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b0610af-1f13-4f43-9249-8d50a0dcbc14" path="/var/lib/kubelet/pods/1b0610af-1f13-4f43-9249-8d50a0dcbc14/volumes" Mar 13 14:24:10 crc kubenswrapper[4898]: I0313 
14:24:10.723493 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-plhgx" event={"ID":"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8","Type":"ContainerStarted","Data":"201337bedc1dace7fd7574e59c5253e0ff92dacc7211debdde1f18e42c9daac1"} Mar 13 14:24:10 crc kubenswrapper[4898]: I0313 14:24:10.728877 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a8db736b-00b7-4251-a667-3b2138c6c928","Type":"ContainerDied","Data":"2f2f835653ef4ac86c3f5f419dc3f2bdfd6ae25a7d99a1468342cb1b296536ca"} Mar 13 14:24:10 crc kubenswrapper[4898]: I0313 14:24:10.728968 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f2f835653ef4ac86c3f5f419dc3f2bdfd6ae25a7d99a1468342cb1b296536ca" Mar 13 14:24:10 crc kubenswrapper[4898]: I0313 14:24:10.757198 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-plhgx" podStartSLOduration=3.610090929 podStartE2EDuration="9.757148084s" podCreationTimestamp="2026-03-13 14:24:01 +0000 UTC" firstStartedPulling="2026-03-13 14:24:03.561206532 +0000 UTC m=+1678.562794771" lastFinishedPulling="2026-03-13 14:24:09.708263687 +0000 UTC m=+1684.709851926" observedRunningTime="2026-03-13 14:24:10.74709579 +0000 UTC m=+1685.748684049" watchObservedRunningTime="2026-03-13 14:24:10.757148084 +0000 UTC m=+1685.758736333" Mar 13 14:24:10 crc kubenswrapper[4898]: I0313 14:24:10.776967 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 14:24:10 crc kubenswrapper[4898]: I0313 14:24:10.832510 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tg4w5\" (UniqueName: \"kubernetes.io/projected/a8db736b-00b7-4251-a667-3b2138c6c928-kube-api-access-tg4w5\") pod \"a8db736b-00b7-4251-a667-3b2138c6c928\" (UID: \"a8db736b-00b7-4251-a667-3b2138c6c928\") " Mar 13 14:24:10 crc kubenswrapper[4898]: I0313 14:24:10.832608 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8db736b-00b7-4251-a667-3b2138c6c928-nova-metadata-tls-certs\") pod \"a8db736b-00b7-4251-a667-3b2138c6c928\" (UID: \"a8db736b-00b7-4251-a667-3b2138c6c928\") " Mar 13 14:24:10 crc kubenswrapper[4898]: I0313 14:24:10.832715 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8db736b-00b7-4251-a667-3b2138c6c928-combined-ca-bundle\") pod \"a8db736b-00b7-4251-a667-3b2138c6c928\" (UID: \"a8db736b-00b7-4251-a667-3b2138c6c928\") " Mar 13 14:24:10 crc kubenswrapper[4898]: I0313 14:24:10.832834 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8db736b-00b7-4251-a667-3b2138c6c928-config-data\") pod \"a8db736b-00b7-4251-a667-3b2138c6c928\" (UID: \"a8db736b-00b7-4251-a667-3b2138c6c928\") " Mar 13 14:24:10 crc kubenswrapper[4898]: I0313 14:24:10.832876 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8db736b-00b7-4251-a667-3b2138c6c928-logs\") pod \"a8db736b-00b7-4251-a667-3b2138c6c928\" (UID: \"a8db736b-00b7-4251-a667-3b2138c6c928\") " Mar 13 14:24:10 crc kubenswrapper[4898]: I0313 14:24:10.838693 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/a8db736b-00b7-4251-a667-3b2138c6c928-logs" (OuterVolumeSpecName: "logs") pod "a8db736b-00b7-4251-a667-3b2138c6c928" (UID: "a8db736b-00b7-4251-a667-3b2138c6c928"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:24:10 crc kubenswrapper[4898]: I0313 14:24:10.846759 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8db736b-00b7-4251-a667-3b2138c6c928-kube-api-access-tg4w5" (OuterVolumeSpecName: "kube-api-access-tg4w5") pod "a8db736b-00b7-4251-a667-3b2138c6c928" (UID: "a8db736b-00b7-4251-a667-3b2138c6c928"). InnerVolumeSpecName "kube-api-access-tg4w5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:24:10 crc kubenswrapper[4898]: I0313 14:24:10.884708 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8db736b-00b7-4251-a667-3b2138c6c928-config-data" (OuterVolumeSpecName: "config-data") pod "a8db736b-00b7-4251-a667-3b2138c6c928" (UID: "a8db736b-00b7-4251-a667-3b2138c6c928"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:10 crc kubenswrapper[4898]: I0313 14:24:10.887428 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8db736b-00b7-4251-a667-3b2138c6c928-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a8db736b-00b7-4251-a667-3b2138c6c928" (UID: "a8db736b-00b7-4251-a667-3b2138c6c928"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:10 crc kubenswrapper[4898]: I0313 14:24:10.917083 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8db736b-00b7-4251-a667-3b2138c6c928-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "a8db736b-00b7-4251-a667-3b2138c6c928" (UID: "a8db736b-00b7-4251-a667-3b2138c6c928"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:10 crc kubenswrapper[4898]: I0313 14:24:10.936113 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tg4w5\" (UniqueName: \"kubernetes.io/projected/a8db736b-00b7-4251-a667-3b2138c6c928-kube-api-access-tg4w5\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:10 crc kubenswrapper[4898]: I0313 14:24:10.936146 4898 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8db736b-00b7-4251-a667-3b2138c6c928-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:10 crc kubenswrapper[4898]: I0313 14:24:10.936156 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8db736b-00b7-4251-a667-3b2138c6c928-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:10 crc kubenswrapper[4898]: I0313 14:24:10.936165 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8db736b-00b7-4251-a667-3b2138c6c928-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:10 crc kubenswrapper[4898]: I0313 14:24:10.936175 4898 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8db736b-00b7-4251-a667-3b2138c6c928-logs\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.741956 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" Mar 13 14:24:11 crc kubenswrapper[4898]: E0313 14:24:11.745098 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.753764 4898 generic.go:334] "Generic (PLEG): container finished" podID="6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" containerID="66438918ebccde6ac554af32fd8660905c8d96c674c33aa3ef4ebeb984883811" exitCode=0 Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.755694 4898 generic.go:334] "Generic (PLEG): container finished" podID="9cfb3db3-7d46-4ab5-aecc-00ddd738d359" containerID="996fc46edbed8602f9eda3a09dc63ac36038779496a63c3ff3ba77a3b3a9e5b0" exitCode=0 Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.755760 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.756704 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7","Type":"ContainerDied","Data":"66438918ebccde6ac554af32fd8660905c8d96c674c33aa3ef4ebeb984883811"} Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.756733 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9cfb3db3-7d46-4ab5-aecc-00ddd738d359","Type":"ContainerDied","Data":"996fc46edbed8602f9eda3a09dc63ac36038779496a63c3ff3ba77a3b3a9e5b0"} Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.802332 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-plhgx" Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.802398 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-plhgx" Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.817983 4898 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/nova-metadata-0"] Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.842428 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.857344 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 13 14:24:11 crc kubenswrapper[4898]: E0313 14:24:11.858597 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e5f381c-bbd8-40d9-8c76-efee5fb7023a" containerName="oc" Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.858616 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e5f381c-bbd8-40d9-8c76-efee5fb7023a" containerName="oc" Mar 13 14:24:11 crc kubenswrapper[4898]: E0313 14:24:11.858637 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8db736b-00b7-4251-a667-3b2138c6c928" containerName="nova-metadata-metadata" Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.858644 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8db736b-00b7-4251-a667-3b2138c6c928" containerName="nova-metadata-metadata" Mar 13 14:24:11 crc kubenswrapper[4898]: E0313 14:24:11.858657 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53" containerName="nova-manage" Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.858663 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53" containerName="nova-manage" Mar 13 14:24:11 crc kubenswrapper[4898]: E0313 14:24:11.858680 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8db736b-00b7-4251-a667-3b2138c6c928" containerName="nova-metadata-log" Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.858687 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8db736b-00b7-4251-a667-3b2138c6c928" containerName="nova-metadata-log" Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.858918 4898 
memory_manager.go:354] "RemoveStaleState removing state" podUID="a8db736b-00b7-4251-a667-3b2138c6c928" containerName="nova-metadata-log" Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.858943 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53" containerName="nova-manage" Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.858960 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8db736b-00b7-4251-a667-3b2138c6c928" containerName="nova-metadata-metadata" Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.858972 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e5f381c-bbd8-40d9-8c76-efee5fb7023a" containerName="oc" Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.860432 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.865298 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.865436 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.887367 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.965526 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17\") " pod="openstack/nova-metadata-0" Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.965583 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17-logs\") pod \"nova-metadata-0\" (UID: \"8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17\") " pod="openstack/nova-metadata-0" Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.965646 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17-config-data\") pod \"nova-metadata-0\" (UID: \"8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17\") " pod="openstack/nova-metadata-0" Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.965693 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17\") " pod="openstack/nova-metadata-0" Mar 13 14:24:11 crc kubenswrapper[4898]: I0313 14:24:11.965816 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dlfg\" (UniqueName: \"kubernetes.io/projected/8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17-kube-api-access-7dlfg\") pod \"nova-metadata-0\" (UID: \"8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17\") " pod="openstack/nova-metadata-0" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.068305 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dlfg\" (UniqueName: \"kubernetes.io/projected/8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17-kube-api-access-7dlfg\") pod \"nova-metadata-0\" (UID: \"8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17\") " pod="openstack/nova-metadata-0" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.068347 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17-nova-metadata-tls-certs\") pod 
\"nova-metadata-0\" (UID: \"8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17\") " pod="openstack/nova-metadata-0" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.068385 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17-logs\") pod \"nova-metadata-0\" (UID: \"8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17\") " pod="openstack/nova-metadata-0" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.068438 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17-config-data\") pod \"nova-metadata-0\" (UID: \"8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17\") " pod="openstack/nova-metadata-0" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.068484 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17\") " pod="openstack/nova-metadata-0" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.069198 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17-logs\") pod \"nova-metadata-0\" (UID: \"8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17\") " pod="openstack/nova-metadata-0" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.076384 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17\") " pod="openstack/nova-metadata-0" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.078963 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17-config-data\") pod \"nova-metadata-0\" (UID: \"8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17\") " pod="openstack/nova-metadata-0" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.079622 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17\") " pod="openstack/nova-metadata-0" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.083365 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dlfg\" (UniqueName: \"kubernetes.io/projected/8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17-kube-api-access-7dlfg\") pod \"nova-metadata-0\" (UID: \"8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17\") " pod="openstack/nova-metadata-0" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.183856 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.195876 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.202300 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.283805 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cfb3db3-7d46-4ab5-aecc-00ddd738d359-combined-ca-bundle\") pod \"9cfb3db3-7d46-4ab5-aecc-00ddd738d359\" (UID: \"9cfb3db3-7d46-4ab5-aecc-00ddd738d359\") " Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.283879 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26jcq\" (UniqueName: \"kubernetes.io/projected/9cfb3db3-7d46-4ab5-aecc-00ddd738d359-kube-api-access-26jcq\") pod \"9cfb3db3-7d46-4ab5-aecc-00ddd738d359\" (UID: \"9cfb3db3-7d46-4ab5-aecc-00ddd738d359\") " Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.283956 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-log-httpd\") pod \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.284070 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-combined-ca-bundle\") pod \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.284090 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-scripts\") pod \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.284141 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-run-httpd\") pod \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.284174 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-sg-core-conf-yaml\") pod \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.284223 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cfb3db3-7d46-4ab5-aecc-00ddd738d359-config-data\") pod \"9cfb3db3-7d46-4ab5-aecc-00ddd738d359\" (UID: \"9cfb3db3-7d46-4ab5-aecc-00ddd738d359\") " Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.284298 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8cbv\" (UniqueName: \"kubernetes.io/projected/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-kube-api-access-r8cbv\") pod \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.284319 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-config-data\") pod \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\" (UID: \"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7\") " Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.288151 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" (UID: "6ef06426-d2da-4ad2-8168-1ca91c9ca2a7"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.289520 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-scripts" (OuterVolumeSpecName: "scripts") pod "6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" (UID: "6ef06426-d2da-4ad2-8168-1ca91c9ca2a7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.298532 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cfb3db3-7d46-4ab5-aecc-00ddd738d359-kube-api-access-26jcq" (OuterVolumeSpecName: "kube-api-access-26jcq") pod "9cfb3db3-7d46-4ab5-aecc-00ddd738d359" (UID: "9cfb3db3-7d46-4ab5-aecc-00ddd738d359"). InnerVolumeSpecName "kube-api-access-26jcq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.298829 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" (UID: "6ef06426-d2da-4ad2-8168-1ca91c9ca2a7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.308124 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-kube-api-access-r8cbv" (OuterVolumeSpecName: "kube-api-access-r8cbv") pod "6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" (UID: "6ef06426-d2da-4ad2-8168-1ca91c9ca2a7"). InnerVolumeSpecName "kube-api-access-r8cbv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.327410 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" (UID: "6ef06426-d2da-4ad2-8168-1ca91c9ca2a7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.327718 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cfb3db3-7d46-4ab5-aecc-00ddd738d359-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9cfb3db3-7d46-4ab5-aecc-00ddd738d359" (UID: "9cfb3db3-7d46-4ab5-aecc-00ddd738d359"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.362723 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cfb3db3-7d46-4ab5-aecc-00ddd738d359-config-data" (OuterVolumeSpecName: "config-data") pod "9cfb3db3-7d46-4ab5-aecc-00ddd738d359" (UID: "9cfb3db3-7d46-4ab5-aecc-00ddd738d359"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.389353 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cfb3db3-7d46-4ab5-aecc-00ddd738d359-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.389379 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26jcq\" (UniqueName: \"kubernetes.io/projected/9cfb3db3-7d46-4ab5-aecc-00ddd738d359-kube-api-access-26jcq\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.389392 4898 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.389400 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.389409 4898 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.389416 4898 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.389424 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cfb3db3-7d46-4ab5-aecc-00ddd738d359-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.389432 4898 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-r8cbv\" (UniqueName: \"kubernetes.io/projected/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-kube-api-access-r8cbv\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.424746 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" (UID: "6ef06426-d2da-4ad2-8168-1ca91c9ca2a7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.460737 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-config-data" (OuterVolumeSpecName: "config-data") pod "6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" (UID: "6ef06426-d2da-4ad2-8168-1ca91c9ca2a7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.491876 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.492191 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.739868 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.790698 4898 generic.go:334] "Generic (PLEG): container finished" podID="9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee" containerID="7f380ed1e03850f9a5390c93de71ece12783543d722324b0cebc7392d71d1cf4" exitCode=0 Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.790763 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee","Type":"ContainerDied","Data":"7f380ed1e03850f9a5390c93de71ece12783543d722324b0cebc7392d71d1cf4"} Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.795806 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.795937 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6ef06426-d2da-4ad2-8168-1ca91c9ca2a7","Type":"ContainerDied","Data":"14d2db4ac83fec75f96ff83117a0a7f835b4698b6e0a2c106a080e228485900f"} Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.796007 4898 scope.go:117] "RemoveContainer" containerID="795e7834e563381b54df39b3e37495b74bd94fd75981de1568268bd084ae046d" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.802445 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.802481 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9cfb3db3-7d46-4ab5-aecc-00ddd738d359","Type":"ContainerDied","Data":"4fb43fc1071513c1a034ec9b0dda28c7b02d6ddc884f93f9664159fa9c0ff74c"} Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.807224 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17","Type":"ContainerStarted","Data":"6811861f93d3b2a6d535376e944114be9a28d87de96ba9faef32c89bda3c7c57"} Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.827557 4898 scope.go:117] "RemoveContainer" containerID="1fd92eb6403fb0dbf337c5f7f0d057b61a6f0d6c15c82bc4970d631ddca294d6" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.893197 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.907102 4898 scope.go:117] "RemoveContainer" containerID="7cfa2af5e418e74781b5abeafa5d792bceacbddbc2bc25e1e25fd94e21acc493" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.916844 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:24:12 crc 
kubenswrapper[4898]: I0313 14:24:12.934759 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.944009 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-plhgx" podUID="c75348dc-b6ff-43ff-bd9a-d84c91f23ea8" containerName="registry-server" probeResult="failure" output=< Mar 13 14:24:12 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 14:24:12 crc kubenswrapper[4898]: > Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.948386 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.968952 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:24:12 crc kubenswrapper[4898]: E0313 14:24:12.969445 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" containerName="ceilometer-notification-agent" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.969465 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" containerName="ceilometer-notification-agent" Mar 13 14:24:12 crc kubenswrapper[4898]: E0313 14:24:12.969475 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" containerName="sg-core" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.969482 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" containerName="sg-core" Mar 13 14:24:12 crc kubenswrapper[4898]: E0313 14:24:12.969515 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cfb3db3-7d46-4ab5-aecc-00ddd738d359" containerName="nova-scheduler-scheduler" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.969521 4898 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9cfb3db3-7d46-4ab5-aecc-00ddd738d359" containerName="nova-scheduler-scheduler" Mar 13 14:24:12 crc kubenswrapper[4898]: E0313 14:24:12.969549 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" containerName="ceilometer-central-agent" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.969556 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" containerName="ceilometer-central-agent" Mar 13 14:24:12 crc kubenswrapper[4898]: E0313 14:24:12.969568 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" containerName="proxy-httpd" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.969574 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" containerName="proxy-httpd" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.969806 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" containerName="ceilometer-central-agent" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.969824 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" containerName="proxy-httpd" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.969846 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" containerName="ceilometer-notification-agent" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.969859 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cfb3db3-7d46-4ab5-aecc-00ddd738d359" containerName="nova-scheduler-scheduler" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.969867 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" containerName="sg-core" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.977060 4898 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.980161 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.980458 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.983187 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.985532 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.987161 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 13 14:24:12 crc kubenswrapper[4898]: I0313 14:24:12.991480 4898 scope.go:117] "RemoveContainer" containerID="66438918ebccde6ac554af32fd8660905c8d96c674c33aa3ef4ebeb984883811" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.006504 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d56db73-0e9e-47af-b0bd-77231fe40077-log-httpd\") pod \"ceilometer-0\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.006569 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk9pt\" (UniqueName: \"kubernetes.io/projected/0d56db73-0e9e-47af-b0bd-77231fe40077-kube-api-access-tk9pt\") pod \"ceilometer-0\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.006609 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d56db73-0e9e-47af-b0bd-77231fe40077-run-httpd\") pod \"ceilometer-0\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.006656 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.006710 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.006784 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-config-data\") pod \"ceilometer-0\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.006824 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-scripts\") pod \"ceilometer-0\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.009619 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.015705 4898 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.015758 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.036807 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.112749 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d56db73-0e9e-47af-b0bd-77231fe40077-log-httpd\") pod \"ceilometer-0\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.112789 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tk9pt\" (UniqueName: \"kubernetes.io/projected/0d56db73-0e9e-47af-b0bd-77231fe40077-kube-api-access-tk9pt\") pod \"ceilometer-0\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.112821 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d56db73-0e9e-47af-b0bd-77231fe40077-run-httpd\") pod \"ceilometer-0\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.112845 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.112882 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.112925 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97d388e1-b1b3-409d-b7c5-38b37734a8e6-config-data\") pod \"nova-scheduler-0\" (UID: \"97d388e1-b1b3-409d-b7c5-38b37734a8e6\") " pod="openstack/nova-scheduler-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.112984 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-config-data\") pod \"ceilometer-0\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.113007 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9whj8\" (UniqueName: \"kubernetes.io/projected/97d388e1-b1b3-409d-b7c5-38b37734a8e6-kube-api-access-9whj8\") pod \"nova-scheduler-0\" (UID: \"97d388e1-b1b3-409d-b7c5-38b37734a8e6\") " pod="openstack/nova-scheduler-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.113153 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-scripts\") pod \"ceilometer-0\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.113266 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97d388e1-b1b3-409d-b7c5-38b37734a8e6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"97d388e1-b1b3-409d-b7c5-38b37734a8e6\") " 
pod="openstack/nova-scheduler-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.113791 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d56db73-0e9e-47af-b0bd-77231fe40077-log-httpd\") pod \"ceilometer-0\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.114329 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d56db73-0e9e-47af-b0bd-77231fe40077-run-httpd\") pod \"ceilometer-0\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.118655 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.119357 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.123086 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-scripts\") pod \"ceilometer-0\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.124675 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-config-data\") pod 
\"ceilometer-0\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.130560 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk9pt\" (UniqueName: \"kubernetes.io/projected/0d56db73-0e9e-47af-b0bd-77231fe40077-kube-api-access-tk9pt\") pod \"ceilometer-0\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.147394 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.159663 4898 scope.go:117] "RemoveContainer" containerID="996fc46edbed8602f9eda3a09dc63ac36038779496a63c3ff3ba77a3b3a9e5b0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.214110 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thsgw\" (UniqueName: \"kubernetes.io/projected/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-kube-api-access-thsgw\") pod \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.214334 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-internal-tls-certs\") pod \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.214372 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-logs\") pod \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.214533 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-combined-ca-bundle\") pod \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.214794 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-config-data\") pod \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.214816 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-public-tls-certs\") pod \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\" (UID: \"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee\") " Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.214852 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-logs" (OuterVolumeSpecName: "logs") pod "9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee" (UID: "9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.215768 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97d388e1-b1b3-409d-b7c5-38b37734a8e6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"97d388e1-b1b3-409d-b7c5-38b37734a8e6\") " pod="openstack/nova-scheduler-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.216088 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97d388e1-b1b3-409d-b7c5-38b37734a8e6-config-data\") pod \"nova-scheduler-0\" (UID: \"97d388e1-b1b3-409d-b7c5-38b37734a8e6\") " pod="openstack/nova-scheduler-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.216195 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9whj8\" (UniqueName: \"kubernetes.io/projected/97d388e1-b1b3-409d-b7c5-38b37734a8e6-kube-api-access-9whj8\") pod \"nova-scheduler-0\" (UID: \"97d388e1-b1b3-409d-b7c5-38b37734a8e6\") " pod="openstack/nova-scheduler-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.216328 4898 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-logs\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.217434 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-kube-api-access-thsgw" (OuterVolumeSpecName: "kube-api-access-thsgw") pod "9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee" (UID: "9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee"). InnerVolumeSpecName "kube-api-access-thsgw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.222190 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97d388e1-b1b3-409d-b7c5-38b37734a8e6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"97d388e1-b1b3-409d-b7c5-38b37734a8e6\") " pod="openstack/nova-scheduler-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.222519 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97d388e1-b1b3-409d-b7c5-38b37734a8e6-config-data\") pod \"nova-scheduler-0\" (UID: \"97d388e1-b1b3-409d-b7c5-38b37734a8e6\") " pod="openstack/nova-scheduler-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.235910 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9whj8\" (UniqueName: \"kubernetes.io/projected/97d388e1-b1b3-409d-b7c5-38b37734a8e6-kube-api-access-9whj8\") pod \"nova-scheduler-0\" (UID: \"97d388e1-b1b3-409d-b7c5-38b37734a8e6\") " pod="openstack/nova-scheduler-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.315555 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-config-data" (OuterVolumeSpecName: "config-data") pod "9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee" (UID: "9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.319472 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.319503 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thsgw\" (UniqueName: \"kubernetes.io/projected/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-kube-api-access-thsgw\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.371240 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee" (UID: "9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.397881 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee" (UID: "9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.408292 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee" (UID: "9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.421180 4898 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.421209 4898 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.421220 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.426755 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.438328 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.755632 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ef06426-d2da-4ad2-8168-1ca91c9ca2a7" path="/var/lib/kubelet/pods/6ef06426-d2da-4ad2-8168-1ca91c9ca2a7/volumes" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.757174 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cfb3db3-7d46-4ab5-aecc-00ddd738d359" path="/var/lib/kubelet/pods/9cfb3db3-7d46-4ab5-aecc-00ddd738d359/volumes" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.757836 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8db736b-00b7-4251-a667-3b2138c6c928" path="/var/lib/kubelet/pods/a8db736b-00b7-4251-a667-3b2138c6c928/volumes" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.859714 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17","Type":"ContainerStarted","Data":"af208a37c486aa4aae0677032ff0e628bf9613329753d31ce19db678648487fb"} Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.859785 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17","Type":"ContainerStarted","Data":"97673219690d9afe13f9d5f67d427b244d491868266d4831070e61c4d58caaf1"} Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.864267 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.865017 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee","Type":"ContainerDied","Data":"e928948e8aff5ca3e9c4f8ba788b16341b26756ed4665c742a244420ba53b0dc"} Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.865447 4898 scope.go:117] "RemoveContainer" containerID="7f380ed1e03850f9a5390c93de71ece12783543d722324b0cebc7392d71d1cf4" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.897472 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.897453862 podStartE2EDuration="2.897453862s" podCreationTimestamp="2026-03-13 14:24:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:24:13.884556303 +0000 UTC m=+1688.886144562" watchObservedRunningTime="2026-03-13 14:24:13.897453862 +0000 UTC m=+1688.899042111" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.912726 4898 scope.go:117] "RemoveContainer" containerID="61b5d14b4c28a00305c933cc1ba3f69720704610d85741eff661575a32229c94" Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.978048 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 14:24:13 crc kubenswrapper[4898]: I0313 14:24:13.991407 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.003572 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 13 14:24:14 crc kubenswrapper[4898]: E0313 14:24:14.004158 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee" containerName="nova-api-api" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.004172 4898 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee" containerName="nova-api-api" Mar 13 14:24:14 crc kubenswrapper[4898]: E0313 14:24:14.004187 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee" containerName="nova-api-log" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.004193 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee" containerName="nova-api-log" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.004459 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee" containerName="nova-api-log" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.004491 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee" containerName="nova-api-api" Mar 13 14:24:14 crc kubenswrapper[4898]: E0313 14:24:14.006282 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b7e79b7_e581_4429_b3d4_9dd7ec5e79ee.slice\": RecentStats: unable to find data in memory cache]" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.007817 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.011002 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.011864 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.012235 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.016216 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.029826 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.054428 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7dd576-1005-4fdb-95c1-e5da9f04b177-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ef7dd576-1005-4fdb-95c1-e5da9f04b177\") " pod="openstack/nova-api-0" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.054496 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef7dd576-1005-4fdb-95c1-e5da9f04b177-config-data\") pod \"nova-api-0\" (UID: \"ef7dd576-1005-4fdb-95c1-e5da9f04b177\") " pod="openstack/nova-api-0" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.054525 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef7dd576-1005-4fdb-95c1-e5da9f04b177-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ef7dd576-1005-4fdb-95c1-e5da9f04b177\") " pod="openstack/nova-api-0" Mar 13 14:24:14 crc 
kubenswrapper[4898]: I0313 14:24:14.054626 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef7dd576-1005-4fdb-95c1-e5da9f04b177-logs\") pod \"nova-api-0\" (UID: \"ef7dd576-1005-4fdb-95c1-e5da9f04b177\") " pod="openstack/nova-api-0" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.054652 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7dd576-1005-4fdb-95c1-e5da9f04b177-public-tls-certs\") pod \"nova-api-0\" (UID: \"ef7dd576-1005-4fdb-95c1-e5da9f04b177\") " pod="openstack/nova-api-0" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.054836 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7ds9\" (UniqueName: \"kubernetes.io/projected/ef7dd576-1005-4fdb-95c1-e5da9f04b177-kube-api-access-w7ds9\") pod \"nova-api-0\" (UID: \"ef7dd576-1005-4fdb-95c1-e5da9f04b177\") " pod="openstack/nova-api-0" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.087202 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.156822 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef7dd576-1005-4fdb-95c1-e5da9f04b177-config-data\") pod \"nova-api-0\" (UID: \"ef7dd576-1005-4fdb-95c1-e5da9f04b177\") " pod="openstack/nova-api-0" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.157234 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef7dd576-1005-4fdb-95c1-e5da9f04b177-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ef7dd576-1005-4fdb-95c1-e5da9f04b177\") " pod="openstack/nova-api-0" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 
14:24:14.157281 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef7dd576-1005-4fdb-95c1-e5da9f04b177-logs\") pod \"nova-api-0\" (UID: \"ef7dd576-1005-4fdb-95c1-e5da9f04b177\") " pod="openstack/nova-api-0" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.157307 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7dd576-1005-4fdb-95c1-e5da9f04b177-public-tls-certs\") pod \"nova-api-0\" (UID: \"ef7dd576-1005-4fdb-95c1-e5da9f04b177\") " pod="openstack/nova-api-0" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.157474 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7ds9\" (UniqueName: \"kubernetes.io/projected/ef7dd576-1005-4fdb-95c1-e5da9f04b177-kube-api-access-w7ds9\") pod \"nova-api-0\" (UID: \"ef7dd576-1005-4fdb-95c1-e5da9f04b177\") " pod="openstack/nova-api-0" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.157562 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7dd576-1005-4fdb-95c1-e5da9f04b177-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ef7dd576-1005-4fdb-95c1-e5da9f04b177\") " pod="openstack/nova-api-0" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.158499 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ef7dd576-1005-4fdb-95c1-e5da9f04b177-logs\") pod \"nova-api-0\" (UID: \"ef7dd576-1005-4fdb-95c1-e5da9f04b177\") " pod="openstack/nova-api-0" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.161716 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef7dd576-1005-4fdb-95c1-e5da9f04b177-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"ef7dd576-1005-4fdb-95c1-e5da9f04b177\") " pod="openstack/nova-api-0" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.162771 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7dd576-1005-4fdb-95c1-e5da9f04b177-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ef7dd576-1005-4fdb-95c1-e5da9f04b177\") " pod="openstack/nova-api-0" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.163009 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ef7dd576-1005-4fdb-95c1-e5da9f04b177-public-tls-certs\") pod \"nova-api-0\" (UID: \"ef7dd576-1005-4fdb-95c1-e5da9f04b177\") " pod="openstack/nova-api-0" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.170114 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef7dd576-1005-4fdb-95c1-e5da9f04b177-config-data\") pod \"nova-api-0\" (UID: \"ef7dd576-1005-4fdb-95c1-e5da9f04b177\") " pod="openstack/nova-api-0" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.175298 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7ds9\" (UniqueName: \"kubernetes.io/projected/ef7dd576-1005-4fdb-95c1-e5da9f04b177-kube-api-access-w7ds9\") pod \"nova-api-0\" (UID: \"ef7dd576-1005-4fdb-95c1-e5da9f04b177\") " pod="openstack/nova-api-0" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.359392 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.860744 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.889631 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d56db73-0e9e-47af-b0bd-77231fe40077","Type":"ContainerStarted","Data":"f93c4099691f41a073e964731ace4e3b38e62bd41c90cb8a30591394175252ce"} Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.889834 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d56db73-0e9e-47af-b0bd-77231fe40077","Type":"ContainerStarted","Data":"a79c1dd41b2637198b516933e99a0d19e49eb0f06da00b903e6a8bd5f7dcb1dc"} Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.893110 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"97d388e1-b1b3-409d-b7c5-38b37734a8e6","Type":"ContainerStarted","Data":"f1f9f7f0ddc965e542d47521c647bde30cf7c3d61167e79e93c17ed0e017da76"} Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.893160 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"97d388e1-b1b3-409d-b7c5-38b37734a8e6","Type":"ContainerStarted","Data":"bec9e188e91a6b36561be6565cd152626eaf887f736335a6a7ec4277b6b37808"} Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.902599 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ef7dd576-1005-4fdb-95c1-e5da9f04b177","Type":"ContainerStarted","Data":"2450a754a8cdfb08dca6764a1e6216668ee4da861f43efd9d3d8821b9ac67477"} Mar 13 14:24:14 crc kubenswrapper[4898]: I0313 14:24:14.907659 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.907636507 podStartE2EDuration="2.907636507s" podCreationTimestamp="2026-03-13 14:24:12 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:24:14.907095382 +0000 UTC m=+1689.908683621" watchObservedRunningTime="2026-03-13 14:24:14.907636507 +0000 UTC m=+1689.909224746" Mar 13 14:24:15 crc kubenswrapper[4898]: I0313 14:24:15.757459 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee" path="/var/lib/kubelet/pods/9b7e79b7-e581-4429-b3d4-9dd7ec5e79ee/volumes" Mar 13 14:24:15 crc kubenswrapper[4898]: I0313 14:24:15.917366 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ef7dd576-1005-4fdb-95c1-e5da9f04b177","Type":"ContainerStarted","Data":"5f2534883f510a8fdd5e23cb7dccb85906de337bd6a61fc10f1d9b37d0c03f02"} Mar 13 14:24:15 crc kubenswrapper[4898]: I0313 14:24:15.917409 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ef7dd576-1005-4fdb-95c1-e5da9f04b177","Type":"ContainerStarted","Data":"a596a8e0608b567b537c0d150b0e75f3ff075578d78549760a35e8f2f70708a3"} Mar 13 14:24:15 crc kubenswrapper[4898]: I0313 14:24:15.929476 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d56db73-0e9e-47af-b0bd-77231fe40077","Type":"ContainerStarted","Data":"72a159d4ac7d8e712d9bffb75add791bdb6f8bee323ccb72d06431a42be68d77"} Mar 13 14:24:15 crc kubenswrapper[4898]: I0313 14:24:15.956085 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.956045875 podStartE2EDuration="2.956045875s" podCreationTimestamp="2026-03-13 14:24:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:24:15.945703773 +0000 UTC m=+1690.947292022" watchObservedRunningTime="2026-03-13 14:24:15.956045875 +0000 UTC m=+1690.957634114" Mar 13 14:24:16 crc 
kubenswrapper[4898]: I0313 14:24:16.943579 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d56db73-0e9e-47af-b0bd-77231fe40077","Type":"ContainerStarted","Data":"f40862770a13b51268c0d1e7b9c2896a6335c6f3b3801074579c313c3d13577e"} Mar 13 14:24:17 crc kubenswrapper[4898]: I0313 14:24:17.957645 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d56db73-0e9e-47af-b0bd-77231fe40077","Type":"ContainerStarted","Data":"a88dfb7f27e61d13c4de942fec09301aec287d6967f6ca991e690f9c9c77a8e1"} Mar 13 14:24:17 crc kubenswrapper[4898]: I0313 14:24:17.957950 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 14:24:17 crc kubenswrapper[4898]: I0313 14:24:17.996378 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.343281801 podStartE2EDuration="5.996358618s" podCreationTimestamp="2026-03-13 14:24:12 +0000 UTC" firstStartedPulling="2026-03-13 14:24:13.968445638 +0000 UTC m=+1688.970033877" lastFinishedPulling="2026-03-13 14:24:17.621522455 +0000 UTC m=+1692.623110694" observedRunningTime="2026-03-13 14:24:17.978263253 +0000 UTC m=+1692.979851502" watchObservedRunningTime="2026-03-13 14:24:17.996358618 +0000 UTC m=+1692.997946857" Mar 13 14:24:18 crc kubenswrapper[4898]: I0313 14:24:18.439878 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 13 14:24:21 crc kubenswrapper[4898]: I0313 14:24:21.897456 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-plhgx" Mar 13 14:24:21 crc kubenswrapper[4898]: I0313 14:24:21.966015 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-plhgx" Mar 13 14:24:22 crc kubenswrapper[4898]: I0313 14:24:22.148081 4898 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/certified-operators-plhgx"] Mar 13 14:24:22 crc kubenswrapper[4898]: I0313 14:24:22.185445 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 13 14:24:22 crc kubenswrapper[4898]: I0313 14:24:22.185519 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 13 14:24:23 crc kubenswrapper[4898]: I0313 14:24:23.018930 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-plhgx" podUID="c75348dc-b6ff-43ff-bd9a-d84c91f23ea8" containerName="registry-server" containerID="cri-o://201337bedc1dace7fd7574e59c5253e0ff92dacc7211debdde1f18e42c9daac1" gracePeriod=2 Mar 13 14:24:23 crc kubenswrapper[4898]: I0313 14:24:23.201061 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.18:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 14:24:23 crc kubenswrapper[4898]: I0313 14:24:23.201092 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.18:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 14:24:23 crc kubenswrapper[4898]: I0313 14:24:23.440149 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 13 14:24:23 crc kubenswrapper[4898]: I0313 14:24:23.470865 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 13 14:24:23 crc kubenswrapper[4898]: I0313 14:24:23.655770 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-plhgx" Mar 13 14:24:23 crc kubenswrapper[4898]: I0313 14:24:23.720279 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c75348dc-b6ff-43ff-bd9a-d84c91f23ea8-catalog-content\") pod \"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8\" (UID: \"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8\") " Mar 13 14:24:23 crc kubenswrapper[4898]: I0313 14:24:23.720446 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c75348dc-b6ff-43ff-bd9a-d84c91f23ea8-utilities\") pod \"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8\" (UID: \"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8\") " Mar 13 14:24:23 crc kubenswrapper[4898]: I0313 14:24:23.720650 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6945l\" (UniqueName: \"kubernetes.io/projected/c75348dc-b6ff-43ff-bd9a-d84c91f23ea8-kube-api-access-6945l\") pod \"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8\" (UID: \"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8\") " Mar 13 14:24:23 crc kubenswrapper[4898]: I0313 14:24:23.726102 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c75348dc-b6ff-43ff-bd9a-d84c91f23ea8-utilities" (OuterVolumeSpecName: "utilities") pod "c75348dc-b6ff-43ff-bd9a-d84c91f23ea8" (UID: "c75348dc-b6ff-43ff-bd9a-d84c91f23ea8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:24:23 crc kubenswrapper[4898]: I0313 14:24:23.728644 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c75348dc-b6ff-43ff-bd9a-d84c91f23ea8-kube-api-access-6945l" (OuterVolumeSpecName: "kube-api-access-6945l") pod "c75348dc-b6ff-43ff-bd9a-d84c91f23ea8" (UID: "c75348dc-b6ff-43ff-bd9a-d84c91f23ea8"). InnerVolumeSpecName "kube-api-access-6945l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:24:23 crc kubenswrapper[4898]: I0313 14:24:23.743872 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" Mar 13 14:24:23 crc kubenswrapper[4898]: E0313 14:24:23.744216 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:24:23 crc kubenswrapper[4898]: I0313 14:24:23.781669 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c75348dc-b6ff-43ff-bd9a-d84c91f23ea8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c75348dc-b6ff-43ff-bd9a-d84c91f23ea8" (UID: "c75348dc-b6ff-43ff-bd9a-d84c91f23ea8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:24:23 crc kubenswrapper[4898]: I0313 14:24:23.824024 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6945l\" (UniqueName: \"kubernetes.io/projected/c75348dc-b6ff-43ff-bd9a-d84c91f23ea8-kube-api-access-6945l\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:23 crc kubenswrapper[4898]: I0313 14:24:23.824058 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c75348dc-b6ff-43ff-bd9a-d84c91f23ea8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:23 crc kubenswrapper[4898]: I0313 14:24:23.824067 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c75348dc-b6ff-43ff-bd9a-d84c91f23ea8-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:24 crc kubenswrapper[4898]: I0313 14:24:24.038272 4898 generic.go:334] "Generic (PLEG): container finished" podID="c75348dc-b6ff-43ff-bd9a-d84c91f23ea8" containerID="201337bedc1dace7fd7574e59c5253e0ff92dacc7211debdde1f18e42c9daac1" exitCode=0 Mar 13 14:24:24 crc kubenswrapper[4898]: I0313 14:24:24.038381 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-plhgx" Mar 13 14:24:24 crc kubenswrapper[4898]: I0313 14:24:24.038409 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-plhgx" event={"ID":"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8","Type":"ContainerDied","Data":"201337bedc1dace7fd7574e59c5253e0ff92dacc7211debdde1f18e42c9daac1"} Mar 13 14:24:24 crc kubenswrapper[4898]: I0313 14:24:24.047571 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-plhgx" event={"ID":"c75348dc-b6ff-43ff-bd9a-d84c91f23ea8","Type":"ContainerDied","Data":"357efb5d307770e6ac4560f42dea2e3bc32a6f8f90350563a876139972493b91"} Mar 13 14:24:24 crc kubenswrapper[4898]: I0313 14:24:24.047633 4898 scope.go:117] "RemoveContainer" containerID="201337bedc1dace7fd7574e59c5253e0ff92dacc7211debdde1f18e42c9daac1" Mar 13 14:24:24 crc kubenswrapper[4898]: I0313 14:24:24.095301 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 13 14:24:24 crc kubenswrapper[4898]: I0313 14:24:24.095300 4898 scope.go:117] "RemoveContainer" containerID="b4e6b7a97336f9db7d45fac3ccbf9ee16b3f41f5a6729150379834717d5666ca" Mar 13 14:24:24 crc kubenswrapper[4898]: I0313 14:24:24.101679 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-plhgx"] Mar 13 14:24:24 crc kubenswrapper[4898]: I0313 14:24:24.128939 4898 scope.go:117] "RemoveContainer" containerID="327e55d62108c057fe7c12b2ee047d6a3b7e188b390c1e80ec12227125ce316c" Mar 13 14:24:24 crc kubenswrapper[4898]: I0313 14:24:24.139606 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-plhgx"] Mar 13 14:24:24 crc kubenswrapper[4898]: I0313 14:24:24.194765 4898 scope.go:117] "RemoveContainer" containerID="201337bedc1dace7fd7574e59c5253e0ff92dacc7211debdde1f18e42c9daac1" Mar 13 14:24:24 crc kubenswrapper[4898]: E0313 
14:24:24.195220 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"201337bedc1dace7fd7574e59c5253e0ff92dacc7211debdde1f18e42c9daac1\": container with ID starting with 201337bedc1dace7fd7574e59c5253e0ff92dacc7211debdde1f18e42c9daac1 not found: ID does not exist" containerID="201337bedc1dace7fd7574e59c5253e0ff92dacc7211debdde1f18e42c9daac1" Mar 13 14:24:24 crc kubenswrapper[4898]: I0313 14:24:24.195256 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"201337bedc1dace7fd7574e59c5253e0ff92dacc7211debdde1f18e42c9daac1"} err="failed to get container status \"201337bedc1dace7fd7574e59c5253e0ff92dacc7211debdde1f18e42c9daac1\": rpc error: code = NotFound desc = could not find container \"201337bedc1dace7fd7574e59c5253e0ff92dacc7211debdde1f18e42c9daac1\": container with ID starting with 201337bedc1dace7fd7574e59c5253e0ff92dacc7211debdde1f18e42c9daac1 not found: ID does not exist" Mar 13 14:24:24 crc kubenswrapper[4898]: I0313 14:24:24.195283 4898 scope.go:117] "RemoveContainer" containerID="b4e6b7a97336f9db7d45fac3ccbf9ee16b3f41f5a6729150379834717d5666ca" Mar 13 14:24:24 crc kubenswrapper[4898]: E0313 14:24:24.195591 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4e6b7a97336f9db7d45fac3ccbf9ee16b3f41f5a6729150379834717d5666ca\": container with ID starting with b4e6b7a97336f9db7d45fac3ccbf9ee16b3f41f5a6729150379834717d5666ca not found: ID does not exist" containerID="b4e6b7a97336f9db7d45fac3ccbf9ee16b3f41f5a6729150379834717d5666ca" Mar 13 14:24:24 crc kubenswrapper[4898]: I0313 14:24:24.195612 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4e6b7a97336f9db7d45fac3ccbf9ee16b3f41f5a6729150379834717d5666ca"} err="failed to get container status \"b4e6b7a97336f9db7d45fac3ccbf9ee16b3f41f5a6729150379834717d5666ca\": rpc 
error: code = NotFound desc = could not find container \"b4e6b7a97336f9db7d45fac3ccbf9ee16b3f41f5a6729150379834717d5666ca\": container with ID starting with b4e6b7a97336f9db7d45fac3ccbf9ee16b3f41f5a6729150379834717d5666ca not found: ID does not exist" Mar 13 14:24:24 crc kubenswrapper[4898]: I0313 14:24:24.195631 4898 scope.go:117] "RemoveContainer" containerID="327e55d62108c057fe7c12b2ee047d6a3b7e188b390c1e80ec12227125ce316c" Mar 13 14:24:24 crc kubenswrapper[4898]: E0313 14:24:24.195876 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"327e55d62108c057fe7c12b2ee047d6a3b7e188b390c1e80ec12227125ce316c\": container with ID starting with 327e55d62108c057fe7c12b2ee047d6a3b7e188b390c1e80ec12227125ce316c not found: ID does not exist" containerID="327e55d62108c057fe7c12b2ee047d6a3b7e188b390c1e80ec12227125ce316c" Mar 13 14:24:24 crc kubenswrapper[4898]: I0313 14:24:24.195911 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"327e55d62108c057fe7c12b2ee047d6a3b7e188b390c1e80ec12227125ce316c"} err="failed to get container status \"327e55d62108c057fe7c12b2ee047d6a3b7e188b390c1e80ec12227125ce316c\": rpc error: code = NotFound desc = could not find container \"327e55d62108c057fe7c12b2ee047d6a3b7e188b390c1e80ec12227125ce316c\": container with ID starting with 327e55d62108c057fe7c12b2ee047d6a3b7e188b390c1e80ec12227125ce316c not found: ID does not exist" Mar 13 14:24:24 crc kubenswrapper[4898]: I0313 14:24:24.360296 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 14:24:24 crc kubenswrapper[4898]: I0313 14:24:24.360361 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 14:24:25 crc kubenswrapper[4898]: I0313 14:24:25.372052 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="ef7dd576-1005-4fdb-95c1-e5da9f04b177" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.21:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 13 14:24:25 crc kubenswrapper[4898]: I0313 14:24:25.372078 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ef7dd576-1005-4fdb-95c1-e5da9f04b177" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.21:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 13 14:24:25 crc kubenswrapper[4898]: I0313 14:24:25.758356 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c75348dc-b6ff-43ff-bd9a-d84c91f23ea8" path="/var/lib/kubelet/pods/c75348dc-b6ff-43ff-bd9a-d84c91f23ea8/volumes"
Mar 13 14:24:30 crc kubenswrapper[4898]: I0313 14:24:30.187996 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 13 14:24:30 crc kubenswrapper[4898]: I0313 14:24:30.188617 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 13 14:24:32 crc kubenswrapper[4898]: I0313 14:24:32.192372 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 13 14:24:32 crc kubenswrapper[4898]: I0313 14:24:32.194616 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 13 14:24:32 crc kubenswrapper[4898]: I0313 14:24:32.200838 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 13 14:24:32 crc kubenswrapper[4898]: I0313 14:24:32.201882 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 13 14:24:32 crc kubenswrapper[4898]: I0313 14:24:32.360515 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 13 14:24:32 crc kubenswrapper[4898]: I0313 14:24:32.360575 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 13 14:24:34 crc kubenswrapper[4898]: I0313 14:24:34.369630 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 13 14:24:34 crc kubenswrapper[4898]: I0313 14:24:34.372702 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 13 14:24:34 crc kubenswrapper[4898]: I0313 14:24:34.381315 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 13 14:24:34 crc kubenswrapper[4898]: I0313 14:24:34.442929 4898 scope.go:117] "RemoveContainer" containerID="c8c6599b57d68b7830c9784f9ac2322559fa7b500359ca53fa26e39b23292ec4"
Mar 13 14:24:35 crc kubenswrapper[4898]: I0313 14:24:35.208677 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 13 14:24:36 crc kubenswrapper[4898]: I0313 14:24:36.214692 4898 generic.go:334] "Generic (PLEG): container finished" podID="3a036241-2013-494e-8c1f-7584e9af2bf4" containerID="e2cbd875c5a5e2c192552e4d11fe9c7d29be7ff5b49fb690e14597f7135e2adb" exitCode=137
Mar 13 14:24:36 crc kubenswrapper[4898]: I0313 14:24:36.214757 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"3a036241-2013-494e-8c1f-7584e9af2bf4","Type":"ContainerDied","Data":"e2cbd875c5a5e2c192552e4d11fe9c7d29be7ff5b49fb690e14597f7135e2adb"}
Mar 13 14:24:36 crc kubenswrapper[4898]: I0313 14:24:36.739950 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc"
Mar 13 14:24:36 crc kubenswrapper[4898]: E0313 14:24:36.740627 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d"
Mar 13 14:24:36 crc kubenswrapper[4898]: I0313 14:24:36.810002 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Mar 13 14:24:36 crc kubenswrapper[4898]: I0313 14:24:36.878352 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sw6dd\" (UniqueName: \"kubernetes.io/projected/3a036241-2013-494e-8c1f-7584e9af2bf4-kube-api-access-sw6dd\") pod \"3a036241-2013-494e-8c1f-7584e9af2bf4\" (UID: \"3a036241-2013-494e-8c1f-7584e9af2bf4\") "
Mar 13 14:24:36 crc kubenswrapper[4898]: I0313 14:24:36.878411 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a036241-2013-494e-8c1f-7584e9af2bf4-scripts\") pod \"3a036241-2013-494e-8c1f-7584e9af2bf4\" (UID: \"3a036241-2013-494e-8c1f-7584e9af2bf4\") "
Mar 13 14:24:36 crc kubenswrapper[4898]: I0313 14:24:36.878644 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a036241-2013-494e-8c1f-7584e9af2bf4-config-data\") pod \"3a036241-2013-494e-8c1f-7584e9af2bf4\" (UID: \"3a036241-2013-494e-8c1f-7584e9af2bf4\") "
Mar 13 14:24:36 crc kubenswrapper[4898]: I0313 14:24:36.878752 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a036241-2013-494e-8c1f-7584e9af2bf4-combined-ca-bundle\") pod \"3a036241-2013-494e-8c1f-7584e9af2bf4\" (UID: \"3a036241-2013-494e-8c1f-7584e9af2bf4\") "
Mar 13 14:24:36 crc kubenswrapper[4898]: I0313 14:24:36.889283 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a036241-2013-494e-8c1f-7584e9af2bf4-scripts" (OuterVolumeSpecName: "scripts") pod "3a036241-2013-494e-8c1f-7584e9af2bf4" (UID: "3a036241-2013-494e-8c1f-7584e9af2bf4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:24:36 crc kubenswrapper[4898]: I0313 14:24:36.902156 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a036241-2013-494e-8c1f-7584e9af2bf4-kube-api-access-sw6dd" (OuterVolumeSpecName: "kube-api-access-sw6dd") pod "3a036241-2013-494e-8c1f-7584e9af2bf4" (UID: "3a036241-2013-494e-8c1f-7584e9af2bf4"). InnerVolumeSpecName "kube-api-access-sw6dd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:24:36 crc kubenswrapper[4898]: I0313 14:24:36.982637 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sw6dd\" (UniqueName: \"kubernetes.io/projected/3a036241-2013-494e-8c1f-7584e9af2bf4-kube-api-access-sw6dd\") on node \"crc\" DevicePath \"\""
Mar 13 14:24:36 crc kubenswrapper[4898]: I0313 14:24:36.982678 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a036241-2013-494e-8c1f-7584e9af2bf4-scripts\") on node \"crc\" DevicePath \"\""
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.048023 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a036241-2013-494e-8c1f-7584e9af2bf4-config-data" (OuterVolumeSpecName: "config-data") pod "3a036241-2013-494e-8c1f-7584e9af2bf4" (UID: "3a036241-2013-494e-8c1f-7584e9af2bf4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.086147 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a036241-2013-494e-8c1f-7584e9af2bf4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3a036241-2013-494e-8c1f-7584e9af2bf4" (UID: "3a036241-2013-494e-8c1f-7584e9af2bf4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.087138 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a036241-2013-494e-8c1f-7584e9af2bf4-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.087163 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a036241-2013-494e-8c1f-7584e9af2bf4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.230077 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"3a036241-2013-494e-8c1f-7584e9af2bf4","Type":"ContainerDied","Data":"ded3b65c989e6a6e858ee713ff395a11604658cb153b63189d68172abd0b0293"}
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.230122 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.230133 4898 scope.go:117] "RemoveContainer" containerID="e2cbd875c5a5e2c192552e4d11fe9c7d29be7ff5b49fb690e14597f7135e2adb"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.262125 4898 scope.go:117] "RemoveContainer" containerID="e1faed09b83d1750f543cd522c4a6bcbb14737623c5d254b367266c190bdbe2c"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.310735 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"]
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.324880 4898 scope.go:117] "RemoveContainer" containerID="9263a833a783d5b9000e728c8a69913160176a9e9c946c16d7fa08425ffcc556"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.325089 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"]
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.339087 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"]
Mar 13 14:24:37 crc kubenswrapper[4898]: E0313 14:24:37.339675 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a036241-2013-494e-8c1f-7584e9af2bf4" containerName="aodh-listener"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.339697 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a036241-2013-494e-8c1f-7584e9af2bf4" containerName="aodh-listener"
Mar 13 14:24:37 crc kubenswrapper[4898]: E0313 14:24:37.339737 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c75348dc-b6ff-43ff-bd9a-d84c91f23ea8" containerName="extract-utilities"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.339747 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c75348dc-b6ff-43ff-bd9a-d84c91f23ea8" containerName="extract-utilities"
Mar 13 14:24:37 crc kubenswrapper[4898]: E0313 14:24:37.339759 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c75348dc-b6ff-43ff-bd9a-d84c91f23ea8" containerName="registry-server"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.339767 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c75348dc-b6ff-43ff-bd9a-d84c91f23ea8" containerName="registry-server"
Mar 13 14:24:37 crc kubenswrapper[4898]: E0313 14:24:37.339779 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c75348dc-b6ff-43ff-bd9a-d84c91f23ea8" containerName="extract-content"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.339787 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c75348dc-b6ff-43ff-bd9a-d84c91f23ea8" containerName="extract-content"
Mar 13 14:24:37 crc kubenswrapper[4898]: E0313 14:24:37.339819 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a036241-2013-494e-8c1f-7584e9af2bf4" containerName="aodh-notifier"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.339827 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a036241-2013-494e-8c1f-7584e9af2bf4" containerName="aodh-notifier"
Mar 13 14:24:37 crc kubenswrapper[4898]: E0313 14:24:37.339839 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a036241-2013-494e-8c1f-7584e9af2bf4" containerName="aodh-api"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.339846 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a036241-2013-494e-8c1f-7584e9af2bf4" containerName="aodh-api"
Mar 13 14:24:37 crc kubenswrapper[4898]: E0313 14:24:37.339866 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a036241-2013-494e-8c1f-7584e9af2bf4" containerName="aodh-evaluator"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.339874 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a036241-2013-494e-8c1f-7584e9af2bf4" containerName="aodh-evaluator"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.340168 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a036241-2013-494e-8c1f-7584e9af2bf4" containerName="aodh-api"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.340182 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a036241-2013-494e-8c1f-7584e9af2bf4" containerName="aodh-listener"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.340215 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="c75348dc-b6ff-43ff-bd9a-d84c91f23ea8" containerName="registry-server"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.340234 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a036241-2013-494e-8c1f-7584e9af2bf4" containerName="aodh-notifier"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.340247 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a036241-2013-494e-8c1f-7584e9af2bf4" containerName="aodh-evaluator"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.345620 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.347814 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.347862 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.348010 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.348043 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-tnpwg"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.347821 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.351085 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"]
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.358869 4898 scope.go:117] "RemoveContainer" containerID="1e2c142eba973e7412047a391872c9d25e5735c5b05576796ed64cb74c786bb5"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.394501 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-internal-tls-certs\") pod \"aodh-0\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " pod="openstack/aodh-0"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.394599 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-scripts\") pod \"aodh-0\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " pod="openstack/aodh-0"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.394636 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-config-data\") pod \"aodh-0\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " pod="openstack/aodh-0"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.394698 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-public-tls-certs\") pod \"aodh-0\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " pod="openstack/aodh-0"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.394751 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-combined-ca-bundle\") pod \"aodh-0\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " pod="openstack/aodh-0"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.394851 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxbj9\" (UniqueName: \"kubernetes.io/projected/88246540-ca61-4fb0-8934-c8ebb4559860-kube-api-access-sxbj9\") pod \"aodh-0\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " pod="openstack/aodh-0"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.496833 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-config-data\") pod \"aodh-0\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " pod="openstack/aodh-0"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.496955 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-public-tls-certs\") pod \"aodh-0\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " pod="openstack/aodh-0"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.497006 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-combined-ca-bundle\") pod \"aodh-0\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " pod="openstack/aodh-0"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.497087 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxbj9\" (UniqueName: \"kubernetes.io/projected/88246540-ca61-4fb0-8934-c8ebb4559860-kube-api-access-sxbj9\") pod \"aodh-0\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " pod="openstack/aodh-0"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.497135 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-internal-tls-certs\") pod \"aodh-0\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " pod="openstack/aodh-0"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.497174 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-scripts\") pod \"aodh-0\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " pod="openstack/aodh-0"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.510504 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-public-tls-certs\") pod \"aodh-0\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " pod="openstack/aodh-0"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.511019 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-config-data\") pod \"aodh-0\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " pod="openstack/aodh-0"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.511431 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-combined-ca-bundle\") pod \"aodh-0\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " pod="openstack/aodh-0"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.512875 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxbj9\" (UniqueName: \"kubernetes.io/projected/88246540-ca61-4fb0-8934-c8ebb4559860-kube-api-access-sxbj9\") pod \"aodh-0\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " pod="openstack/aodh-0"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.515343 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-scripts\") pod \"aodh-0\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " pod="openstack/aodh-0"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.518723 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-internal-tls-certs\") pod \"aodh-0\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " pod="openstack/aodh-0"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.673302 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Mar 13 14:24:37 crc kubenswrapper[4898]: I0313 14:24:37.759508 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a036241-2013-494e-8c1f-7584e9af2bf4" path="/var/lib/kubelet/pods/3a036241-2013-494e-8c1f-7584e9af2bf4/volumes"
Mar 13 14:24:38 crc kubenswrapper[4898]: W0313 14:24:38.205544 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88246540_ca61_4fb0_8934_c8ebb4559860.slice/crio-4f9554aea31a54e9ad03a3bc5d51fd2b9355c4b2f2434a00fdafabcd84f13b07 WatchSource:0}: Error finding container 4f9554aea31a54e9ad03a3bc5d51fd2b9355c4b2f2434a00fdafabcd84f13b07: Status 404 returned error can't find the container with id 4f9554aea31a54e9ad03a3bc5d51fd2b9355c4b2f2434a00fdafabcd84f13b07
Mar 13 14:24:38 crc kubenswrapper[4898]: I0313 14:24:38.208129 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"]
Mar 13 14:24:38 crc kubenswrapper[4898]: I0313 14:24:38.278840 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"88246540-ca61-4fb0-8934-c8ebb4559860","Type":"ContainerStarted","Data":"4f9554aea31a54e9ad03a3bc5d51fd2b9355c4b2f2434a00fdafabcd84f13b07"}
Mar 13 14:24:40 crc kubenswrapper[4898]: I0313 14:24:40.300404 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"88246540-ca61-4fb0-8934-c8ebb4559860","Type":"ContainerStarted","Data":"a274deee7baf2c38ad5a6692d8a099f9a97dda17b39d57d5a6fb5cd7aca71860"}
Mar 13 14:24:42 crc kubenswrapper[4898]: I0313 14:24:42.335834 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"88246540-ca61-4fb0-8934-c8ebb4559860","Type":"ContainerStarted","Data":"8453994fd2156143da3704e7af63c854727a63f89845c2f4e51b2efe260b622e"}
Mar 13 14:24:43 crc kubenswrapper[4898]: I0313 14:24:43.348716 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"88246540-ca61-4fb0-8934-c8ebb4559860","Type":"ContainerStarted","Data":"c83e4fda188ec43992d3ce1b3047566dea50f419ae2e6389d523891cfdc5bf75"}
Mar 13 14:24:43 crc kubenswrapper[4898]: I0313 14:24:43.436247 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 13 14:24:44 crc kubenswrapper[4898]: I0313 14:24:44.361232 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"88246540-ca61-4fb0-8934-c8ebb4559860","Type":"ContainerStarted","Data":"f1a7b03523d4185dcbadb339dd340f86f0fe7637d1feff68130acfc4930e6831"}
Mar 13 14:24:44 crc kubenswrapper[4898]: I0313 14:24:44.399754 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=1.823534532 podStartE2EDuration="7.399733662s" podCreationTimestamp="2026-03-13 14:24:37 +0000 UTC" firstStartedPulling="2026-03-13 14:24:38.207799518 +0000 UTC m=+1713.209387757" lastFinishedPulling="2026-03-13 14:24:43.783998648 +0000 UTC m=+1718.785586887" observedRunningTime="2026-03-13 14:24:44.382937811 +0000 UTC m=+1719.384526070" watchObservedRunningTime="2026-03-13 14:24:44.399733662 +0000 UTC m=+1719.401321901"
Mar 13 14:24:48 crc kubenswrapper[4898]: I0313 14:24:48.129863 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 13 14:24:48 crc kubenswrapper[4898]: I0313 14:24:48.130574 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="4e010381-921d-4328-9027-ddb9a54a08bd" containerName="kube-state-metrics" containerID="cri-o://71740b094889fce6ef9ef07ec41cbfdf46a3a1807d9a456b2458bac02fa10682" gracePeriod=30
Mar 13 14:24:48 crc kubenswrapper[4898]: I0313 14:24:48.231830 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"]
Mar 13 14:24:48 crc kubenswrapper[4898]: I0313 14:24:48.232353 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mysqld-exporter-0" podUID="8c27f029-bffd-4f8f-bb24-c1c9c245d38c" containerName="mysqld-exporter" containerID="cri-o://169bac7a2a87f00863ce79b3524b65b62b7c3fd09f57a0d5d216a587ead2fc00" gracePeriod=30
Mar 13 14:24:48 crc kubenswrapper[4898]: I0313 14:24:48.434482 4898 generic.go:334] "Generic (PLEG): container finished" podID="4e010381-921d-4328-9027-ddb9a54a08bd" containerID="71740b094889fce6ef9ef07ec41cbfdf46a3a1807d9a456b2458bac02fa10682" exitCode=2
Mar 13 14:24:48 crc kubenswrapper[4898]: I0313 14:24:48.434568 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4e010381-921d-4328-9027-ddb9a54a08bd","Type":"ContainerDied","Data":"71740b094889fce6ef9ef07ec41cbfdf46a3a1807d9a456b2458bac02fa10682"}
Mar 13 14:24:48 crc kubenswrapper[4898]: I0313 14:24:48.436526 4898 generic.go:334] "Generic (PLEG): container finished" podID="8c27f029-bffd-4f8f-bb24-c1c9c245d38c" containerID="169bac7a2a87f00863ce79b3524b65b62b7c3fd09f57a0d5d216a587ead2fc00" exitCode=2
Mar 13 14:24:48 crc kubenswrapper[4898]: I0313 14:24:48.436575 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"8c27f029-bffd-4f8f-bb24-c1c9c245d38c","Type":"ContainerDied","Data":"169bac7a2a87f00863ce79b3524b65b62b7c3fd09f57a0d5d216a587ead2fc00"}
Mar 13 14:24:48 crc kubenswrapper[4898]: I0313 14:24:48.659576 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 13 14:24:48 crc kubenswrapper[4898]: I0313 14:24:48.818980 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5p58m\" (UniqueName: \"kubernetes.io/projected/4e010381-921d-4328-9027-ddb9a54a08bd-kube-api-access-5p58m\") pod \"4e010381-921d-4328-9027-ddb9a54a08bd\" (UID: \"4e010381-921d-4328-9027-ddb9a54a08bd\") "
Mar 13 14:24:48 crc kubenswrapper[4898]: I0313 14:24:48.828183 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e010381-921d-4328-9027-ddb9a54a08bd-kube-api-access-5p58m" (OuterVolumeSpecName: "kube-api-access-5p58m") pod "4e010381-921d-4328-9027-ddb9a54a08bd" (UID: "4e010381-921d-4328-9027-ddb9a54a08bd"). InnerVolumeSpecName "kube-api-access-5p58m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:24:48 crc kubenswrapper[4898]: I0313 14:24:48.855937 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0"
Mar 13 14:24:48 crc kubenswrapper[4898]: I0313 14:24:48.923963 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5p58m\" (UniqueName: \"kubernetes.io/projected/4e010381-921d-4328-9027-ddb9a54a08bd-kube-api-access-5p58m\") on node \"crc\" DevicePath \"\""
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.025147 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c27f029-bffd-4f8f-bb24-c1c9c245d38c-config-data\") pod \"8c27f029-bffd-4f8f-bb24-c1c9c245d38c\" (UID: \"8c27f029-bffd-4f8f-bb24-c1c9c245d38c\") "
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.025274 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c27f029-bffd-4f8f-bb24-c1c9c245d38c-combined-ca-bundle\") pod \"8c27f029-bffd-4f8f-bb24-c1c9c245d38c\" (UID: \"8c27f029-bffd-4f8f-bb24-c1c9c245d38c\") "
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.025309 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8775h\" (UniqueName: \"kubernetes.io/projected/8c27f029-bffd-4f8f-bb24-c1c9c245d38c-kube-api-access-8775h\") pod \"8c27f029-bffd-4f8f-bb24-c1c9c245d38c\" (UID: \"8c27f029-bffd-4f8f-bb24-c1c9c245d38c\") "
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.029241 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c27f029-bffd-4f8f-bb24-c1c9c245d38c-kube-api-access-8775h" (OuterVolumeSpecName: "kube-api-access-8775h") pod "8c27f029-bffd-4f8f-bb24-c1c9c245d38c" (UID: "8c27f029-bffd-4f8f-bb24-c1c9c245d38c"). InnerVolumeSpecName "kube-api-access-8775h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.078421 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c27f029-bffd-4f8f-bb24-c1c9c245d38c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c27f029-bffd-4f8f-bb24-c1c9c245d38c" (UID: "8c27f029-bffd-4f8f-bb24-c1c9c245d38c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.082204 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c27f029-bffd-4f8f-bb24-c1c9c245d38c-config-data" (OuterVolumeSpecName: "config-data") pod "8c27f029-bffd-4f8f-bb24-c1c9c245d38c" (UID: "8c27f029-bffd-4f8f-bb24-c1c9c245d38c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.129053 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c27f029-bffd-4f8f-bb24-c1c9c245d38c-config-data\") on node \"crc\" DevicePath \"\""
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.129105 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c27f029-bffd-4f8f-bb24-c1c9c245d38c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.129126 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8775h\" (UniqueName: \"kubernetes.io/projected/8c27f029-bffd-4f8f-bb24-c1c9c245d38c-kube-api-access-8775h\") on node \"crc\" DevicePath \"\""
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.451950 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4e010381-921d-4328-9027-ddb9a54a08bd","Type":"ContainerDied","Data":"a514287f4abd02abdb35f5efc576784d286f154281545d6e3b18397fdacfa325"}
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.452002 4898 scope.go:117] "RemoveContainer" containerID="71740b094889fce6ef9ef07ec41cbfdf46a3a1807d9a456b2458bac02fa10682"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.452052 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.456154 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"8c27f029-bffd-4f8f-bb24-c1c9c245d38c","Type":"ContainerDied","Data":"fe553b08f29dc87c01c836389227794f6bc900596f5a85dd1ed792d64aa19876"}
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.456198 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.491682 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.509646 4898 scope.go:117] "RemoveContainer" containerID="169bac7a2a87f00863ce79b3524b65b62b7c3fd09f57a0d5d216a587ead2fc00"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.538984 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.563950 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 13 14:24:49 crc kubenswrapper[4898]: E0313 14:24:49.564422 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e010381-921d-4328-9027-ddb9a54a08bd" containerName="kube-state-metrics"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.564441 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e010381-921d-4328-9027-ddb9a54a08bd" containerName="kube-state-metrics"
Mar 13 14:24:49 crc kubenswrapper[4898]: E0313 14:24:49.564466 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c27f029-bffd-4f8f-bb24-c1c9c245d38c" containerName="mysqld-exporter"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.564473 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c27f029-bffd-4f8f-bb24-c1c9c245d38c" containerName="mysqld-exporter"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.564717 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c27f029-bffd-4f8f-bb24-c1c9c245d38c" containerName="mysqld-exporter"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.565570 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e010381-921d-4328-9027-ddb9a54a08bd" containerName="kube-state-metrics"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.566536 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.570483 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.573713 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.598870 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"]
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.623821 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-0"]
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.640805 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.653317 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"]
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.655384 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.658546 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.658973 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-mysqld-exporter-svc"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.664675 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a198c14-e13f-4858-87c4-de6be0fa8d0c-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"8a198c14-e13f-4858-87c4-de6be0fa8d0c\") " pod="openstack/mysqld-exporter-0"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.664738 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a198c14-e13f-4858-87c4-de6be0fa8d0c-config-data\") pod \"mysqld-exporter-0\" (UID: \"8a198c14-e13f-4858-87c4-de6be0fa8d0c\") " pod="openstack/mysqld-exporter-0"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.664782 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6w4q\" (UniqueName: \"kubernetes.io/projected/8a198c14-e13f-4858-87c4-de6be0fa8d0c-kube-api-access-b6w4q\") pod \"mysqld-exporter-0\" (UID: \"8a198c14-e13f-4858-87c4-de6be0fa8d0c\") " pod="openstack/mysqld-exporter-0"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.664884 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7452a36-0169-4cfe-9ede-ef4d0ef072d9-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"b7452a36-0169-4cfe-9ede-ef4d0ef072d9\") " pod="openstack/kube-state-metrics-0"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.664953 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a198c14-e13f-4858-87c4-de6be0fa8d0c-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"8a198c14-e13f-4858-87c4-de6be0fa8d0c\") " pod="openstack/mysqld-exporter-0"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.665147 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcjgw\" (UniqueName: \"kubernetes.io/projected/b7452a36-0169-4cfe-9ede-ef4d0ef072d9-kube-api-access-zcjgw\") pod \"kube-state-metrics-0\" (UID: \"b7452a36-0169-4cfe-9ede-ef4d0ef072d9\") " pod="openstack/kube-state-metrics-0"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.665251 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b7452a36-0169-4cfe-9ede-ef4d0ef072d9-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"b7452a36-0169-4cfe-9ede-ef4d0ef072d9\") " pod="openstack/kube-state-metrics-0"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.665275 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7452a36-0169-4cfe-9ede-ef4d0ef072d9-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"b7452a36-0169-4cfe-9ede-ef4d0ef072d9\") " pod="openstack/kube-state-metrics-0"
Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.670225 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"]
Mar 13
14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.751208 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e010381-921d-4328-9027-ddb9a54a08bd" path="/var/lib/kubelet/pods/4e010381-921d-4328-9027-ddb9a54a08bd/volumes" Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.751779 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c27f029-bffd-4f8f-bb24-c1c9c245d38c" path="/var/lib/kubelet/pods/8c27f029-bffd-4f8f-bb24-c1c9c245d38c/volumes" Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.767594 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a198c14-e13f-4858-87c4-de6be0fa8d0c-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"8a198c14-e13f-4858-87c4-de6be0fa8d0c\") " pod="openstack/mysqld-exporter-0" Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.767663 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a198c14-e13f-4858-87c4-de6be0fa8d0c-config-data\") pod \"mysqld-exporter-0\" (UID: \"8a198c14-e13f-4858-87c4-de6be0fa8d0c\") " pod="openstack/mysqld-exporter-0" Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.767702 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6w4q\" (UniqueName: \"kubernetes.io/projected/8a198c14-e13f-4858-87c4-de6be0fa8d0c-kube-api-access-b6w4q\") pod \"mysqld-exporter-0\" (UID: \"8a198c14-e13f-4858-87c4-de6be0fa8d0c\") " pod="openstack/mysqld-exporter-0" Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.767773 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7452a36-0169-4cfe-9ede-ef4d0ef072d9-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"b7452a36-0169-4cfe-9ede-ef4d0ef072d9\") " 
pod="openstack/kube-state-metrics-0" Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.767808 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a198c14-e13f-4858-87c4-de6be0fa8d0c-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"8a198c14-e13f-4858-87c4-de6be0fa8d0c\") " pod="openstack/mysqld-exporter-0" Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.767876 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcjgw\" (UniqueName: \"kubernetes.io/projected/b7452a36-0169-4cfe-9ede-ef4d0ef072d9-kube-api-access-zcjgw\") pod \"kube-state-metrics-0\" (UID: \"b7452a36-0169-4cfe-9ede-ef4d0ef072d9\") " pod="openstack/kube-state-metrics-0" Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.767949 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b7452a36-0169-4cfe-9ede-ef4d0ef072d9-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"b7452a36-0169-4cfe-9ede-ef4d0ef072d9\") " pod="openstack/kube-state-metrics-0" Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.767980 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7452a36-0169-4cfe-9ede-ef4d0ef072d9-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"b7452a36-0169-4cfe-9ede-ef4d0ef072d9\") " pod="openstack/kube-state-metrics-0" Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.773303 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/b7452a36-0169-4cfe-9ede-ef4d0ef072d9-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"b7452a36-0169-4cfe-9ede-ef4d0ef072d9\") " pod="openstack/kube-state-metrics-0" Mar 13 14:24:49 crc 
kubenswrapper[4898]: I0313 14:24:49.773393 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7452a36-0169-4cfe-9ede-ef4d0ef072d9-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"b7452a36-0169-4cfe-9ede-ef4d0ef072d9\") " pod="openstack/kube-state-metrics-0" Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.773521 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a198c14-e13f-4858-87c4-de6be0fa8d0c-config-data\") pod \"mysqld-exporter-0\" (UID: \"8a198c14-e13f-4858-87c4-de6be0fa8d0c\") " pod="openstack/mysqld-exporter-0" Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.773521 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a198c14-e13f-4858-87c4-de6be0fa8d0c-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"8a198c14-e13f-4858-87c4-de6be0fa8d0c\") " pod="openstack/mysqld-exporter-0" Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.773588 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a198c14-e13f-4858-87c4-de6be0fa8d0c-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"8a198c14-e13f-4858-87c4-de6be0fa8d0c\") " pod="openstack/mysqld-exporter-0" Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.774124 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7452a36-0169-4cfe-9ede-ef4d0ef072d9-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"b7452a36-0169-4cfe-9ede-ef4d0ef072d9\") " pod="openstack/kube-state-metrics-0" Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.785821 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcjgw\" 
(UniqueName: \"kubernetes.io/projected/b7452a36-0169-4cfe-9ede-ef4d0ef072d9-kube-api-access-zcjgw\") pod \"kube-state-metrics-0\" (UID: \"b7452a36-0169-4cfe-9ede-ef4d0ef072d9\") " pod="openstack/kube-state-metrics-0" Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.787078 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6w4q\" (UniqueName: \"kubernetes.io/projected/8a198c14-e13f-4858-87c4-de6be0fa8d0c-kube-api-access-b6w4q\") pod \"mysqld-exporter-0\" (UID: \"8a198c14-e13f-4858-87c4-de6be0fa8d0c\") " pod="openstack/mysqld-exporter-0" Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.911399 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 13 14:24:49 crc kubenswrapper[4898]: I0313 14:24:49.976368 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 13 14:24:50 crc kubenswrapper[4898]: I0313 14:24:50.260868 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:24:50 crc kubenswrapper[4898]: I0313 14:24:50.261463 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0d56db73-0e9e-47af-b0bd-77231fe40077" containerName="ceilometer-central-agent" containerID="cri-o://f93c4099691f41a073e964731ace4e3b38e62bd41c90cb8a30591394175252ce" gracePeriod=30 Mar 13 14:24:50 crc kubenswrapper[4898]: I0313 14:24:50.262040 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0d56db73-0e9e-47af-b0bd-77231fe40077" containerName="proxy-httpd" containerID="cri-o://a88dfb7f27e61d13c4de942fec09301aec287d6967f6ca991e690f9c9c77a8e1" gracePeriod=30 Mar 13 14:24:50 crc kubenswrapper[4898]: I0313 14:24:50.262075 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0d56db73-0e9e-47af-b0bd-77231fe40077" 
containerName="ceilometer-notification-agent" containerID="cri-o://72a159d4ac7d8e712d9bffb75add791bdb6f8bee323ccb72d06431a42be68d77" gracePeriod=30 Mar 13 14:24:50 crc kubenswrapper[4898]: I0313 14:24:50.262059 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0d56db73-0e9e-47af-b0bd-77231fe40077" containerName="sg-core" containerID="cri-o://f40862770a13b51268c0d1e7b9c2896a6335c6f3b3801074579c313c3d13577e" gracePeriod=30 Mar 13 14:24:50 crc kubenswrapper[4898]: I0313 14:24:50.444798 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 14:24:50 crc kubenswrapper[4898]: I0313 14:24:50.577830 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b7452a36-0169-4cfe-9ede-ef4d0ef072d9","Type":"ContainerStarted","Data":"55f7eb4777f7efaa80315786f8c6ff46779e1a725ad27fb01a0326943ceba0cb"} Mar 13 14:24:50 crc kubenswrapper[4898]: I0313 14:24:50.632284 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 13 14:24:50 crc kubenswrapper[4898]: I0313 14:24:50.660594 4898 generic.go:334] "Generic (PLEG): container finished" podID="0d56db73-0e9e-47af-b0bd-77231fe40077" containerID="a88dfb7f27e61d13c4de942fec09301aec287d6967f6ca991e690f9c9c77a8e1" exitCode=0 Mar 13 14:24:50 crc kubenswrapper[4898]: I0313 14:24:50.660641 4898 generic.go:334] "Generic (PLEG): container finished" podID="0d56db73-0e9e-47af-b0bd-77231fe40077" containerID="f40862770a13b51268c0d1e7b9c2896a6335c6f3b3801074579c313c3d13577e" exitCode=2 Mar 13 14:24:50 crc kubenswrapper[4898]: I0313 14:24:50.660663 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d56db73-0e9e-47af-b0bd-77231fe40077","Type":"ContainerDied","Data":"a88dfb7f27e61d13c4de942fec09301aec287d6967f6ca991e690f9c9c77a8e1"} Mar 13 14:24:50 crc kubenswrapper[4898]: I0313 14:24:50.660688 4898 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d56db73-0e9e-47af-b0bd-77231fe40077","Type":"ContainerDied","Data":"f40862770a13b51268c0d1e7b9c2896a6335c6f3b3801074579c313c3d13577e"} Mar 13 14:24:51 crc kubenswrapper[4898]: I0313 14:24:51.679414 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"8a198c14-e13f-4858-87c4-de6be0fa8d0c","Type":"ContainerStarted","Data":"27a74493bbe48a66536d1e7cb863f2ab01d0775f8744b183e0e5dbb10c8028c3"} Mar 13 14:24:51 crc kubenswrapper[4898]: I0313 14:24:51.679980 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"8a198c14-e13f-4858-87c4-de6be0fa8d0c","Type":"ContainerStarted","Data":"e5cbcba754d8ed9be93f24af64837c85b3284b2874ba52299b17de05cb9bfbbf"} Mar 13 14:24:51 crc kubenswrapper[4898]: I0313 14:24:51.705719 4898 generic.go:334] "Generic (PLEG): container finished" podID="0d56db73-0e9e-47af-b0bd-77231fe40077" containerID="f93c4099691f41a073e964731ace4e3b38e62bd41c90cb8a30591394175252ce" exitCode=0 Mar 13 14:24:51 crc kubenswrapper[4898]: I0313 14:24:51.705773 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d56db73-0e9e-47af-b0bd-77231fe40077","Type":"ContainerDied","Data":"f93c4099691f41a073e964731ace4e3b38e62bd41c90cb8a30591394175252ce"} Mar 13 14:24:51 crc kubenswrapper[4898]: I0313 14:24:51.712820 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=2.165032711 podStartE2EDuration="2.712793288s" podCreationTimestamp="2026-03-13 14:24:49 +0000 UTC" firstStartedPulling="2026-03-13 14:24:50.658528516 +0000 UTC m=+1725.660116755" lastFinishedPulling="2026-03-13 14:24:51.206289093 +0000 UTC m=+1726.207877332" observedRunningTime="2026-03-13 14:24:51.700195057 +0000 UTC m=+1726.701783306" watchObservedRunningTime="2026-03-13 14:24:51.712793288 +0000 UTC m=+1726.714381557" Mar 13 
14:24:51 crc kubenswrapper[4898]: I0313 14:24:51.742660 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" Mar 13 14:24:51 crc kubenswrapper[4898]: E0313 14:24:51.742990 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.721107 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b7452a36-0169-4cfe-9ede-ef4d0ef072d9","Type":"ContainerStarted","Data":"b4de4756fb05eb0a0367dd16335fa2c88e7c0f23e270a7655f16e0624156257d"} Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.721530 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.729413 4898 generic.go:334] "Generic (PLEG): container finished" podID="0d56db73-0e9e-47af-b0bd-77231fe40077" containerID="72a159d4ac7d8e712d9bffb75add791bdb6f8bee323ccb72d06431a42be68d77" exitCode=0 Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.729680 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d56db73-0e9e-47af-b0bd-77231fe40077","Type":"ContainerDied","Data":"72a159d4ac7d8e712d9bffb75add791bdb6f8bee323ccb72d06431a42be68d77"} Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.729725 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d56db73-0e9e-47af-b0bd-77231fe40077","Type":"ContainerDied","Data":"a79c1dd41b2637198b516933e99a0d19e49eb0f06da00b903e6a8bd5f7dcb1dc"} Mar 13 
14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.729738 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a79c1dd41b2637198b516933e99a0d19e49eb0f06da00b903e6a8bd5f7dcb1dc" Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.771475 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.789728 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.435414337 podStartE2EDuration="3.789704836s" podCreationTimestamp="2026-03-13 14:24:49 +0000 UTC" firstStartedPulling="2026-03-13 14:24:50.496189958 +0000 UTC m=+1725.497778197" lastFinishedPulling="2026-03-13 14:24:51.850480457 +0000 UTC m=+1726.852068696" observedRunningTime="2026-03-13 14:24:52.741368456 +0000 UTC m=+1727.742956695" watchObservedRunningTime="2026-03-13 14:24:52.789704836 +0000 UTC m=+1727.791293095" Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.903881 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d56db73-0e9e-47af-b0bd-77231fe40077-run-httpd\") pod \"0d56db73-0e9e-47af-b0bd-77231fe40077\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.904087 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-scripts\") pod \"0d56db73-0e9e-47af-b0bd-77231fe40077\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.904156 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-combined-ca-bundle\") pod \"0d56db73-0e9e-47af-b0bd-77231fe40077\" (UID: 
\"0d56db73-0e9e-47af-b0bd-77231fe40077\") " Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.904233 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-config-data\") pod \"0d56db73-0e9e-47af-b0bd-77231fe40077\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.904386 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk9pt\" (UniqueName: \"kubernetes.io/projected/0d56db73-0e9e-47af-b0bd-77231fe40077-kube-api-access-tk9pt\") pod \"0d56db73-0e9e-47af-b0bd-77231fe40077\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.904449 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d56db73-0e9e-47af-b0bd-77231fe40077-log-httpd\") pod \"0d56db73-0e9e-47af-b0bd-77231fe40077\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.904568 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d56db73-0e9e-47af-b0bd-77231fe40077-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0d56db73-0e9e-47af-b0bd-77231fe40077" (UID: "0d56db73-0e9e-47af-b0bd-77231fe40077"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.904605 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-sg-core-conf-yaml\") pod \"0d56db73-0e9e-47af-b0bd-77231fe40077\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.904851 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d56db73-0e9e-47af-b0bd-77231fe40077-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0d56db73-0e9e-47af-b0bd-77231fe40077" (UID: "0d56db73-0e9e-47af-b0bd-77231fe40077"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.909758 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-scripts" (OuterVolumeSpecName: "scripts") pod "0d56db73-0e9e-47af-b0bd-77231fe40077" (UID: "0d56db73-0e9e-47af-b0bd-77231fe40077"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.912236 4898 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d56db73-0e9e-47af-b0bd-77231fe40077-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.912263 4898 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d56db73-0e9e-47af-b0bd-77231fe40077-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.912272 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.918289 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d56db73-0e9e-47af-b0bd-77231fe40077-kube-api-access-tk9pt" (OuterVolumeSpecName: "kube-api-access-tk9pt") pod "0d56db73-0e9e-47af-b0bd-77231fe40077" (UID: "0d56db73-0e9e-47af-b0bd-77231fe40077"). InnerVolumeSpecName "kube-api-access-tk9pt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:24:52 crc kubenswrapper[4898]: I0313 14:24:52.942892 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0d56db73-0e9e-47af-b0bd-77231fe40077" (UID: "0d56db73-0e9e-47af-b0bd-77231fe40077"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.013229 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0d56db73-0e9e-47af-b0bd-77231fe40077" (UID: "0d56db73-0e9e-47af-b0bd-77231fe40077"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.014002 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-combined-ca-bundle\") pod \"0d56db73-0e9e-47af-b0bd-77231fe40077\" (UID: \"0d56db73-0e9e-47af-b0bd-77231fe40077\") " Mar 13 14:24:53 crc kubenswrapper[4898]: W0313 14:24:53.014135 4898 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/0d56db73-0e9e-47af-b0bd-77231fe40077/volumes/kubernetes.io~secret/combined-ca-bundle Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.014152 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0d56db73-0e9e-47af-b0bd-77231fe40077" (UID: "0d56db73-0e9e-47af-b0bd-77231fe40077"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.014706 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.014726 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk9pt\" (UniqueName: \"kubernetes.io/projected/0d56db73-0e9e-47af-b0bd-77231fe40077-kube-api-access-tk9pt\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.014739 4898 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.032076 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-config-data" (OuterVolumeSpecName: "config-data") pod "0d56db73-0e9e-47af-b0bd-77231fe40077" (UID: "0d56db73-0e9e-47af-b0bd-77231fe40077"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.117072 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d56db73-0e9e-47af-b0bd-77231fe40077-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.739164 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.775829 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.787695 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.804463 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:24:53 crc kubenswrapper[4898]: E0313 14:24:53.805281 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d56db73-0e9e-47af-b0bd-77231fe40077" containerName="sg-core" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.805381 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d56db73-0e9e-47af-b0bd-77231fe40077" containerName="sg-core" Mar 13 14:24:53 crc kubenswrapper[4898]: E0313 14:24:53.805460 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d56db73-0e9e-47af-b0bd-77231fe40077" containerName="ceilometer-notification-agent" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.805512 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d56db73-0e9e-47af-b0bd-77231fe40077" containerName="ceilometer-notification-agent" Mar 13 14:24:53 crc kubenswrapper[4898]: E0313 14:24:53.805612 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d56db73-0e9e-47af-b0bd-77231fe40077" containerName="proxy-httpd" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.805670 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d56db73-0e9e-47af-b0bd-77231fe40077" containerName="proxy-httpd" Mar 13 14:24:53 crc kubenswrapper[4898]: E0313 14:24:53.805739 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d56db73-0e9e-47af-b0bd-77231fe40077" containerName="ceilometer-central-agent" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.805800 4898 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="0d56db73-0e9e-47af-b0bd-77231fe40077" containerName="ceilometer-central-agent" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.806093 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d56db73-0e9e-47af-b0bd-77231fe40077" containerName="proxy-httpd" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.806213 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d56db73-0e9e-47af-b0bd-77231fe40077" containerName="ceilometer-notification-agent" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.806290 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d56db73-0e9e-47af-b0bd-77231fe40077" containerName="ceilometer-central-agent" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.806351 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d56db73-0e9e-47af-b0bd-77231fe40077" containerName="sg-core" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.808630 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.813164 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.813549 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.813873 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.815334 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.936602 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.937089 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0ac06d2-e2ea-4b4a-8201-83494b53b968-log-httpd\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.937349 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.937516 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-config-data\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.937817 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.938053 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-scripts\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.938262 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlm6p\" (UniqueName: \"kubernetes.io/projected/b0ac06d2-e2ea-4b4a-8201-83494b53b968-kube-api-access-xlm6p\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:53 crc kubenswrapper[4898]: I0313 14:24:53.938451 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0ac06d2-e2ea-4b4a-8201-83494b53b968-run-httpd\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:54 crc kubenswrapper[4898]: I0313 14:24:54.041104 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:54 crc kubenswrapper[4898]: I0313 14:24:54.041206 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0ac06d2-e2ea-4b4a-8201-83494b53b968-log-httpd\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:54 crc kubenswrapper[4898]: I0313 14:24:54.041248 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:54 crc kubenswrapper[4898]: I0313 14:24:54.041271 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-config-data\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:54 crc kubenswrapper[4898]: I0313 14:24:54.041415 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:54 crc kubenswrapper[4898]: I0313 14:24:54.041478 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-scripts\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:54 crc kubenswrapper[4898]: I0313 14:24:54.041530 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlm6p\" 
(UniqueName: \"kubernetes.io/projected/b0ac06d2-e2ea-4b4a-8201-83494b53b968-kube-api-access-xlm6p\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:54 crc kubenswrapper[4898]: I0313 14:24:54.041620 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0ac06d2-e2ea-4b4a-8201-83494b53b968-run-httpd\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:54 crc kubenswrapper[4898]: I0313 14:24:54.042103 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0ac06d2-e2ea-4b4a-8201-83494b53b968-run-httpd\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:54 crc kubenswrapper[4898]: I0313 14:24:54.042433 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0ac06d2-e2ea-4b4a-8201-83494b53b968-log-httpd\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:54 crc kubenswrapper[4898]: I0313 14:24:54.048537 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-config-data\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:54 crc kubenswrapper[4898]: I0313 14:24:54.048690 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:54 crc kubenswrapper[4898]: I0313 14:24:54.048563 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-scripts\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:54 crc kubenswrapper[4898]: I0313 14:24:54.055637 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:54 crc kubenswrapper[4898]: I0313 14:24:54.057832 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:54 crc kubenswrapper[4898]: I0313 14:24:54.059211 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlm6p\" (UniqueName: \"kubernetes.io/projected/b0ac06d2-e2ea-4b4a-8201-83494b53b968-kube-api-access-xlm6p\") pod \"ceilometer-0\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " pod="openstack/ceilometer-0" Mar 13 14:24:54 crc kubenswrapper[4898]: I0313 14:24:54.134192 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:24:54 crc kubenswrapper[4898]: I0313 14:24:54.694435 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:24:54 crc kubenswrapper[4898]: I0313 14:24:54.754612 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0ac06d2-e2ea-4b4a-8201-83494b53b968","Type":"ContainerStarted","Data":"efc08f6e1f1c1bae444792fe7fa9bd5076b4f986e80e97128cc5f1ec8235c524"} Mar 13 14:24:55 crc kubenswrapper[4898]: I0313 14:24:55.763321 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d56db73-0e9e-47af-b0bd-77231fe40077" path="/var/lib/kubelet/pods/0d56db73-0e9e-47af-b0bd-77231fe40077/volumes" Mar 13 14:24:55 crc kubenswrapper[4898]: I0313 14:24:55.771688 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0ac06d2-e2ea-4b4a-8201-83494b53b968","Type":"ContainerStarted","Data":"dbe55f5873c440c467d8748cfaa995fee6ccd0abf9441bf1f70ed0dda90073d3"} Mar 13 14:24:56 crc kubenswrapper[4898]: I0313 14:24:56.785689 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0ac06d2-e2ea-4b4a-8201-83494b53b968","Type":"ContainerStarted","Data":"c66d5e607033edc265c5c4c3b44b5d453515d5500b6db940b367950853043279"} Mar 13 14:24:57 crc kubenswrapper[4898]: I0313 14:24:57.413484 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-zgt75"] Mar 13 14:24:57 crc kubenswrapper[4898]: I0313 14:24:57.424429 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-zgt75"] Mar 13 14:24:57 crc kubenswrapper[4898]: I0313 14:24:57.492518 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-kxtcf"] Mar 13 14:24:57 crc kubenswrapper[4898]: I0313 14:24:57.494119 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-kxtcf" Mar 13 14:24:57 crc kubenswrapper[4898]: I0313 14:24:57.512217 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-kxtcf"] Mar 13 14:24:57 crc kubenswrapper[4898]: I0313 14:24:57.626803 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd78a2a-1bb4-461a-92cd-d705080b087a-combined-ca-bundle\") pod \"heat-db-sync-kxtcf\" (UID: \"2cd78a2a-1bb4-461a-92cd-d705080b087a\") " pod="openstack/heat-db-sync-kxtcf" Mar 13 14:24:57 crc kubenswrapper[4898]: I0313 14:24:57.626875 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z25rd\" (UniqueName: \"kubernetes.io/projected/2cd78a2a-1bb4-461a-92cd-d705080b087a-kube-api-access-z25rd\") pod \"heat-db-sync-kxtcf\" (UID: \"2cd78a2a-1bb4-461a-92cd-d705080b087a\") " pod="openstack/heat-db-sync-kxtcf" Mar 13 14:24:57 crc kubenswrapper[4898]: I0313 14:24:57.626920 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd78a2a-1bb4-461a-92cd-d705080b087a-config-data\") pod \"heat-db-sync-kxtcf\" (UID: \"2cd78a2a-1bb4-461a-92cd-d705080b087a\") " pod="openstack/heat-db-sync-kxtcf" Mar 13 14:24:57 crc kubenswrapper[4898]: I0313 14:24:57.729039 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd78a2a-1bb4-461a-92cd-d705080b087a-combined-ca-bundle\") pod \"heat-db-sync-kxtcf\" (UID: \"2cd78a2a-1bb4-461a-92cd-d705080b087a\") " pod="openstack/heat-db-sync-kxtcf" Mar 13 14:24:57 crc kubenswrapper[4898]: I0313 14:24:57.729145 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z25rd\" (UniqueName: 
\"kubernetes.io/projected/2cd78a2a-1bb4-461a-92cd-d705080b087a-kube-api-access-z25rd\") pod \"heat-db-sync-kxtcf\" (UID: \"2cd78a2a-1bb4-461a-92cd-d705080b087a\") " pod="openstack/heat-db-sync-kxtcf" Mar 13 14:24:57 crc kubenswrapper[4898]: I0313 14:24:57.729192 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd78a2a-1bb4-461a-92cd-d705080b087a-config-data\") pod \"heat-db-sync-kxtcf\" (UID: \"2cd78a2a-1bb4-461a-92cd-d705080b087a\") " pod="openstack/heat-db-sync-kxtcf" Mar 13 14:24:57 crc kubenswrapper[4898]: I0313 14:24:57.735339 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd78a2a-1bb4-461a-92cd-d705080b087a-combined-ca-bundle\") pod \"heat-db-sync-kxtcf\" (UID: \"2cd78a2a-1bb4-461a-92cd-d705080b087a\") " pod="openstack/heat-db-sync-kxtcf" Mar 13 14:24:57 crc kubenswrapper[4898]: I0313 14:24:57.759051 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84a7fd24-4320-4c0e-8ded-0d455252a549" path="/var/lib/kubelet/pods/84a7fd24-4320-4c0e-8ded-0d455252a549/volumes" Mar 13 14:24:57 crc kubenswrapper[4898]: I0313 14:24:57.759539 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z25rd\" (UniqueName: \"kubernetes.io/projected/2cd78a2a-1bb4-461a-92cd-d705080b087a-kube-api-access-z25rd\") pod \"heat-db-sync-kxtcf\" (UID: \"2cd78a2a-1bb4-461a-92cd-d705080b087a\") " pod="openstack/heat-db-sync-kxtcf" Mar 13 14:24:57 crc kubenswrapper[4898]: I0313 14:24:57.761745 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd78a2a-1bb4-461a-92cd-d705080b087a-config-data\") pod \"heat-db-sync-kxtcf\" (UID: \"2cd78a2a-1bb4-461a-92cd-d705080b087a\") " pod="openstack/heat-db-sync-kxtcf" Mar 13 14:24:57 crc kubenswrapper[4898]: I0313 14:24:57.804528 4898 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0ac06d2-e2ea-4b4a-8201-83494b53b968","Type":"ContainerStarted","Data":"c0655b2adb5618887bc26f0a3bb0d551a636cca41a03c1baf5cc0685920b55bb"} Mar 13 14:24:57 crc kubenswrapper[4898]: I0313 14:24:57.814795 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-kxtcf" Mar 13 14:24:58 crc kubenswrapper[4898]: I0313 14:24:58.372286 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-kxtcf"] Mar 13 14:24:58 crc kubenswrapper[4898]: I0313 14:24:58.815817 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-kxtcf" event={"ID":"2cd78a2a-1bb4-461a-92cd-d705080b087a","Type":"ContainerStarted","Data":"f0ef8052f16886ece221ecf56528cf884da231c4fa187db604454c6c5925f956"} Mar 13 14:24:59 crc kubenswrapper[4898]: I0313 14:24:59.513441 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 13 14:24:59 crc kubenswrapper[4898]: I0313 14:24:59.843205 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0ac06d2-e2ea-4b4a-8201-83494b53b968","Type":"ContainerStarted","Data":"d2f87292a607e1fb76b72fbf0fd5fba62057ee2d194e12f77b3db9510fddedf2"} Mar 13 14:24:59 crc kubenswrapper[4898]: I0313 14:24:59.843459 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 14:24:59 crc kubenswrapper[4898]: I0313 14:24:59.874030 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.855993989 podStartE2EDuration="6.874011189s" podCreationTimestamp="2026-03-13 14:24:53 +0000 UTC" firstStartedPulling="2026-03-13 14:24:54.69045719 +0000 UTC m=+1729.692045439" lastFinishedPulling="2026-03-13 14:24:58.7084744 +0000 UTC m=+1733.710062639" observedRunningTime="2026-03-13 14:24:59.869263114 +0000 UTC m=+1734.870851363" 
watchObservedRunningTime="2026-03-13 14:24:59.874011189 +0000 UTC m=+1734.875599428" Mar 13 14:24:59 crc kubenswrapper[4898]: I0313 14:24:59.924108 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 13 14:25:00 crc kubenswrapper[4898]: I0313 14:25:00.834984 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 14:25:01 crc kubenswrapper[4898]: I0313 14:25:01.037053 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:25:01 crc kubenswrapper[4898]: I0313 14:25:01.869773 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b0ac06d2-e2ea-4b4a-8201-83494b53b968" containerName="ceilometer-central-agent" containerID="cri-o://dbe55f5873c440c467d8748cfaa995fee6ccd0abf9441bf1f70ed0dda90073d3" gracePeriod=30 Mar 13 14:25:01 crc kubenswrapper[4898]: I0313 14:25:01.870249 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b0ac06d2-e2ea-4b4a-8201-83494b53b968" containerName="proxy-httpd" containerID="cri-o://d2f87292a607e1fb76b72fbf0fd5fba62057ee2d194e12f77b3db9510fddedf2" gracePeriod=30 Mar 13 14:25:01 crc kubenswrapper[4898]: I0313 14:25:01.870405 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b0ac06d2-e2ea-4b4a-8201-83494b53b968" containerName="sg-core" containerID="cri-o://c0655b2adb5618887bc26f0a3bb0d551a636cca41a03c1baf5cc0685920b55bb" gracePeriod=30 Mar 13 14:25:01 crc kubenswrapper[4898]: I0313 14:25:01.870456 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b0ac06d2-e2ea-4b4a-8201-83494b53b968" containerName="ceilometer-notification-agent" containerID="cri-o://c66d5e607033edc265c5c4c3b44b5d453515d5500b6db940b367950853043279" gracePeriod=30 Mar 13 14:25:02 crc 
kubenswrapper[4898]: I0313 14:25:02.918285 4898 generic.go:334] "Generic (PLEG): container finished" podID="b0ac06d2-e2ea-4b4a-8201-83494b53b968" containerID="d2f87292a607e1fb76b72fbf0fd5fba62057ee2d194e12f77b3db9510fddedf2" exitCode=0 Mar 13 14:25:02 crc kubenswrapper[4898]: I0313 14:25:02.919728 4898 generic.go:334] "Generic (PLEG): container finished" podID="b0ac06d2-e2ea-4b4a-8201-83494b53b968" containerID="c0655b2adb5618887bc26f0a3bb0d551a636cca41a03c1baf5cc0685920b55bb" exitCode=2 Mar 13 14:25:02 crc kubenswrapper[4898]: I0313 14:25:02.919744 4898 generic.go:334] "Generic (PLEG): container finished" podID="b0ac06d2-e2ea-4b4a-8201-83494b53b968" containerID="c66d5e607033edc265c5c4c3b44b5d453515d5500b6db940b367950853043279" exitCode=0 Mar 13 14:25:02 crc kubenswrapper[4898]: I0313 14:25:02.919769 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0ac06d2-e2ea-4b4a-8201-83494b53b968","Type":"ContainerDied","Data":"d2f87292a607e1fb76b72fbf0fd5fba62057ee2d194e12f77b3db9510fddedf2"} Mar 13 14:25:02 crc kubenswrapper[4898]: I0313 14:25:02.919801 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0ac06d2-e2ea-4b4a-8201-83494b53b968","Type":"ContainerDied","Data":"c0655b2adb5618887bc26f0a3bb0d551a636cca41a03c1baf5cc0685920b55bb"} Mar 13 14:25:02 crc kubenswrapper[4898]: I0313 14:25:02.919817 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0ac06d2-e2ea-4b4a-8201-83494b53b968","Type":"ContainerDied","Data":"c66d5e607033edc265c5c4c3b44b5d453515d5500b6db940b367950853043279"} Mar 13 14:25:04 crc kubenswrapper[4898]: I0313 14:25:04.690111 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-2" podUID="ee084354-4d32-4d3c-96a4-1e4e7eef5d85" containerName="rabbitmq" containerID="cri-o://fdd228971531e06c4cfdc0dd4d0052c10c0646d03035ab33629bba605b7a9d8b" gracePeriod=604795 Mar 13 
14:25:04 crc kubenswrapper[4898]: I0313 14:25:04.953406 4898 generic.go:334] "Generic (PLEG): container finished" podID="b0ac06d2-e2ea-4b4a-8201-83494b53b968" containerID="dbe55f5873c440c467d8748cfaa995fee6ccd0abf9441bf1f70ed0dda90073d3" exitCode=0 Mar 13 14:25:04 crc kubenswrapper[4898]: I0313 14:25:04.953482 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0ac06d2-e2ea-4b4a-8201-83494b53b968","Type":"ContainerDied","Data":"dbe55f5873c440c467d8748cfaa995fee6ccd0abf9441bf1f70ed0dda90073d3"} Mar 13 14:25:04 crc kubenswrapper[4898]: I0313 14:25:04.953781 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b0ac06d2-e2ea-4b4a-8201-83494b53b968","Type":"ContainerDied","Data":"efc08f6e1f1c1bae444792fe7fa9bd5076b4f986e80e97128cc5f1ec8235c524"} Mar 13 14:25:04 crc kubenswrapper[4898]: I0313 14:25:04.953797 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efc08f6e1f1c1bae444792fe7fa9bd5076b4f986e80e97128cc5f1ec8235c524" Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.031754 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.164163 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-combined-ca-bundle\") pod \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.164399 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0ac06d2-e2ea-4b4a-8201-83494b53b968-log-httpd\") pod \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.164422 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-sg-core-conf-yaml\") pod \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.164562 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-config-data\") pod \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.164734 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlm6p\" (UniqueName: \"kubernetes.io/projected/b0ac06d2-e2ea-4b4a-8201-83494b53b968-kube-api-access-xlm6p\") pod \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.164782 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-ceilometer-tls-certs\") pod \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.164797 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0ac06d2-e2ea-4b4a-8201-83494b53b968-run-httpd\") pod \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.164823 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-scripts\") pod \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\" (UID: \"b0ac06d2-e2ea-4b4a-8201-83494b53b968\") " Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.171805 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0ac06d2-e2ea-4b4a-8201-83494b53b968-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b0ac06d2-e2ea-4b4a-8201-83494b53b968" (UID: "b0ac06d2-e2ea-4b4a-8201-83494b53b968"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.172105 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0ac06d2-e2ea-4b4a-8201-83494b53b968-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b0ac06d2-e2ea-4b4a-8201-83494b53b968" (UID: "b0ac06d2-e2ea-4b4a-8201-83494b53b968"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.194022 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-scripts" (OuterVolumeSpecName: "scripts") pod "b0ac06d2-e2ea-4b4a-8201-83494b53b968" (UID: "b0ac06d2-e2ea-4b4a-8201-83494b53b968"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.195440 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0ac06d2-e2ea-4b4a-8201-83494b53b968-kube-api-access-xlm6p" (OuterVolumeSpecName: "kube-api-access-xlm6p") pod "b0ac06d2-e2ea-4b4a-8201-83494b53b968" (UID: "b0ac06d2-e2ea-4b4a-8201-83494b53b968"). InnerVolumeSpecName "kube-api-access-xlm6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.267704 4898 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0ac06d2-e2ea-4b4a-8201-83494b53b968-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.267731 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlm6p\" (UniqueName: \"kubernetes.io/projected/b0ac06d2-e2ea-4b4a-8201-83494b53b968-kube-api-access-xlm6p\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.267741 4898 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b0ac06d2-e2ea-4b4a-8201-83494b53b968-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.267749 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:05 
crc kubenswrapper[4898]: I0313 14:25:05.283417 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b0ac06d2-e2ea-4b4a-8201-83494b53b968" (UID: "b0ac06d2-e2ea-4b4a-8201-83494b53b968"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.284453 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "b0ac06d2-e2ea-4b4a-8201-83494b53b968" (UID: "b0ac06d2-e2ea-4b4a-8201-83494b53b968"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.305190 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0ac06d2-e2ea-4b4a-8201-83494b53b968" (UID: "b0ac06d2-e2ea-4b4a-8201-83494b53b968"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.352993 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-config-data" (OuterVolumeSpecName: "config-data") pod "b0ac06d2-e2ea-4b4a-8201-83494b53b968" (UID: "b0ac06d2-e2ea-4b4a-8201-83494b53b968"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.370124 4898 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.370175 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.370184 4898 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.370194 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0ac06d2-e2ea-4b4a-8201-83494b53b968-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.586198 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="d56bd826-4f42-409d-ae41-9bfc70d1e038" containerName="rabbitmq" containerID="cri-o://d377b62f42012aae1789077dde2b4c09f8f770f73f941f01fe11eb21f5b88378" gracePeriod=604796 Mar 13 14:25:05 crc kubenswrapper[4898]: I0313 14:25:05.971305 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.022704 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.084336 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.117312 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:25:06 crc kubenswrapper[4898]: E0313 14:25:06.117874 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0ac06d2-e2ea-4b4a-8201-83494b53b968" containerName="ceilometer-central-agent" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.117892 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0ac06d2-e2ea-4b4a-8201-83494b53b968" containerName="ceilometer-central-agent" Mar 13 14:25:06 crc kubenswrapper[4898]: E0313 14:25:06.117943 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0ac06d2-e2ea-4b4a-8201-83494b53b968" containerName="proxy-httpd" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.117950 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0ac06d2-e2ea-4b4a-8201-83494b53b968" containerName="proxy-httpd" Mar 13 14:25:06 crc kubenswrapper[4898]: E0313 14:25:06.117968 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0ac06d2-e2ea-4b4a-8201-83494b53b968" containerName="ceilometer-notification-agent" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.117974 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0ac06d2-e2ea-4b4a-8201-83494b53b968" containerName="ceilometer-notification-agent" Mar 13 14:25:06 crc kubenswrapper[4898]: E0313 14:25:06.117995 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0ac06d2-e2ea-4b4a-8201-83494b53b968" containerName="sg-core" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.118002 4898 
state_mem.go:107] "Deleted CPUSet assignment" podUID="b0ac06d2-e2ea-4b4a-8201-83494b53b968" containerName="sg-core" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.118216 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0ac06d2-e2ea-4b4a-8201-83494b53b968" containerName="proxy-httpd" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.118235 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0ac06d2-e2ea-4b4a-8201-83494b53b968" containerName="ceilometer-central-agent" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.118263 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0ac06d2-e2ea-4b4a-8201-83494b53b968" containerName="sg-core" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.118277 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0ac06d2-e2ea-4b4a-8201-83494b53b968" containerName="ceilometer-notification-agent" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.120744 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.126793 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.127098 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.127412 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.134067 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.200721 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02f7d483-aecb-4a39-babc-6d9598090c4b-run-httpd\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.201235 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/02f7d483-aecb-4a39-babc-6d9598090c4b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.201391 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02f7d483-aecb-4a39-babc-6d9598090c4b-log-httpd\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.201465 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-w25qw\" (UniqueName: \"kubernetes.io/projected/02f7d483-aecb-4a39-babc-6d9598090c4b-kube-api-access-w25qw\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.201566 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/02f7d483-aecb-4a39-babc-6d9598090c4b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.201675 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02f7d483-aecb-4a39-babc-6d9598090c4b-scripts\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.201755 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02f7d483-aecb-4a39-babc-6d9598090c4b-config-data\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.201880 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02f7d483-aecb-4a39-babc-6d9598090c4b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.304597 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/02f7d483-aecb-4a39-babc-6d9598090c4b-ceilometer-tls-certs\") pod \"ceilometer-0\" 
(UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.304680 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02f7d483-aecb-4a39-babc-6d9598090c4b-log-httpd\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.304701 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w25qw\" (UniqueName: \"kubernetes.io/projected/02f7d483-aecb-4a39-babc-6d9598090c4b-kube-api-access-w25qw\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.304747 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/02f7d483-aecb-4a39-babc-6d9598090c4b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.304764 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02f7d483-aecb-4a39-babc-6d9598090c4b-scripts\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.304787 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02f7d483-aecb-4a39-babc-6d9598090c4b-config-data\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.304814 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02f7d483-aecb-4a39-babc-6d9598090c4b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.304867 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02f7d483-aecb-4a39-babc-6d9598090c4b-run-httpd\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.305338 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02f7d483-aecb-4a39-babc-6d9598090c4b-run-httpd\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.310052 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/02f7d483-aecb-4a39-babc-6d9598090c4b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.313368 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02f7d483-aecb-4a39-babc-6d9598090c4b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.316137 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02f7d483-aecb-4a39-babc-6d9598090c4b-scripts\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 
14:25:06.316395 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/02f7d483-aecb-4a39-babc-6d9598090c4b-log-httpd\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.318774 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/02f7d483-aecb-4a39-babc-6d9598090c4b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.337475 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w25qw\" (UniqueName: \"kubernetes.io/projected/02f7d483-aecb-4a39-babc-6d9598090c4b-kube-api-access-w25qw\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.339623 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02f7d483-aecb-4a39-babc-6d9598090c4b-config-data\") pod \"ceilometer-0\" (UID: \"02f7d483-aecb-4a39-babc-6d9598090c4b\") " pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.449223 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.739468 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" Mar 13 14:25:06 crc kubenswrapper[4898]: E0313 14:25:06.740155 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:25:06 crc kubenswrapper[4898]: I0313 14:25:06.892050 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="d56bd826-4f42-409d-ae41-9bfc70d1e038" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.133:5671: connect: connection refused" Mar 13 14:25:07 crc kubenswrapper[4898]: I0313 14:25:07.009736 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 14:25:07 crc kubenswrapper[4898]: I0313 14:25:07.212469 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-2" podUID="ee084354-4d32-4d3c-96a4-1e4e7eef5d85" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.135:5671: connect: connection refused" Mar 13 14:25:07 crc kubenswrapper[4898]: I0313 14:25:07.753764 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0ac06d2-e2ea-4b4a-8201-83494b53b968" path="/var/lib/kubelet/pods/b0ac06d2-e2ea-4b4a-8201-83494b53b968/volumes" Mar 13 14:25:08 crc kubenswrapper[4898]: I0313 14:25:08.006657 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"02f7d483-aecb-4a39-babc-6d9598090c4b","Type":"ContainerStarted","Data":"a6232eb82b64b66b366d42e19a0e8f84b5b11c39d40c14d2624d531d50080332"} Mar 13 14:25:11 crc kubenswrapper[4898]: I0313 14:25:11.112201 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"ee084354-4d32-4d3c-96a4-1e4e7eef5d85","Type":"ContainerDied","Data":"fdd228971531e06c4cfdc0dd4d0052c10c0646d03035ab33629bba605b7a9d8b"} Mar 13 14:25:11 crc kubenswrapper[4898]: I0313 14:25:11.112381 4898 generic.go:334] "Generic (PLEG): container finished" podID="ee084354-4d32-4d3c-96a4-1e4e7eef5d85" containerID="fdd228971531e06c4cfdc0dd4d0052c10c0646d03035ab33629bba605b7a9d8b" exitCode=0 Mar 13 14:25:12 crc kubenswrapper[4898]: I0313 14:25:12.128685 4898 generic.go:334] "Generic (PLEG): container finished" podID="d56bd826-4f42-409d-ae41-9bfc70d1e038" containerID="d377b62f42012aae1789077dde2b4c09f8f770f73f941f01fe11eb21f5b88378" exitCode=0 Mar 13 14:25:12 crc kubenswrapper[4898]: I0313 14:25:12.128734 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d56bd826-4f42-409d-ae41-9bfc70d1e038","Type":"ContainerDied","Data":"d377b62f42012aae1789077dde2b4c09f8f770f73f941f01fe11eb21f5b88378"} Mar 13 14:25:13 crc kubenswrapper[4898]: I0313 14:25:13.958308 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-r4rvf"] Mar 13 14:25:13 crc kubenswrapper[4898]: I0313 14:25:13.960874 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:13 crc kubenswrapper[4898]: I0313 14:25:13.965954 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 13 14:25:13 crc kubenswrapper[4898]: I0313 14:25:13.975002 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-r4rvf"] Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.128850 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-dns-svc\") pod \"dnsmasq-dns-7d84b4d45c-r4rvf\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.129059 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-ovsdbserver-nb\") pod \"dnsmasq-dns-7d84b4d45c-r4rvf\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.129133 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs27f\" (UniqueName: \"kubernetes.io/projected/703503be-2f03-4e95-b4ba-ebdd30b717ee-kube-api-access-vs27f\") pod \"dnsmasq-dns-7d84b4d45c-r4rvf\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.129185 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-ovsdbserver-sb\") pod \"dnsmasq-dns-7d84b4d45c-r4rvf\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " 
pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.129257 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-config\") pod \"dnsmasq-dns-7d84b4d45c-r4rvf\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.129296 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-dns-swift-storage-0\") pod \"dnsmasq-dns-7d84b4d45c-r4rvf\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.129378 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-openstack-edpm-ipam\") pod \"dnsmasq-dns-7d84b4d45c-r4rvf\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.231669 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-ovsdbserver-nb\") pod \"dnsmasq-dns-7d84b4d45c-r4rvf\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.231733 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vs27f\" (UniqueName: \"kubernetes.io/projected/703503be-2f03-4e95-b4ba-ebdd30b717ee-kube-api-access-vs27f\") pod \"dnsmasq-dns-7d84b4d45c-r4rvf\" (UID: 
\"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.231754 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-ovsdbserver-sb\") pod \"dnsmasq-dns-7d84b4d45c-r4rvf\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.231798 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-config\") pod \"dnsmasq-dns-7d84b4d45c-r4rvf\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.231818 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-dns-swift-storage-0\") pod \"dnsmasq-dns-7d84b4d45c-r4rvf\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.231863 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-openstack-edpm-ipam\") pod \"dnsmasq-dns-7d84b4d45c-r4rvf\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.231972 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-dns-svc\") pod \"dnsmasq-dns-7d84b4d45c-r4rvf\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " 
pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.233341 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-dns-svc\") pod \"dnsmasq-dns-7d84b4d45c-r4rvf\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.236485 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-ovsdbserver-sb\") pod \"dnsmasq-dns-7d84b4d45c-r4rvf\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.241648 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-dns-swift-storage-0\") pod \"dnsmasq-dns-7d84b4d45c-r4rvf\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.241871 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-ovsdbserver-nb\") pod \"dnsmasq-dns-7d84b4d45c-r4rvf\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.242369 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-config\") pod \"dnsmasq-dns-7d84b4d45c-r4rvf\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.242404 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-openstack-edpm-ipam\") pod \"dnsmasq-dns-7d84b4d45c-r4rvf\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.258541 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs27f\" (UniqueName: \"kubernetes.io/projected/703503be-2f03-4e95-b4ba-ebdd30b717ee-kube-api-access-vs27f\") pod \"dnsmasq-dns-7d84b4d45c-r4rvf\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:14 crc kubenswrapper[4898]: I0313 14:25:14.281237 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.502103 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.594504 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-pod-info\") pod \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.594656 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-plugins-conf\") pod \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.594709 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-plugins\") pod \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.594743 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-config-data\") pod \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.594776 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-erlang-cookie-secret\") pod \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.594874 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" 
(UniqueName: \"kubernetes.io/projected/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-confd\") pod \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.594918 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-erlang-cookie\") pod \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.594949 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-tls\") pod \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.595808 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10c025d7-e381-4716-bf38-98f5cf86aede\") pod \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.595840 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgsn2\" (UniqueName: \"kubernetes.io/projected/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-kube-api-access-kgsn2\") pod \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.595867 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-server-conf\") pod \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\" (UID: \"ee084354-4d32-4d3c-96a4-1e4e7eef5d85\") " Mar 13 
14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.595975 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "ee084354-4d32-4d3c-96a4-1e4e7eef5d85" (UID: "ee084354-4d32-4d3c-96a4-1e4e7eef5d85"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.600151 4898 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.601471 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-pod-info" (OuterVolumeSpecName: "pod-info") pod "ee084354-4d32-4d3c-96a4-1e4e7eef5d85" (UID: "ee084354-4d32-4d3c-96a4-1e4e7eef5d85"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.602413 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "ee084354-4d32-4d3c-96a4-1e4e7eef5d85" (UID: "ee084354-4d32-4d3c-96a4-1e4e7eef5d85"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.606404 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "ee084354-4d32-4d3c-96a4-1e4e7eef5d85" (UID: "ee084354-4d32-4d3c-96a4-1e4e7eef5d85"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.610402 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "ee084354-4d32-4d3c-96a4-1e4e7eef5d85" (UID: "ee084354-4d32-4d3c-96a4-1e4e7eef5d85"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.620448 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "ee084354-4d32-4d3c-96a4-1e4e7eef5d85" (UID: "ee084354-4d32-4d3c-96a4-1e4e7eef5d85"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.623438 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-kube-api-access-kgsn2" (OuterVolumeSpecName: "kube-api-access-kgsn2") pod "ee084354-4d32-4d3c-96a4-1e4e7eef5d85" (UID: "ee084354-4d32-4d3c-96a4-1e4e7eef5d85"). InnerVolumeSpecName "kube-api-access-kgsn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.698413 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-server-conf" (OuterVolumeSpecName: "server-conf") pod "ee084354-4d32-4d3c-96a4-1e4e7eef5d85" (UID: "ee084354-4d32-4d3c-96a4-1e4e7eef5d85"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.704226 4898 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.704253 4898 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.704291 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgsn2\" (UniqueName: \"kubernetes.io/projected/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-kube-api-access-kgsn2\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.704300 4898 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-server-conf\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.704308 4898 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-pod-info\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.704316 4898 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.704323 4898 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:16 crc kubenswrapper[4898]: 
I0313 14:25:16.792125 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-config-data" (OuterVolumeSpecName: "config-data") pod "ee084354-4d32-4d3c-96a4-1e4e7eef5d85" (UID: "ee084354-4d32-4d3c-96a4-1e4e7eef5d85"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.811718 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.957166 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "ee084354-4d32-4d3c-96a4-1e4e7eef5d85" (UID: "ee084354-4d32-4d3c-96a4-1e4e7eef5d85"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:25:16 crc kubenswrapper[4898]: I0313 14:25:16.962959 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10c025d7-e381-4716-bf38-98f5cf86aede" (OuterVolumeSpecName: "persistence") pod "ee084354-4d32-4d3c-96a4-1e4e7eef5d85" (UID: "ee084354-4d32-4d3c-96a4-1e4e7eef5d85"). InnerVolumeSpecName "pvc-10c025d7-e381-4716-bf38-98f5cf86aede". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.016121 4898 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ee084354-4d32-4d3c-96a4-1e4e7eef5d85-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.016183 4898 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-10c025d7-e381-4716-bf38-98f5cf86aede\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10c025d7-e381-4716-bf38-98f5cf86aede\") on node \"crc\" " Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.057468 4898 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.057602 4898 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-10c025d7-e381-4716-bf38-98f5cf86aede" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10c025d7-e381-4716-bf38-98f5cf86aede") on node "crc" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.118702 4898 reconciler_common.go:293] "Volume detached for volume \"pvc-10c025d7-e381-4716-bf38-98f5cf86aede\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10c025d7-e381-4716-bf38-98f5cf86aede\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.243260 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"ee084354-4d32-4d3c-96a4-1e4e7eef5d85","Type":"ContainerDied","Data":"f525c8a2018341f2e27494e3687eb7a4563181188f5faaa51225478656a928a1"} Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.243320 4898 scope.go:117] "RemoveContainer" containerID="fdd228971531e06c4cfdc0dd4d0052c10c0646d03035ab33629bba605b7a9d8b" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.243520 4898 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.291038 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.306237 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.324859 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-2"] Mar 13 14:25:17 crc kubenswrapper[4898]: E0313 14:25:17.325374 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee084354-4d32-4d3c-96a4-1e4e7eef5d85" containerName="setup-container" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.325391 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee084354-4d32-4d3c-96a4-1e4e7eef5d85" containerName="setup-container" Mar 13 14:25:17 crc kubenswrapper[4898]: E0313 14:25:17.325424 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee084354-4d32-4d3c-96a4-1e4e7eef5d85" containerName="rabbitmq" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.325431 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee084354-4d32-4d3c-96a4-1e4e7eef5d85" containerName="rabbitmq" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.325657 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee084354-4d32-4d3c-96a4-1e4e7eef5d85" containerName="rabbitmq" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.327225 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.344982 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.426621 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8d188301-848c-4cf6-a204-e1110714c1be-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.426694 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-10c025d7-e381-4716-bf38-98f5cf86aede\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10c025d7-e381-4716-bf38-98f5cf86aede\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.426777 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8d188301-848c-4cf6-a204-e1110714c1be-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.426825 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8d188301-848c-4cf6-a204-e1110714c1be-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.426855 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" 
(UniqueName: \"kubernetes.io/projected/8d188301-848c-4cf6-a204-e1110714c1be-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.426898 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8d188301-848c-4cf6-a204-e1110714c1be-pod-info\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.426948 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8d188301-848c-4cf6-a204-e1110714c1be-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.426974 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8d188301-848c-4cf6-a204-e1110714c1be-config-data\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.426994 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8d188301-848c-4cf6-a204-e1110714c1be-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.427015 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/8d188301-848c-4cf6-a204-e1110714c1be-server-conf\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.427149 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ztgr\" (UniqueName: \"kubernetes.io/projected/8d188301-848c-4cf6-a204-e1110714c1be-kube-api-access-9ztgr\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.529631 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8d188301-848c-4cf6-a204-e1110714c1be-pod-info\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.529694 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8d188301-848c-4cf6-a204-e1110714c1be-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.529720 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8d188301-848c-4cf6-a204-e1110714c1be-config-data\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.529738 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8d188301-848c-4cf6-a204-e1110714c1be-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: 
\"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.529755 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8d188301-848c-4cf6-a204-e1110714c1be-server-conf\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.529872 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ztgr\" (UniqueName: \"kubernetes.io/projected/8d188301-848c-4cf6-a204-e1110714c1be-kube-api-access-9ztgr\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.529935 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8d188301-848c-4cf6-a204-e1110714c1be-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.529979 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-10c025d7-e381-4716-bf38-98f5cf86aede\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10c025d7-e381-4716-bf38-98f5cf86aede\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.530041 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8d188301-848c-4cf6-a204-e1110714c1be-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc 
kubenswrapper[4898]: I0313 14:25:17.530079 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8d188301-848c-4cf6-a204-e1110714c1be-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.530101 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8d188301-848c-4cf6-a204-e1110714c1be-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.530549 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8d188301-848c-4cf6-a204-e1110714c1be-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.530970 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8d188301-848c-4cf6-a204-e1110714c1be-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.531311 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8d188301-848c-4cf6-a204-e1110714c1be-config-data\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.531777 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/8d188301-848c-4cf6-a204-e1110714c1be-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.532068 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8d188301-848c-4cf6-a204-e1110714c1be-server-conf\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.534022 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.534057 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-10c025d7-e381-4716-bf38-98f5cf86aede\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10c025d7-e381-4716-bf38-98f5cf86aede\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b5b057b78b5a76d291625b9af6af2e0e662115b1b100b445e2e40d0ac02a65c7/globalmount\"" pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.534787 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8d188301-848c-4cf6-a204-e1110714c1be-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.537580 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8d188301-848c-4cf6-a204-e1110714c1be-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: 
\"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.545284 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8d188301-848c-4cf6-a204-e1110714c1be-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.546051 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8d188301-848c-4cf6-a204-e1110714c1be-pod-info\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.549651 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ztgr\" (UniqueName: \"kubernetes.io/projected/8d188301-848c-4cf6-a204-e1110714c1be-kube-api-access-9ztgr\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.601258 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-10c025d7-e381-4716-bf38-98f5cf86aede\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-10c025d7-e381-4716-bf38-98f5cf86aede\") pod \"rabbitmq-server-2\" (UID: \"8d188301-848c-4cf6-a204-e1110714c1be\") " pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.655259 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 13 14:25:17 crc kubenswrapper[4898]: I0313 14:25:17.754764 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee084354-4d32-4d3c-96a4-1e4e7eef5d85" path="/var/lib/kubelet/pods/ee084354-4d32-4d3c-96a4-1e4e7eef5d85/volumes" Mar 13 14:25:21 crc kubenswrapper[4898]: I0313 14:25:21.739404 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" Mar 13 14:25:21 crc kubenswrapper[4898]: E0313 14:25:21.740386 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:25:21 crc kubenswrapper[4898]: I0313 14:25:21.892166 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="d56bd826-4f42-409d-ae41-9bfc70d1e038" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.133:5671: i/o timeout" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.581342 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.698574 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d56bd826-4f42-409d-ae41-9bfc70d1e038-config-data\") pod \"d56bd826-4f42-409d-ae41-9bfc70d1e038\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.704043 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d56bd826-4f42-409d-ae41-9bfc70d1e038-plugins-conf\") pod \"d56bd826-4f42-409d-ae41-9bfc70d1e038\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.704119 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-plugins\") pod \"d56bd826-4f42-409d-ae41-9bfc70d1e038\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.704243 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-confd\") pod \"d56bd826-4f42-409d-ae41-9bfc70d1e038\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.704283 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-tls\") pod \"d56bd826-4f42-409d-ae41-9bfc70d1e038\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.705111 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d\") pod \"d56bd826-4f42-409d-ae41-9bfc70d1e038\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.705167 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d56bd826-4f42-409d-ae41-9bfc70d1e038-server-conf\") pod \"d56bd826-4f42-409d-ae41-9bfc70d1e038\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.705227 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d56bd826-4f42-409d-ae41-9bfc70d1e038-erlang-cookie-secret\") pod \"d56bd826-4f42-409d-ae41-9bfc70d1e038\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.705267 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d56bd826-4f42-409d-ae41-9bfc70d1e038-pod-info\") pod \"d56bd826-4f42-409d-ae41-9bfc70d1e038\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.705293 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-erlang-cookie\") pod \"d56bd826-4f42-409d-ae41-9bfc70d1e038\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.705362 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xk5mq\" (UniqueName: \"kubernetes.io/projected/d56bd826-4f42-409d-ae41-9bfc70d1e038-kube-api-access-xk5mq\") pod \"d56bd826-4f42-409d-ae41-9bfc70d1e038\" (UID: \"d56bd826-4f42-409d-ae41-9bfc70d1e038\") " Mar 13 
14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.705897 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "d56bd826-4f42-409d-ae41-9bfc70d1e038" (UID: "d56bd826-4f42-409d-ae41-9bfc70d1e038"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.706198 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d56bd826-4f42-409d-ae41-9bfc70d1e038-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "d56bd826-4f42-409d-ae41-9bfc70d1e038" (UID: "d56bd826-4f42-409d-ae41-9bfc70d1e038"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.706727 4898 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d56bd826-4f42-409d-ae41-9bfc70d1e038-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.706755 4898 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.715076 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/d56bd826-4f42-409d-ae41-9bfc70d1e038-pod-info" (OuterVolumeSpecName: "pod-info") pod "d56bd826-4f42-409d-ae41-9bfc70d1e038" (UID: "d56bd826-4f42-409d-ae41-9bfc70d1e038"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.718014 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d56bd826-4f42-409d-ae41-9bfc70d1e038-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "d56bd826-4f42-409d-ae41-9bfc70d1e038" (UID: "d56bd826-4f42-409d-ae41-9bfc70d1e038"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.726337 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d56bd826-4f42-409d-ae41-9bfc70d1e038-kube-api-access-xk5mq" (OuterVolumeSpecName: "kube-api-access-xk5mq") pod "d56bd826-4f42-409d-ae41-9bfc70d1e038" (UID: "d56bd826-4f42-409d-ae41-9bfc70d1e038"). InnerVolumeSpecName "kube-api-access-xk5mq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.726385 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "d56bd826-4f42-409d-ae41-9bfc70d1e038" (UID: "d56bd826-4f42-409d-ae41-9bfc70d1e038"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.740586 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "d56bd826-4f42-409d-ae41-9bfc70d1e038" (UID: "d56bd826-4f42-409d-ae41-9bfc70d1e038"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.748169 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d" (OuterVolumeSpecName: "persistence") pod "d56bd826-4f42-409d-ae41-9bfc70d1e038" (UID: "d56bd826-4f42-409d-ae41-9bfc70d1e038"). InnerVolumeSpecName "pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.780647 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d56bd826-4f42-409d-ae41-9bfc70d1e038-config-data" (OuterVolumeSpecName: "config-data") pod "d56bd826-4f42-409d-ae41-9bfc70d1e038" (UID: "d56bd826-4f42-409d-ae41-9bfc70d1e038"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.809058 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xk5mq\" (UniqueName: \"kubernetes.io/projected/d56bd826-4f42-409d-ae41-9bfc70d1e038-kube-api-access-xk5mq\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.809099 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d56bd826-4f42-409d-ae41-9bfc70d1e038-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.809111 4898 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.809146 4898 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d\") on node \"crc\" " Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.809162 4898 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d56bd826-4f42-409d-ae41-9bfc70d1e038-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.809176 4898 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d56bd826-4f42-409d-ae41-9bfc70d1e038-pod-info\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.809189 4898 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.822583 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d56bd826-4f42-409d-ae41-9bfc70d1e038-server-conf" (OuterVolumeSpecName: "server-conf") pod "d56bd826-4f42-409d-ae41-9bfc70d1e038" (UID: "d56bd826-4f42-409d-ae41-9bfc70d1e038"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.875843 4898 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.875998 4898 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d") on node "crc" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.901512 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "d56bd826-4f42-409d-ae41-9bfc70d1e038" (UID: "d56bd826-4f42-409d-ae41-9bfc70d1e038"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.910913 4898 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d56bd826-4f42-409d-ae41-9bfc70d1e038-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.910945 4898 reconciler_common.go:293] "Volume detached for volume \"pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:23 crc kubenswrapper[4898]: I0313 14:25:23.910959 4898 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d56bd826-4f42-409d-ae41-9bfc70d1e038-server-conf\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:24 crc kubenswrapper[4898]: E0313 14:25:24.297035 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested" Mar 13 14:25:24 crc kubenswrapper[4898]: E0313 14:25:24.297319 4898 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = 
Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested" Mar 13 14:25:24 crc kubenswrapper[4898]: E0313 14:25:24.297459 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z25rd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL 
MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-kxtcf_openstack(2cd78a2a-1bb4-461a-92cd-d705080b087a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 14:25:24 crc kubenswrapper[4898]: E0313 14:25:24.298663 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-kxtcf" podUID="2cd78a2a-1bb4-461a-92cd-d705080b087a" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.348788 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d56bd826-4f42-409d-ae41-9bfc70d1e038","Type":"ContainerDied","Data":"8cd6b4a73f7f67c36783e2cd3de871dd93389c4f889e74a44de4a7253a7e9a9c"} Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.348843 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.367270 4898 scope.go:117] "RemoveContainer" containerID="319d11416db34d4c2bde21b35bf9b79fc6c55b22cfe14271a9be5dde11f3c078" Mar 13 14:25:24 crc kubenswrapper[4898]: E0313 14:25:24.371740 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested\\\"\"" pod="openstack/heat-db-sync-kxtcf" podUID="2cd78a2a-1bb4-461a-92cd-d705080b087a" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.411064 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.440926 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.459897 4898 scope.go:117] "RemoveContainer" containerID="d377b62f42012aae1789077dde2b4c09f8f770f73f941f01fe11eb21f5b88378" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.469690 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 14:25:24 crc kubenswrapper[4898]: E0313 14:25:24.470595 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d56bd826-4f42-409d-ae41-9bfc70d1e038" containerName="setup-container" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.470750 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="d56bd826-4f42-409d-ae41-9bfc70d1e038" containerName="setup-container" Mar 13 14:25:24 crc kubenswrapper[4898]: E0313 14:25:24.470801 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d56bd826-4f42-409d-ae41-9bfc70d1e038" containerName="rabbitmq" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.470809 4898 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="d56bd826-4f42-409d-ae41-9bfc70d1e038" containerName="rabbitmq" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.471074 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="d56bd826-4f42-409d-ae41-9bfc70d1e038" containerName="rabbitmq" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.472386 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.475848 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.476731 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.477113 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.477183 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.477121 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.477315 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-4m6nk" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.477512 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.482756 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.636217 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.636585 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.636645 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.636673 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.636745 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.636798 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.636828 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.636887 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.636951 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.637084 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.637114 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f42nr\" (UniqueName: \"kubernetes.io/projected/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-kube-api-access-f42nr\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.728631 4898 scope.go:117] "RemoveContainer" containerID="cb002d235371a7e7beebe07dd448307d31c6dae66e8fbd1dd6c0c499e634cca9" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.741287 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.741448 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.741490 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f42nr\" (UniqueName: \"kubernetes.io/projected/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-kube-api-access-f42nr\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.741590 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.741654 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.741701 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.741728 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.741792 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.741840 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 
crc kubenswrapper[4898]: I0313 14:25:24.741873 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.741970 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.746494 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.747372 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.747509 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3111f327615e010747f22a13f9378eff3b7d96c403da97ea4361402b1c85d196/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.765855 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.766134 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.771103 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.773585 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-rabbitmq-plugins\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.783702 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.783923 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.785609 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f42nr\" (UniqueName: \"kubernetes.io/projected/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-kube-api-access-f42nr\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.785806 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.786045 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc 
kubenswrapper[4898]: I0313 14:25:24.823685 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e3ef05d8-9def-4ed4-8424-3de1bece7b2d\") pod \"rabbitmq-cell1-server-0\" (UID: \"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.863312 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:25:24 crc kubenswrapper[4898]: I0313 14:25:24.978838 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-r4rvf"] Mar 13 14:25:25 crc kubenswrapper[4898]: I0313 14:25:25.007732 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 13 14:25:25 crc kubenswrapper[4898]: I0313 14:25:25.376114 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 14:25:25 crc kubenswrapper[4898]: I0313 14:25:25.376493 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"8d188301-848c-4cf6-a204-e1110714c1be","Type":"ContainerStarted","Data":"d6acba7cd4117378d7a97b387783250322356fae61cc48751a89151539d61d29"} Mar 13 14:25:25 crc kubenswrapper[4898]: I0313 14:25:25.378292 4898 generic.go:334] "Generic (PLEG): container finished" podID="703503be-2f03-4e95-b4ba-ebdd30b717ee" containerID="ab7dee171473df88511004b0f9cd06f3de427bbb59dba778bc1dbd3f8e29abeb" exitCode=0 Mar 13 14:25:25 crc kubenswrapper[4898]: I0313 14:25:25.378328 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" event={"ID":"703503be-2f03-4e95-b4ba-ebdd30b717ee","Type":"ContainerDied","Data":"ab7dee171473df88511004b0f9cd06f3de427bbb59dba778bc1dbd3f8e29abeb"} Mar 13 14:25:25 crc kubenswrapper[4898]: I0313 14:25:25.378367 4898 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" event={"ID":"703503be-2f03-4e95-b4ba-ebdd30b717ee","Type":"ContainerStarted","Data":"156e4c323450d64b57cc91d4cd576fcfcc3344435ba7b3b650ea24a1251763ee"} Mar 13 14:25:25 crc kubenswrapper[4898]: I0313 14:25:25.384650 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02f7d483-aecb-4a39-babc-6d9598090c4b","Type":"ContainerStarted","Data":"7ce3438fd9d4e3db5da0a65bc2744dcbc537f4c6c98b27b1433e8ed964fd1ed3"} Mar 13 14:25:25 crc kubenswrapper[4898]: I0313 14:25:25.759954 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d56bd826-4f42-409d-ae41-9bfc70d1e038" path="/var/lib/kubelet/pods/d56bd826-4f42-409d-ae41-9bfc70d1e038/volumes" Mar 13 14:25:26 crc kubenswrapper[4898]: I0313 14:25:26.399582 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835","Type":"ContainerStarted","Data":"e8857c4721e5f8ce1de5ec7a35488e4664a881af5e5f3ad6d2772e453cd83c85"} Mar 13 14:25:26 crc kubenswrapper[4898]: I0313 14:25:26.402025 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" event={"ID":"703503be-2f03-4e95-b4ba-ebdd30b717ee","Type":"ContainerStarted","Data":"0b355d36acba975b937e3513e16f6c9b056100e8a7cb8f4e84d596f542de18b5"} Mar 13 14:25:26 crc kubenswrapper[4898]: I0313 14:25:26.402229 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:26 crc kubenswrapper[4898]: I0313 14:25:26.404115 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02f7d483-aecb-4a39-babc-6d9598090c4b","Type":"ContainerStarted","Data":"b4732cf2586a8c63bfd4f4a4eb216ad6c43d632d96ee4b66b2126f4895cf7dd1"} Mar 13 14:25:26 crc kubenswrapper[4898]: I0313 14:25:26.426953 4898 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" podStartSLOduration=13.426926024 podStartE2EDuration="13.426926024s" podCreationTimestamp="2026-03-13 14:25:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:25:26.420845365 +0000 UTC m=+1761.422433654" watchObservedRunningTime="2026-03-13 14:25:26.426926024 +0000 UTC m=+1761.428514263" Mar 13 14:25:27 crc kubenswrapper[4898]: I0313 14:25:27.416633 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"8d188301-848c-4cf6-a204-e1110714c1be","Type":"ContainerStarted","Data":"0d8797262833812626f4d3e0e1db3d064a9feac6dd8c4aab149c760269a9a573"} Mar 13 14:25:27 crc kubenswrapper[4898]: I0313 14:25:27.420090 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02f7d483-aecb-4a39-babc-6d9598090c4b","Type":"ContainerStarted","Data":"20a7d66d4eadaac13a3d1530dfffc36cf690cb176befcda52b573d0e1cd9e142"} Mar 13 14:25:27 crc kubenswrapper[4898]: I0313 14:25:27.423470 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835","Type":"ContainerStarted","Data":"6ac994a64cbced8d5ed2ad37e427a3eeb5d4669d67bcb7a943f6946233be58c4"} Mar 13 14:25:29 crc kubenswrapper[4898]: I0313 14:25:29.453285 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02f7d483-aecb-4a39-babc-6d9598090c4b","Type":"ContainerStarted","Data":"b4437a37a66416890e0b218d39962696089d044b5aa8cf8e7b428a548d5a2914"} Mar 13 14:25:29 crc kubenswrapper[4898]: I0313 14:25:29.454030 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 14:25:29 crc kubenswrapper[4898]: I0313 14:25:29.484822 4898 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.067253541 podStartE2EDuration="23.484791785s" podCreationTimestamp="2026-03-13 14:25:06 +0000 UTC" firstStartedPulling="2026-03-13 14:25:07.017676491 +0000 UTC m=+1742.019264740" lastFinishedPulling="2026-03-13 14:25:28.435214735 +0000 UTC m=+1763.436802984" observedRunningTime="2026-03-13 14:25:29.478071278 +0000 UTC m=+1764.479659607" watchObservedRunningTime="2026-03-13 14:25:29.484791785 +0000 UTC m=+1764.486380064" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.283044 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.377850 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-bsswg"] Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.378115 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" podUID="c1d23b78-5402-47e0-8af6-851fcc71be6b" containerName="dnsmasq-dns" containerID="cri-o://4e87cc2a9eb5ac3a94e1731b9ead6b0d6cdba2ef55c6b43916f80ca58fe1a32b" gracePeriod=10 Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.556323 4898 generic.go:334] "Generic (PLEG): container finished" podID="c1d23b78-5402-47e0-8af6-851fcc71be6b" containerID="4e87cc2a9eb5ac3a94e1731b9ead6b0d6cdba2ef55c6b43916f80ca58fe1a32b" exitCode=0 Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.556375 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" event={"ID":"c1d23b78-5402-47e0-8af6-851fcc71be6b","Type":"ContainerDied","Data":"4e87cc2a9eb5ac3a94e1731b9ead6b0d6cdba2ef55c6b43916f80ca58fe1a32b"} Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.611957 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f6df4f56c-k4ntr"] Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.614039 4898 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.631625 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f6df4f56c-k4ntr"] Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.735811 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjhl4\" (UniqueName: \"kubernetes.io/projected/dd51a575-1651-4891-941f-3e0fe447e81d-kube-api-access-hjhl4\") pod \"dnsmasq-dns-6f6df4f56c-k4ntr\" (UID: \"dd51a575-1651-4891-941f-3e0fe447e81d\") " pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.736116 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd51a575-1651-4891-941f-3e0fe447e81d-config\") pod \"dnsmasq-dns-6f6df4f56c-k4ntr\" (UID: \"dd51a575-1651-4891-941f-3e0fe447e81d\") " pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.736147 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd51a575-1651-4891-941f-3e0fe447e81d-dns-svc\") pod \"dnsmasq-dns-6f6df4f56c-k4ntr\" (UID: \"dd51a575-1651-4891-941f-3e0fe447e81d\") " pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.736239 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd51a575-1651-4891-941f-3e0fe447e81d-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6df4f56c-k4ntr\" (UID: \"dd51a575-1651-4891-941f-3e0fe447e81d\") " pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.736281 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/dd51a575-1651-4891-941f-3e0fe447e81d-openstack-edpm-ipam\") pod \"dnsmasq-dns-6f6df4f56c-k4ntr\" (UID: \"dd51a575-1651-4891-941f-3e0fe447e81d\") " pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.736307 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd51a575-1651-4891-941f-3e0fe447e81d-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6df4f56c-k4ntr\" (UID: \"dd51a575-1651-4891-941f-3e0fe447e81d\") " pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.736412 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dd51a575-1651-4891-941f-3e0fe447e81d-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6df4f56c-k4ntr\" (UID: \"dd51a575-1651-4891-941f-3e0fe447e81d\") " pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.840861 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjhl4\" (UniqueName: \"kubernetes.io/projected/dd51a575-1651-4891-941f-3e0fe447e81d-kube-api-access-hjhl4\") pod \"dnsmasq-dns-6f6df4f56c-k4ntr\" (UID: \"dd51a575-1651-4891-941f-3e0fe447e81d\") " pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.841308 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd51a575-1651-4891-941f-3e0fe447e81d-config\") pod \"dnsmasq-dns-6f6df4f56c-k4ntr\" (UID: \"dd51a575-1651-4891-941f-3e0fe447e81d\") " pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.841331 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd51a575-1651-4891-941f-3e0fe447e81d-dns-svc\") pod \"dnsmasq-dns-6f6df4f56c-k4ntr\" (UID: \"dd51a575-1651-4891-941f-3e0fe447e81d\") " pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.841399 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd51a575-1651-4891-941f-3e0fe447e81d-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6df4f56c-k4ntr\" (UID: \"dd51a575-1651-4891-941f-3e0fe447e81d\") " pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.841439 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/dd51a575-1651-4891-941f-3e0fe447e81d-openstack-edpm-ipam\") pod \"dnsmasq-dns-6f6df4f56c-k4ntr\" (UID: \"dd51a575-1651-4891-941f-3e0fe447e81d\") " pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.841456 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd51a575-1651-4891-941f-3e0fe447e81d-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6df4f56c-k4ntr\" (UID: \"dd51a575-1651-4891-941f-3e0fe447e81d\") " pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.841524 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dd51a575-1651-4891-941f-3e0fe447e81d-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6df4f56c-k4ntr\" (UID: \"dd51a575-1651-4891-941f-3e0fe447e81d\") " pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.842336 4898 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dd51a575-1651-4891-941f-3e0fe447e81d-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6df4f56c-k4ntr\" (UID: \"dd51a575-1651-4891-941f-3e0fe447e81d\") " pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.843076 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd51a575-1651-4891-941f-3e0fe447e81d-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6df4f56c-k4ntr\" (UID: \"dd51a575-1651-4891-941f-3e0fe447e81d\") " pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.843265 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/dd51a575-1651-4891-941f-3e0fe447e81d-openstack-edpm-ipam\") pod \"dnsmasq-dns-6f6df4f56c-k4ntr\" (UID: \"dd51a575-1651-4891-941f-3e0fe447e81d\") " pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.843761 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd51a575-1651-4891-941f-3e0fe447e81d-config\") pod \"dnsmasq-dns-6f6df4f56c-k4ntr\" (UID: \"dd51a575-1651-4891-941f-3e0fe447e81d\") " pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.845378 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd51a575-1651-4891-941f-3e0fe447e81d-dns-svc\") pod \"dnsmasq-dns-6f6df4f56c-k4ntr\" (UID: \"dd51a575-1651-4891-941f-3e0fe447e81d\") " pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.845474 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/dd51a575-1651-4891-941f-3e0fe447e81d-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6df4f56c-k4ntr\" (UID: \"dd51a575-1651-4891-941f-3e0fe447e81d\") " pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.874532 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjhl4\" (UniqueName: \"kubernetes.io/projected/dd51a575-1651-4891-941f-3e0fe447e81d-kube-api-access-hjhl4\") pod \"dnsmasq-dns-6f6df4f56c-k4ntr\" (UID: \"dd51a575-1651-4891-941f-3e0fe447e81d\") " pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:34 crc kubenswrapper[4898]: I0313 14:25:34.942594 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.299478 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.393990 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zg4fs\" (UniqueName: \"kubernetes.io/projected/c1d23b78-5402-47e0-8af6-851fcc71be6b-kube-api-access-zg4fs\") pod \"c1d23b78-5402-47e0-8af6-851fcc71be6b\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.394080 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-config\") pod \"c1d23b78-5402-47e0-8af6-851fcc71be6b\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.394147 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-dns-svc\") pod \"c1d23b78-5402-47e0-8af6-851fcc71be6b\" (UID: 
\"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.394272 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-dns-swift-storage-0\") pod \"c1d23b78-5402-47e0-8af6-851fcc71be6b\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.394397 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-ovsdbserver-sb\") pod \"c1d23b78-5402-47e0-8af6-851fcc71be6b\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.394461 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-ovsdbserver-nb\") pod \"c1d23b78-5402-47e0-8af6-851fcc71be6b\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.419229 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1d23b78-5402-47e0-8af6-851fcc71be6b-kube-api-access-zg4fs" (OuterVolumeSpecName: "kube-api-access-zg4fs") pod "c1d23b78-5402-47e0-8af6-851fcc71be6b" (UID: "c1d23b78-5402-47e0-8af6-851fcc71be6b"). InnerVolumeSpecName "kube-api-access-zg4fs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.495085 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c1d23b78-5402-47e0-8af6-851fcc71be6b" (UID: "c1d23b78-5402-47e0-8af6-851fcc71be6b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.496358 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-ovsdbserver-sb\") pod \"c1d23b78-5402-47e0-8af6-851fcc71be6b\" (UID: \"c1d23b78-5402-47e0-8af6-851fcc71be6b\") " Mar 13 14:25:35 crc kubenswrapper[4898]: W0313 14:25:35.496518 4898 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/c1d23b78-5402-47e0-8af6-851fcc71be6b/volumes/kubernetes.io~configmap/ovsdbserver-sb Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.496552 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c1d23b78-5402-47e0-8af6-851fcc71be6b" (UID: "c1d23b78-5402-47e0-8af6-851fcc71be6b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.497254 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.497276 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zg4fs\" (UniqueName: \"kubernetes.io/projected/c1d23b78-5402-47e0-8af6-851fcc71be6b-kube-api-access-zg4fs\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.497751 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c1d23b78-5402-47e0-8af6-851fcc71be6b" (UID: "c1d23b78-5402-47e0-8af6-851fcc71be6b"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.502324 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c1d23b78-5402-47e0-8af6-851fcc71be6b" (UID: "c1d23b78-5402-47e0-8af6-851fcc71be6b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.503602 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c1d23b78-5402-47e0-8af6-851fcc71be6b" (UID: "c1d23b78-5402-47e0-8af6-851fcc71be6b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.519022 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-config" (OuterVolumeSpecName: "config") pod "c1d23b78-5402-47e0-8af6-851fcc71be6b" (UID: "c1d23b78-5402-47e0-8af6-851fcc71be6b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.568852 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" event={"ID":"c1d23b78-5402-47e0-8af6-851fcc71be6b","Type":"ContainerDied","Data":"0c1b795a0a1b5e26820a1d2e36ef2d08e255211a4eb9dbe6c50123bc0c988977"} Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.568923 4898 scope.go:117] "RemoveContainer" containerID="4e87cc2a9eb5ac3a94e1731b9ead6b0d6cdba2ef55c6b43916f80ca58fe1a32b" Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.568966 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-bsswg" Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.600074 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.600107 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.600118 4898 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.600131 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c1d23b78-5402-47e0-8af6-851fcc71be6b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.647407 4898 scope.go:117] "RemoveContainer" containerID="7bda3f3e696b51ddd844a86367f80dfd5aee6675d49dc2996a124d70519d4d20" Mar 13 14:25:35 crc 
kubenswrapper[4898]: I0313 14:25:35.648430 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-bsswg"] Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.660765 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-bsswg"] Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.703019 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f6df4f56c-k4ntr"] Mar 13 14:25:35 crc kubenswrapper[4898]: I0313 14:25:35.769108 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1d23b78-5402-47e0-8af6-851fcc71be6b" path="/var/lib/kubelet/pods/c1d23b78-5402-47e0-8af6-851fcc71be6b/volumes" Mar 13 14:25:36 crc kubenswrapper[4898]: I0313 14:25:36.584770 4898 generic.go:334] "Generic (PLEG): container finished" podID="dd51a575-1651-4891-941f-3e0fe447e81d" containerID="30c39b4868287edbf1ca5987a81c4b7f0ae4a1a7b5fc27bf801a7c4d60e0b3b8" exitCode=0 Mar 13 14:25:36 crc kubenswrapper[4898]: I0313 14:25:36.584826 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" event={"ID":"dd51a575-1651-4891-941f-3e0fe447e81d","Type":"ContainerDied","Data":"30c39b4868287edbf1ca5987a81c4b7f0ae4a1a7b5fc27bf801a7c4d60e0b3b8"} Mar 13 14:25:36 crc kubenswrapper[4898]: I0313 14:25:36.585309 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" event={"ID":"dd51a575-1651-4891-941f-3e0fe447e81d","Type":"ContainerStarted","Data":"7bc470273f1aa147ee917555d3600cdc1fb0566974c6cb84da3c33025400b8dd"} Mar 13 14:25:36 crc kubenswrapper[4898]: I0313 14:25:36.739995 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" Mar 13 14:25:36 crc kubenswrapper[4898]: E0313 14:25:36.741539 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:25:37 crc kubenswrapper[4898]: I0313 14:25:37.598344 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" event={"ID":"dd51a575-1651-4891-941f-3e0fe447e81d","Type":"ContainerStarted","Data":"b16ba020ad81204120e532385217b9c5822d096a3364d32b3184ff5caa745829"} Mar 13 14:25:37 crc kubenswrapper[4898]: I0313 14:25:37.598590 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:37 crc kubenswrapper[4898]: I0313 14:25:37.600966 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-kxtcf" event={"ID":"2cd78a2a-1bb4-461a-92cd-d705080b087a","Type":"ContainerStarted","Data":"797b7877d34540edd204a0c5f49e93f47ceac114fc9a4ba968964ef3cae04ffa"} Mar 13 14:25:37 crc kubenswrapper[4898]: I0313 14:25:37.648504 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" podStartSLOduration=3.64848231 podStartE2EDuration="3.64848231s" podCreationTimestamp="2026-03-13 14:25:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:25:37.636936806 +0000 UTC m=+1772.638525045" watchObservedRunningTime="2026-03-13 14:25:37.64848231 +0000 UTC m=+1772.650070549" Mar 13 14:25:37 crc kubenswrapper[4898]: I0313 14:25:37.670039 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-kxtcf" podStartSLOduration=2.105986108 podStartE2EDuration="40.670023916s" podCreationTimestamp="2026-03-13 14:24:57 +0000 UTC" firstStartedPulling="2026-03-13 14:24:58.37560807 +0000 
UTC m=+1733.377196309" lastFinishedPulling="2026-03-13 14:25:36.939645878 +0000 UTC m=+1771.941234117" observedRunningTime="2026-03-13 14:25:37.662443487 +0000 UTC m=+1772.664031736" watchObservedRunningTime="2026-03-13 14:25:37.670023916 +0000 UTC m=+1772.671612155" Mar 13 14:25:39 crc kubenswrapper[4898]: I0313 14:25:39.629618 4898 generic.go:334] "Generic (PLEG): container finished" podID="2cd78a2a-1bb4-461a-92cd-d705080b087a" containerID="797b7877d34540edd204a0c5f49e93f47ceac114fc9a4ba968964ef3cae04ffa" exitCode=0 Mar 13 14:25:39 crc kubenswrapper[4898]: I0313 14:25:39.629689 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-kxtcf" event={"ID":"2cd78a2a-1bb4-461a-92cd-d705080b087a","Type":"ContainerDied","Data":"797b7877d34540edd204a0c5f49e93f47ceac114fc9a4ba968964ef3cae04ffa"} Mar 13 14:25:41 crc kubenswrapper[4898]: I0313 14:25:41.144860 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-kxtcf" Mar 13 14:25:41 crc kubenswrapper[4898]: I0313 14:25:41.248777 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z25rd\" (UniqueName: \"kubernetes.io/projected/2cd78a2a-1bb4-461a-92cd-d705080b087a-kube-api-access-z25rd\") pod \"2cd78a2a-1bb4-461a-92cd-d705080b087a\" (UID: \"2cd78a2a-1bb4-461a-92cd-d705080b087a\") " Mar 13 14:25:41 crc kubenswrapper[4898]: I0313 14:25:41.249013 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd78a2a-1bb4-461a-92cd-d705080b087a-combined-ca-bundle\") pod \"2cd78a2a-1bb4-461a-92cd-d705080b087a\" (UID: \"2cd78a2a-1bb4-461a-92cd-d705080b087a\") " Mar 13 14:25:41 crc kubenswrapper[4898]: I0313 14:25:41.249042 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd78a2a-1bb4-461a-92cd-d705080b087a-config-data\") pod 
\"2cd78a2a-1bb4-461a-92cd-d705080b087a\" (UID: \"2cd78a2a-1bb4-461a-92cd-d705080b087a\") " Mar 13 14:25:41 crc kubenswrapper[4898]: I0313 14:25:41.255328 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cd78a2a-1bb4-461a-92cd-d705080b087a-kube-api-access-z25rd" (OuterVolumeSpecName: "kube-api-access-z25rd") pod "2cd78a2a-1bb4-461a-92cd-d705080b087a" (UID: "2cd78a2a-1bb4-461a-92cd-d705080b087a"). InnerVolumeSpecName "kube-api-access-z25rd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:25:41 crc kubenswrapper[4898]: I0313 14:25:41.285681 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd78a2a-1bb4-461a-92cd-d705080b087a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2cd78a2a-1bb4-461a-92cd-d705080b087a" (UID: "2cd78a2a-1bb4-461a-92cd-d705080b087a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:25:41 crc kubenswrapper[4898]: I0313 14:25:41.349874 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd78a2a-1bb4-461a-92cd-d705080b087a-config-data" (OuterVolumeSpecName: "config-data") pod "2cd78a2a-1bb4-461a-92cd-d705080b087a" (UID: "2cd78a2a-1bb4-461a-92cd-d705080b087a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:25:41 crc kubenswrapper[4898]: I0313 14:25:41.352184 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z25rd\" (UniqueName: \"kubernetes.io/projected/2cd78a2a-1bb4-461a-92cd-d705080b087a-kube-api-access-z25rd\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:41 crc kubenswrapper[4898]: I0313 14:25:41.352221 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd78a2a-1bb4-461a-92cd-d705080b087a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:41 crc kubenswrapper[4898]: I0313 14:25:41.352235 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd78a2a-1bb4-461a-92cd-d705080b087a-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:41 crc kubenswrapper[4898]: I0313 14:25:41.657762 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-kxtcf" event={"ID":"2cd78a2a-1bb4-461a-92cd-d705080b087a","Type":"ContainerDied","Data":"f0ef8052f16886ece221ecf56528cf884da231c4fa187db604454c6c5925f956"} Mar 13 14:25:41 crc kubenswrapper[4898]: I0313 14:25:41.658141 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0ef8052f16886ece221ecf56528cf884da231c4fa187db604454c6c5925f956" Mar 13 14:25:41 crc kubenswrapper[4898]: I0313 14:25:41.658065 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-kxtcf" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.705718 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-6b446d7755-5724r"] Mar 13 14:25:42 crc kubenswrapper[4898]: E0313 14:25:42.706350 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1d23b78-5402-47e0-8af6-851fcc71be6b" containerName="dnsmasq-dns" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.706367 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1d23b78-5402-47e0-8af6-851fcc71be6b" containerName="dnsmasq-dns" Mar 13 14:25:42 crc kubenswrapper[4898]: E0313 14:25:42.706402 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd78a2a-1bb4-461a-92cd-d705080b087a" containerName="heat-db-sync" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.706414 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd78a2a-1bb4-461a-92cd-d705080b087a" containerName="heat-db-sync" Mar 13 14:25:42 crc kubenswrapper[4898]: E0313 14:25:42.706434 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1d23b78-5402-47e0-8af6-851fcc71be6b" containerName="init" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.706446 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1d23b78-5402-47e0-8af6-851fcc71be6b" containerName="init" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.706788 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cd78a2a-1bb4-461a-92cd-d705080b087a" containerName="heat-db-sync" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.706817 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1d23b78-5402-47e0-8af6-851fcc71be6b" containerName="dnsmasq-dns" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.707916 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-6b446d7755-5724r" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.738862 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6b446d7755-5724r"] Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.789880 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-648cbb8b5f-4kb5b"] Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.791964 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-648cbb8b5f-4kb5b" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.807924 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-648cbb8b5f-4kb5b"] Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.828513 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-5df9b5999-7tt4b"] Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.830432 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-5df9b5999-7tt4b" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.848093 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5df9b5999-7tt4b"] Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.894579 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/739e9c4a-9843-4edf-a045-2f7ef8d15b5e-public-tls-certs\") pod \"heat-api-648cbb8b5f-4kb5b\" (UID: \"739e9c4a-9843-4edf-a045-2f7ef8d15b5e\") " pod="openstack/heat-api-648cbb8b5f-4kb5b" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.894653 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwzqp\" (UniqueName: \"kubernetes.io/projected/0f20ec1d-823e-4695-859e-bdc538e602d9-kube-api-access-fwzqp\") pod \"heat-engine-6b446d7755-5724r\" (UID: \"0f20ec1d-823e-4695-859e-bdc538e602d9\") " pod="openstack/heat-engine-6b446d7755-5724r" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.894708 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f20ec1d-823e-4695-859e-bdc538e602d9-combined-ca-bundle\") pod \"heat-engine-6b446d7755-5724r\" (UID: \"0f20ec1d-823e-4695-859e-bdc538e602d9\") " pod="openstack/heat-engine-6b446d7755-5724r" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.894745 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f20ec1d-823e-4695-859e-bdc538e602d9-config-data\") pod \"heat-engine-6b446d7755-5724r\" (UID: \"0f20ec1d-823e-4695-859e-bdc538e602d9\") " pod="openstack/heat-engine-6b446d7755-5724r" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.894786 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/739e9c4a-9843-4edf-a045-2f7ef8d15b5e-config-data\") pod \"heat-api-648cbb8b5f-4kb5b\" (UID: \"739e9c4a-9843-4edf-a045-2f7ef8d15b5e\") " pod="openstack/heat-api-648cbb8b5f-4kb5b" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.894801 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/739e9c4a-9843-4edf-a045-2f7ef8d15b5e-combined-ca-bundle\") pod \"heat-api-648cbb8b5f-4kb5b\" (UID: \"739e9c4a-9843-4edf-a045-2f7ef8d15b5e\") " pod="openstack/heat-api-648cbb8b5f-4kb5b" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.894891 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f20ec1d-823e-4695-859e-bdc538e602d9-config-data-custom\") pod \"heat-engine-6b446d7755-5724r\" (UID: \"0f20ec1d-823e-4695-859e-bdc538e602d9\") " pod="openstack/heat-engine-6b446d7755-5724r" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.894981 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/739e9c4a-9843-4edf-a045-2f7ef8d15b5e-internal-tls-certs\") pod \"heat-api-648cbb8b5f-4kb5b\" (UID: \"739e9c4a-9843-4edf-a045-2f7ef8d15b5e\") " pod="openstack/heat-api-648cbb8b5f-4kb5b" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.895021 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrjvx\" (UniqueName: \"kubernetes.io/projected/739e9c4a-9843-4edf-a045-2f7ef8d15b5e-kube-api-access-jrjvx\") pod \"heat-api-648cbb8b5f-4kb5b\" (UID: \"739e9c4a-9843-4edf-a045-2f7ef8d15b5e\") " pod="openstack/heat-api-648cbb8b5f-4kb5b" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 
14:25:42.895114 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/739e9c4a-9843-4edf-a045-2f7ef8d15b5e-config-data-custom\") pod \"heat-api-648cbb8b5f-4kb5b\" (UID: \"739e9c4a-9843-4edf-a045-2f7ef8d15b5e\") " pod="openstack/heat-api-648cbb8b5f-4kb5b" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.997515 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwh8g\" (UniqueName: \"kubernetes.io/projected/03c552ae-5860-4468-a612-7af3d3587df4-kube-api-access-xwh8g\") pod \"heat-cfnapi-5df9b5999-7tt4b\" (UID: \"03c552ae-5860-4468-a612-7af3d3587df4\") " pod="openstack/heat-cfnapi-5df9b5999-7tt4b" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.997586 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f20ec1d-823e-4695-859e-bdc538e602d9-config-data-custom\") pod \"heat-engine-6b446d7755-5724r\" (UID: \"0f20ec1d-823e-4695-859e-bdc538e602d9\") " pod="openstack/heat-engine-6b446d7755-5724r" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.997761 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/739e9c4a-9843-4edf-a045-2f7ef8d15b5e-internal-tls-certs\") pod \"heat-api-648cbb8b5f-4kb5b\" (UID: \"739e9c4a-9843-4edf-a045-2f7ef8d15b5e\") " pod="openstack/heat-api-648cbb8b5f-4kb5b" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.997854 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03c552ae-5860-4468-a612-7af3d3587df4-internal-tls-certs\") pod \"heat-cfnapi-5df9b5999-7tt4b\" (UID: \"03c552ae-5860-4468-a612-7af3d3587df4\") " pod="openstack/heat-cfnapi-5df9b5999-7tt4b" Mar 13 14:25:42 crc 
kubenswrapper[4898]: I0313 14:25:42.997943 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrjvx\" (UniqueName: \"kubernetes.io/projected/739e9c4a-9843-4edf-a045-2f7ef8d15b5e-kube-api-access-jrjvx\") pod \"heat-api-648cbb8b5f-4kb5b\" (UID: \"739e9c4a-9843-4edf-a045-2f7ef8d15b5e\") " pod="openstack/heat-api-648cbb8b5f-4kb5b" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.998034 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03c552ae-5860-4468-a612-7af3d3587df4-config-data\") pod \"heat-cfnapi-5df9b5999-7tt4b\" (UID: \"03c552ae-5860-4468-a612-7af3d3587df4\") " pod="openstack/heat-cfnapi-5df9b5999-7tt4b" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.998154 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03c552ae-5860-4468-a612-7af3d3587df4-public-tls-certs\") pod \"heat-cfnapi-5df9b5999-7tt4b\" (UID: \"03c552ae-5860-4468-a612-7af3d3587df4\") " pod="openstack/heat-cfnapi-5df9b5999-7tt4b" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.998194 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/739e9c4a-9843-4edf-a045-2f7ef8d15b5e-config-data-custom\") pod \"heat-api-648cbb8b5f-4kb5b\" (UID: \"739e9c4a-9843-4edf-a045-2f7ef8d15b5e\") " pod="openstack/heat-api-648cbb8b5f-4kb5b" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.998322 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/739e9c4a-9843-4edf-a045-2f7ef8d15b5e-public-tls-certs\") pod \"heat-api-648cbb8b5f-4kb5b\" (UID: \"739e9c4a-9843-4edf-a045-2f7ef8d15b5e\") " pod="openstack/heat-api-648cbb8b5f-4kb5b" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 
14:25:42.998438 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwzqp\" (UniqueName: \"kubernetes.io/projected/0f20ec1d-823e-4695-859e-bdc538e602d9-kube-api-access-fwzqp\") pod \"heat-engine-6b446d7755-5724r\" (UID: \"0f20ec1d-823e-4695-859e-bdc538e602d9\") " pod="openstack/heat-engine-6b446d7755-5724r" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.998474 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03c552ae-5860-4468-a612-7af3d3587df4-combined-ca-bundle\") pod \"heat-cfnapi-5df9b5999-7tt4b\" (UID: \"03c552ae-5860-4468-a612-7af3d3587df4\") " pod="openstack/heat-cfnapi-5df9b5999-7tt4b" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.998604 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f20ec1d-823e-4695-859e-bdc538e602d9-combined-ca-bundle\") pod \"heat-engine-6b446d7755-5724r\" (UID: \"0f20ec1d-823e-4695-859e-bdc538e602d9\") " pod="openstack/heat-engine-6b446d7755-5724r" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.998643 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03c552ae-5860-4468-a612-7af3d3587df4-config-data-custom\") pod \"heat-cfnapi-5df9b5999-7tt4b\" (UID: \"03c552ae-5860-4468-a612-7af3d3587df4\") " pod="openstack/heat-cfnapi-5df9b5999-7tt4b" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.998788 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f20ec1d-823e-4695-859e-bdc538e602d9-config-data\") pod \"heat-engine-6b446d7755-5724r\" (UID: \"0f20ec1d-823e-4695-859e-bdc538e602d9\") " pod="openstack/heat-engine-6b446d7755-5724r" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 
14:25:42.998888 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/739e9c4a-9843-4edf-a045-2f7ef8d15b5e-config-data\") pod \"heat-api-648cbb8b5f-4kb5b\" (UID: \"739e9c4a-9843-4edf-a045-2f7ef8d15b5e\") " pod="openstack/heat-api-648cbb8b5f-4kb5b" Mar 13 14:25:42 crc kubenswrapper[4898]: I0313 14:25:42.998924 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/739e9c4a-9843-4edf-a045-2f7ef8d15b5e-combined-ca-bundle\") pod \"heat-api-648cbb8b5f-4kb5b\" (UID: \"739e9c4a-9843-4edf-a045-2f7ef8d15b5e\") " pod="openstack/heat-api-648cbb8b5f-4kb5b" Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.003311 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/739e9c4a-9843-4edf-a045-2f7ef8d15b5e-config-data\") pod \"heat-api-648cbb8b5f-4kb5b\" (UID: \"739e9c4a-9843-4edf-a045-2f7ef8d15b5e\") " pod="openstack/heat-api-648cbb8b5f-4kb5b" Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.003679 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/739e9c4a-9843-4edf-a045-2f7ef8d15b5e-public-tls-certs\") pod \"heat-api-648cbb8b5f-4kb5b\" (UID: \"739e9c4a-9843-4edf-a045-2f7ef8d15b5e\") " pod="openstack/heat-api-648cbb8b5f-4kb5b" Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.003800 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/739e9c4a-9843-4edf-a045-2f7ef8d15b5e-config-data-custom\") pod \"heat-api-648cbb8b5f-4kb5b\" (UID: \"739e9c4a-9843-4edf-a045-2f7ef8d15b5e\") " pod="openstack/heat-api-648cbb8b5f-4kb5b" Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.004032 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/0f20ec1d-823e-4695-859e-bdc538e602d9-config-data-custom\") pod \"heat-engine-6b446d7755-5724r\" (UID: \"0f20ec1d-823e-4695-859e-bdc538e602d9\") " pod="openstack/heat-engine-6b446d7755-5724r" Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.004835 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/739e9c4a-9843-4edf-a045-2f7ef8d15b5e-combined-ca-bundle\") pod \"heat-api-648cbb8b5f-4kb5b\" (UID: \"739e9c4a-9843-4edf-a045-2f7ef8d15b5e\") " pod="openstack/heat-api-648cbb8b5f-4kb5b" Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.005524 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f20ec1d-823e-4695-859e-bdc538e602d9-config-data\") pod \"heat-engine-6b446d7755-5724r\" (UID: \"0f20ec1d-823e-4695-859e-bdc538e602d9\") " pod="openstack/heat-engine-6b446d7755-5724r" Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.006554 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f20ec1d-823e-4695-859e-bdc538e602d9-combined-ca-bundle\") pod \"heat-engine-6b446d7755-5724r\" (UID: \"0f20ec1d-823e-4695-859e-bdc538e602d9\") " pod="openstack/heat-engine-6b446d7755-5724r" Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.007106 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/739e9c4a-9843-4edf-a045-2f7ef8d15b5e-internal-tls-certs\") pod \"heat-api-648cbb8b5f-4kb5b\" (UID: \"739e9c4a-9843-4edf-a045-2f7ef8d15b5e\") " pod="openstack/heat-api-648cbb8b5f-4kb5b" Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.015486 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrjvx\" (UniqueName: 
\"kubernetes.io/projected/739e9c4a-9843-4edf-a045-2f7ef8d15b5e-kube-api-access-jrjvx\") pod \"heat-api-648cbb8b5f-4kb5b\" (UID: \"739e9c4a-9843-4edf-a045-2f7ef8d15b5e\") " pod="openstack/heat-api-648cbb8b5f-4kb5b" Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.019328 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwzqp\" (UniqueName: \"kubernetes.io/projected/0f20ec1d-823e-4695-859e-bdc538e602d9-kube-api-access-fwzqp\") pod \"heat-engine-6b446d7755-5724r\" (UID: \"0f20ec1d-823e-4695-859e-bdc538e602d9\") " pod="openstack/heat-engine-6b446d7755-5724r" Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.047161 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6b446d7755-5724r" Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.100519 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03c552ae-5860-4468-a612-7af3d3587df4-combined-ca-bundle\") pod \"heat-cfnapi-5df9b5999-7tt4b\" (UID: \"03c552ae-5860-4468-a612-7af3d3587df4\") " pod="openstack/heat-cfnapi-5df9b5999-7tt4b" Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.100574 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03c552ae-5860-4468-a612-7af3d3587df4-config-data-custom\") pod \"heat-cfnapi-5df9b5999-7tt4b\" (UID: \"03c552ae-5860-4468-a612-7af3d3587df4\") " pod="openstack/heat-cfnapi-5df9b5999-7tt4b" Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.100646 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwh8g\" (UniqueName: \"kubernetes.io/projected/03c552ae-5860-4468-a612-7af3d3587df4-kube-api-access-xwh8g\") pod \"heat-cfnapi-5df9b5999-7tt4b\" (UID: \"03c552ae-5860-4468-a612-7af3d3587df4\") " pod="openstack/heat-cfnapi-5df9b5999-7tt4b" Mar 13 14:25:43 crc 
kubenswrapper[4898]: I0313 14:25:43.100710 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03c552ae-5860-4468-a612-7af3d3587df4-internal-tls-certs\") pod \"heat-cfnapi-5df9b5999-7tt4b\" (UID: \"03c552ae-5860-4468-a612-7af3d3587df4\") " pod="openstack/heat-cfnapi-5df9b5999-7tt4b" Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.100749 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03c552ae-5860-4468-a612-7af3d3587df4-config-data\") pod \"heat-cfnapi-5df9b5999-7tt4b\" (UID: \"03c552ae-5860-4468-a612-7af3d3587df4\") " pod="openstack/heat-cfnapi-5df9b5999-7tt4b" Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.100778 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03c552ae-5860-4468-a612-7af3d3587df4-public-tls-certs\") pod \"heat-cfnapi-5df9b5999-7tt4b\" (UID: \"03c552ae-5860-4468-a612-7af3d3587df4\") " pod="openstack/heat-cfnapi-5df9b5999-7tt4b" Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.110313 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-648cbb8b5f-4kb5b" Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.110989 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03c552ae-5860-4468-a612-7af3d3587df4-public-tls-certs\") pod \"heat-cfnapi-5df9b5999-7tt4b\" (UID: \"03c552ae-5860-4468-a612-7af3d3587df4\") " pod="openstack/heat-cfnapi-5df9b5999-7tt4b" Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.123512 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03c552ae-5860-4468-a612-7af3d3587df4-internal-tls-certs\") pod \"heat-cfnapi-5df9b5999-7tt4b\" (UID: \"03c552ae-5860-4468-a612-7af3d3587df4\") " pod="openstack/heat-cfnapi-5df9b5999-7tt4b" Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.128981 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03c552ae-5860-4468-a612-7af3d3587df4-combined-ca-bundle\") pod \"heat-cfnapi-5df9b5999-7tt4b\" (UID: \"03c552ae-5860-4468-a612-7af3d3587df4\") " pod="openstack/heat-cfnapi-5df9b5999-7tt4b" Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.128986 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03c552ae-5860-4468-a612-7af3d3587df4-config-data\") pod \"heat-cfnapi-5df9b5999-7tt4b\" (UID: \"03c552ae-5860-4468-a612-7af3d3587df4\") " pod="openstack/heat-cfnapi-5df9b5999-7tt4b" Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.131117 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03c552ae-5860-4468-a612-7af3d3587df4-config-data-custom\") pod \"heat-cfnapi-5df9b5999-7tt4b\" (UID: \"03c552ae-5860-4468-a612-7af3d3587df4\") " pod="openstack/heat-cfnapi-5df9b5999-7tt4b" Mar 13 14:25:43 crc 
kubenswrapper[4898]: I0313 14:25:43.134129 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwh8g\" (UniqueName: \"kubernetes.io/projected/03c552ae-5860-4468-a612-7af3d3587df4-kube-api-access-xwh8g\") pod \"heat-cfnapi-5df9b5999-7tt4b\" (UID: \"03c552ae-5860-4468-a612-7af3d3587df4\") " pod="openstack/heat-cfnapi-5df9b5999-7tt4b" Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.148255 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5df9b5999-7tt4b" Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.634201 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6b446d7755-5724r"] Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.696098 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6b446d7755-5724r" event={"ID":"0f20ec1d-823e-4695-859e-bdc538e602d9","Type":"ContainerStarted","Data":"751319ce2a8b58f9c1a11c8fdc9a52a3f25980adb3450d0601d1a75bc662b822"} Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.781789 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-648cbb8b5f-4kb5b"] Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.783144 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 14:25:43 crc kubenswrapper[4898]: I0313 14:25:43.796956 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5df9b5999-7tt4b"] Mar 13 14:25:44 crc kubenswrapper[4898]: I0313 14:25:44.719453 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-648cbb8b5f-4kb5b" event={"ID":"739e9c4a-9843-4edf-a045-2f7ef8d15b5e","Type":"ContainerStarted","Data":"18c3266dabf2b505cbe40a224487d5d8f44b40eb5270476d38970908a109e756"} Mar 13 14:25:44 crc kubenswrapper[4898]: I0313 14:25:44.721244 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5df9b5999-7tt4b" 
event={"ID":"03c552ae-5860-4468-a612-7af3d3587df4","Type":"ContainerStarted","Data":"1c2cf2943d5fd39baea4d24bd14b6f6875fc9986a3f0e1ebc3fe279a301f3d4d"} Mar 13 14:25:44 crc kubenswrapper[4898]: I0313 14:25:44.723041 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6b446d7755-5724r" event={"ID":"0f20ec1d-823e-4695-859e-bdc538e602d9","Type":"ContainerStarted","Data":"e8252fcff7a9c8f96c118fb65e192dfd92a0dc9ac03dbb932eea7baa1e6ec887"} Mar 13 14:25:44 crc kubenswrapper[4898]: I0313 14:25:44.723206 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-6b446d7755-5724r" Mar 13 14:25:44 crc kubenswrapper[4898]: I0313 14:25:44.945724 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f6df4f56c-k4ntr" Mar 13 14:25:44 crc kubenswrapper[4898]: I0313 14:25:44.967250 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-6b446d7755-5724r" podStartSLOduration=2.967230565 podStartE2EDuration="2.967230565s" podCreationTimestamp="2026-03-13 14:25:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:25:44.746436941 +0000 UTC m=+1779.748025210" watchObservedRunningTime="2026-03-13 14:25:44.967230565 +0000 UTC m=+1779.968818804" Mar 13 14:25:45 crc kubenswrapper[4898]: I0313 14:25:45.067014 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-r4rvf"] Mar 13 14:25:45 crc kubenswrapper[4898]: I0313 14:25:45.067349 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" podUID="703503be-2f03-4e95-b4ba-ebdd30b717ee" containerName="dnsmasq-dns" containerID="cri-o://0b355d36acba975b937e3513e16f6c9b056100e8a7cb8f4e84d596f542de18b5" gracePeriod=10 Mar 13 14:25:45 crc kubenswrapper[4898]: I0313 14:25:45.738397 4898 
generic.go:334] "Generic (PLEG): container finished" podID="703503be-2f03-4e95-b4ba-ebdd30b717ee" containerID="0b355d36acba975b937e3513e16f6c9b056100e8a7cb8f4e84d596f542de18b5" exitCode=0 Mar 13 14:25:45 crc kubenswrapper[4898]: I0313 14:25:45.738415 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" event={"ID":"703503be-2f03-4e95-b4ba-ebdd30b717ee","Type":"ContainerDied","Data":"0b355d36acba975b937e3513e16f6c9b056100e8a7cb8f4e84d596f542de18b5"} Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.619309 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.710169 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-ovsdbserver-nb\") pod \"703503be-2f03-4e95-b4ba-ebdd30b717ee\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.710479 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-openstack-edpm-ipam\") pod \"703503be-2f03-4e95-b4ba-ebdd30b717ee\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.710547 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-dns-swift-storage-0\") pod \"703503be-2f03-4e95-b4ba-ebdd30b717ee\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.710666 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-dns-svc\") pod \"703503be-2f03-4e95-b4ba-ebdd30b717ee\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.710699 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-ovsdbserver-sb\") pod \"703503be-2f03-4e95-b4ba-ebdd30b717ee\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.710821 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vs27f\" (UniqueName: \"kubernetes.io/projected/703503be-2f03-4e95-b4ba-ebdd30b717ee-kube-api-access-vs27f\") pod \"703503be-2f03-4e95-b4ba-ebdd30b717ee\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.710977 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-config\") pod \"703503be-2f03-4e95-b4ba-ebdd30b717ee\" (UID: \"703503be-2f03-4e95-b4ba-ebdd30b717ee\") " Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.721401 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/703503be-2f03-4e95-b4ba-ebdd30b717ee-kube-api-access-vs27f" (OuterVolumeSpecName: "kube-api-access-vs27f") pod "703503be-2f03-4e95-b4ba-ebdd30b717ee" (UID: "703503be-2f03-4e95-b4ba-ebdd30b717ee"). InnerVolumeSpecName "kube-api-access-vs27f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.782941 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" event={"ID":"703503be-2f03-4e95-b4ba-ebdd30b717ee","Type":"ContainerDied","Data":"156e4c323450d64b57cc91d4cd576fcfcc3344435ba7b3b650ea24a1251763ee"} Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.782994 4898 scope.go:117] "RemoveContainer" containerID="0b355d36acba975b937e3513e16f6c9b056100e8a7cb8f4e84d596f542de18b5" Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.783144 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-r4rvf" Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.792698 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-648cbb8b5f-4kb5b" Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.815204 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vs27f\" (UniqueName: \"kubernetes.io/projected/703503be-2f03-4e95-b4ba-ebdd30b717ee-kube-api-access-vs27f\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.858098 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "703503be-2f03-4e95-b4ba-ebdd30b717ee" (UID: "703503be-2f03-4e95-b4ba-ebdd30b717ee"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.861543 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-648cbb8b5f-4kb5b" podStartSLOduration=2.252938428 podStartE2EDuration="4.86152689s" podCreationTimestamp="2026-03-13 14:25:42 +0000 UTC" firstStartedPulling="2026-03-13 14:25:43.782892333 +0000 UTC m=+1778.784480572" lastFinishedPulling="2026-03-13 14:25:46.391480755 +0000 UTC m=+1781.393069034" observedRunningTime="2026-03-13 14:25:46.840710053 +0000 UTC m=+1781.842298292" watchObservedRunningTime="2026-03-13 14:25:46.86152689 +0000 UTC m=+1781.863115129" Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.875361 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "703503be-2f03-4e95-b4ba-ebdd30b717ee" (UID: "703503be-2f03-4e95-b4ba-ebdd30b717ee"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.875880 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-config" (OuterVolumeSpecName: "config") pod "703503be-2f03-4e95-b4ba-ebdd30b717ee" (UID: "703503be-2f03-4e95-b4ba-ebdd30b717ee"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.879524 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "703503be-2f03-4e95-b4ba-ebdd30b717ee" (UID: "703503be-2f03-4e95-b4ba-ebdd30b717ee"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.888938 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "703503be-2f03-4e95-b4ba-ebdd30b717ee" (UID: "703503be-2f03-4e95-b4ba-ebdd30b717ee"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.921507 4898 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-config\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.921539 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.921579 4898 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.921592 4898 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.921603 4898 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:46 crc kubenswrapper[4898]: I0313 14:25:46.949114 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "703503be-2f03-4e95-b4ba-ebdd30b717ee" (UID: "703503be-2f03-4e95-b4ba-ebdd30b717ee"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:25:47 crc kubenswrapper[4898]: I0313 14:25:47.023916 4898 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/703503be-2f03-4e95-b4ba-ebdd30b717ee-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:47 crc kubenswrapper[4898]: I0313 14:25:47.055085 4898 scope.go:117] "RemoveContainer" containerID="ab7dee171473df88511004b0f9cd06f3de427bbb59dba778bc1dbd3f8e29abeb" Mar 13 14:25:47 crc kubenswrapper[4898]: I0313 14:25:47.117524 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-r4rvf"] Mar 13 14:25:47 crc kubenswrapper[4898]: I0313 14:25:47.149233 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-r4rvf"] Mar 13 14:25:47 crc kubenswrapper[4898]: I0313 14:25:47.756711 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="703503be-2f03-4e95-b4ba-ebdd30b717ee" path="/var/lib/kubelet/pods/703503be-2f03-4e95-b4ba-ebdd30b717ee/volumes" Mar 13 14:25:47 crc kubenswrapper[4898]: I0313 14:25:47.809462 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-648cbb8b5f-4kb5b" event={"ID":"739e9c4a-9843-4edf-a045-2f7ef8d15b5e","Type":"ContainerStarted","Data":"02515cf8c295d1ea1e4a384e101d892c550963757aae50ec9210b86c71be25a1"} Mar 13 14:25:47 crc kubenswrapper[4898]: I0313 14:25:47.811833 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5df9b5999-7tt4b" event={"ID":"03c552ae-5860-4468-a612-7af3d3587df4","Type":"ContainerStarted","Data":"e35296bb12f025606d86ea55beb23f4933c6b981cfa16d4b258e576d830c3ddd"} Mar 13 14:25:47 crc kubenswrapper[4898]: I0313 
14:25:47.812111 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-5df9b5999-7tt4b" Mar 13 14:25:47 crc kubenswrapper[4898]: I0313 14:25:47.851383 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-5df9b5999-7tt4b" podStartSLOduration=3.254786594 podStartE2EDuration="5.85135465s" podCreationTimestamp="2026-03-13 14:25:42 +0000 UTC" firstStartedPulling="2026-03-13 14:25:43.796788998 +0000 UTC m=+1778.798377237" lastFinishedPulling="2026-03-13 14:25:46.393357054 +0000 UTC m=+1781.394945293" observedRunningTime="2026-03-13 14:25:47.837627829 +0000 UTC m=+1782.839216068" watchObservedRunningTime="2026-03-13 14:25:47.85135465 +0000 UTC m=+1782.852942929" Mar 13 14:25:48 crc kubenswrapper[4898]: I0313 14:25:48.739399 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" Mar 13 14:25:48 crc kubenswrapper[4898]: E0313 14:25:48.740130 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:25:53 crc kubenswrapper[4898]: I0313 14:25:53.116507 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-6b446d7755-5724r" Mar 13 14:25:53 crc kubenswrapper[4898]: I0313 14:25:53.202096 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-5b6c75676b-jx6kl"] Mar 13 14:25:53 crc kubenswrapper[4898]: I0313 14:25:53.202465 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-5b6c75676b-jx6kl" podUID="ad94280e-6f02-4129-9cdc-c35499f5d5e4" 
containerName="heat-engine" containerID="cri-o://75f3f93f088797c8af38650d54ae73681b337d83c946b53b20ea72e33b4509c0" gracePeriod=60 Mar 13 14:25:54 crc kubenswrapper[4898]: I0313 14:25:54.897164 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-648cbb8b5f-4kb5b" Mar 13 14:25:54 crc kubenswrapper[4898]: I0313 14:25:54.986828 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5f97b49ff6-67dbr"] Mar 13 14:25:54 crc kubenswrapper[4898]: I0313 14:25:54.987140 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-5f97b49ff6-67dbr" podUID="0a9180e2-91e9-4063-83a5-5b4ba75ca011" containerName="heat-api" containerID="cri-o://357d95fc5d8e0d7678f3ea6e54b764f618b159ca6eaa129aa3499f515c52ea42" gracePeriod=60 Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.156681 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs"] Mar 13 14:25:55 crc kubenswrapper[4898]: E0313 14:25:55.157263 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="703503be-2f03-4e95-b4ba-ebdd30b717ee" containerName="init" Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.157281 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="703503be-2f03-4e95-b4ba-ebdd30b717ee" containerName="init" Mar 13 14:25:55 crc kubenswrapper[4898]: E0313 14:25:55.157308 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="703503be-2f03-4e95-b4ba-ebdd30b717ee" containerName="dnsmasq-dns" Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.157315 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="703503be-2f03-4e95-b4ba-ebdd30b717ee" containerName="dnsmasq-dns" Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.157542 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="703503be-2f03-4e95-b4ba-ebdd30b717ee" containerName="dnsmasq-dns" Mar 13 14:25:55 crc 
kubenswrapper[4898]: I0313 14:25:55.158419 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs" Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.161057 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zsddr" Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.167284 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.167491 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.167491 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.182667 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs"] Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.238913 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs\" (UID: \"98336335-4b60-4ddf-8fe8-4ea6b69d47ef\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs" Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.239415 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ff55\" (UniqueName: \"kubernetes.io/projected/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-kube-api-access-7ff55\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs\" (UID: \"98336335-4b60-4ddf-8fe8-4ea6b69d47ef\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs" Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.239582 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs\" (UID: \"98336335-4b60-4ddf-8fe8-4ea6b69d47ef\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs" Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.239663 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs\" (UID: \"98336335-4b60-4ddf-8fe8-4ea6b69d47ef\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs" Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.341647 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs\" (UID: \"98336335-4b60-4ddf-8fe8-4ea6b69d47ef\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs" Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.341751 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs\" (UID: \"98336335-4b60-4ddf-8fe8-4ea6b69d47ef\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs" Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.341788 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ff55\" (UniqueName: \"kubernetes.io/projected/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-kube-api-access-7ff55\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs\" (UID: \"98336335-4b60-4ddf-8fe8-4ea6b69d47ef\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs" Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.342293 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs\" (UID: \"98336335-4b60-4ddf-8fe8-4ea6b69d47ef\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs" Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.349524 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs\" (UID: \"98336335-4b60-4ddf-8fe8-4ea6b69d47ef\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs" Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.353492 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs\" (UID: \"98336335-4b60-4ddf-8fe8-4ea6b69d47ef\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs" Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.353954 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-repo-setup-combined-ca-bundle\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs\" (UID: \"98336335-4b60-4ddf-8fe8-4ea6b69d47ef\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs" Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.363155 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ff55\" (UniqueName: \"kubernetes.io/projected/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-kube-api-access-7ff55\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs\" (UID: \"98336335-4b60-4ddf-8fe8-4ea6b69d47ef\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs" Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.491992 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs" Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.624628 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-5df9b5999-7tt4b" Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.707959 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-76b5758c54-vpp67"] Mar 13 14:25:55 crc kubenswrapper[4898]: I0313 14:25:55.708437 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-76b5758c54-vpp67" podUID="bd18ec2e-1196-4e66-a1c5-9e3daefd7171" containerName="heat-cfnapi" containerID="cri-o://458d14b015605aca08fe7cf01f1621cdf8583aae425994a9a4ecae32ee37064e" gracePeriod=60 Mar 13 14:25:56 crc kubenswrapper[4898]: I0313 14:25:56.251685 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs"] Mar 13 14:25:56 crc kubenswrapper[4898]: W0313 14:25:56.255005 4898 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98336335_4b60_4ddf_8fe8_4ea6b69d47ef.slice/crio-6844f40cb1716e51fac4b5bf0efd698139e685b8b7f6778ffb18ee6dd869ba69 WatchSource:0}: Error finding container 6844f40cb1716e51fac4b5bf0efd698139e685b8b7f6778ffb18ee6dd869ba69: Status 404 returned error can't find the container with id 6844f40cb1716e51fac4b5bf0efd698139e685b8b7f6778ffb18ee6dd869ba69 Mar 13 14:25:56 crc kubenswrapper[4898]: E0313 14:25:56.814095 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="75f3f93f088797c8af38650d54ae73681b337d83c946b53b20ea72e33b4509c0" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 13 14:25:56 crc kubenswrapper[4898]: E0313 14:25:56.816295 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="75f3f93f088797c8af38650d54ae73681b337d83c946b53b20ea72e33b4509c0" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 13 14:25:56 crc kubenswrapper[4898]: E0313 14:25:56.817935 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="75f3f93f088797c8af38650d54ae73681b337d83c946b53b20ea72e33b4509c0" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 13 14:25:56 crc kubenswrapper[4898]: E0313 14:25:56.817984 4898 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-5b6c75676b-jx6kl" podUID="ad94280e-6f02-4129-9cdc-c35499f5d5e4" containerName="heat-engine" Mar 13 14:25:56 crc kubenswrapper[4898]: 
I0313 14:25:56.937230 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs" event={"ID":"98336335-4b60-4ddf-8fe8-4ea6b69d47ef","Type":"ContainerStarted","Data":"6844f40cb1716e51fac4b5bf0efd698139e685b8b7f6778ffb18ee6dd869ba69"} Mar 13 14:25:57 crc kubenswrapper[4898]: I0313 14:25:57.760354 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-rr9kw"] Mar 13 14:25:57 crc kubenswrapper[4898]: I0313 14:25:57.772708 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-rr9kw"] Mar 13 14:25:57 crc kubenswrapper[4898]: I0313 14:25:57.956568 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-rzjjp"] Mar 13 14:25:57 crc kubenswrapper[4898]: I0313 14:25:57.958216 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-rzjjp" Mar 13 14:25:57 crc kubenswrapper[4898]: I0313 14:25:57.962360 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 13 14:25:57 crc kubenswrapper[4898]: I0313 14:25:57.971068 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-rzjjp"] Mar 13 14:25:58 crc kubenswrapper[4898]: I0313 14:25:58.054076 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdshh\" (UniqueName: \"kubernetes.io/projected/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-kube-api-access-kdshh\") pod \"aodh-db-sync-rzjjp\" (UID: \"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe\") " pod="openstack/aodh-db-sync-rzjjp" Mar 13 14:25:58 crc kubenswrapper[4898]: I0313 14:25:58.054192 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-combined-ca-bundle\") pod \"aodh-db-sync-rzjjp\" (UID: \"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe\") " 
pod="openstack/aodh-db-sync-rzjjp" Mar 13 14:25:58 crc kubenswrapper[4898]: I0313 14:25:58.054234 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-scripts\") pod \"aodh-db-sync-rzjjp\" (UID: \"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe\") " pod="openstack/aodh-db-sync-rzjjp" Mar 13 14:25:58 crc kubenswrapper[4898]: I0313 14:25:58.054314 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-config-data\") pod \"aodh-db-sync-rzjjp\" (UID: \"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe\") " pod="openstack/aodh-db-sync-rzjjp" Mar 13 14:25:58 crc kubenswrapper[4898]: I0313 14:25:58.156673 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdshh\" (UniqueName: \"kubernetes.io/projected/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-kube-api-access-kdshh\") pod \"aodh-db-sync-rzjjp\" (UID: \"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe\") " pod="openstack/aodh-db-sync-rzjjp" Mar 13 14:25:58 crc kubenswrapper[4898]: I0313 14:25:58.156749 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-combined-ca-bundle\") pod \"aodh-db-sync-rzjjp\" (UID: \"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe\") " pod="openstack/aodh-db-sync-rzjjp" Mar 13 14:25:58 crc kubenswrapper[4898]: I0313 14:25:58.156781 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-scripts\") pod \"aodh-db-sync-rzjjp\" (UID: \"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe\") " pod="openstack/aodh-db-sync-rzjjp" Mar 13 14:25:58 crc kubenswrapper[4898]: I0313 14:25:58.156887 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-config-data\") pod \"aodh-db-sync-rzjjp\" (UID: \"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe\") " pod="openstack/aodh-db-sync-rzjjp" Mar 13 14:25:58 crc kubenswrapper[4898]: I0313 14:25:58.162701 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-combined-ca-bundle\") pod \"aodh-db-sync-rzjjp\" (UID: \"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe\") " pod="openstack/aodh-db-sync-rzjjp" Mar 13 14:25:58 crc kubenswrapper[4898]: I0313 14:25:58.163564 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-scripts\") pod \"aodh-db-sync-rzjjp\" (UID: \"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe\") " pod="openstack/aodh-db-sync-rzjjp" Mar 13 14:25:58 crc kubenswrapper[4898]: I0313 14:25:58.163979 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-config-data\") pod \"aodh-db-sync-rzjjp\" (UID: \"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe\") " pod="openstack/aodh-db-sync-rzjjp" Mar 13 14:25:58 crc kubenswrapper[4898]: I0313 14:25:58.185611 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdshh\" (UniqueName: \"kubernetes.io/projected/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-kube-api-access-kdshh\") pod \"aodh-db-sync-rzjjp\" (UID: \"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe\") " pod="openstack/aodh-db-sync-rzjjp" Mar 13 14:25:58 crc kubenswrapper[4898]: I0313 14:25:58.279739 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-rzjjp" Mar 13 14:25:58 crc kubenswrapper[4898]: I0313 14:25:58.486424 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-5f97b49ff6-67dbr" podUID="0a9180e2-91e9-4063-83a5-5b4ba75ca011" containerName="heat-api" probeResult="failure" output="Get \"https://10.217.0.231:8004/healthcheck\": read tcp 10.217.0.2:57276->10.217.0.231:8004: read: connection reset by peer" Mar 13 14:25:58 crc kubenswrapper[4898]: I0313 14:25:58.879829 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-rzjjp"] Mar 13 14:25:58 crc kubenswrapper[4898]: I0313 14:25:58.988917 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-rzjjp" event={"ID":"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe","Type":"ContainerStarted","Data":"f822b9751765ca4646d812007e541e13e282235ebbcdb7da0060744bf5f1941d"} Mar 13 14:25:58 crc kubenswrapper[4898]: I0313 14:25:58.996461 4898 generic.go:334] "Generic (PLEG): container finished" podID="0a9180e2-91e9-4063-83a5-5b4ba75ca011" containerID="357d95fc5d8e0d7678f3ea6e54b764f618b159ca6eaa129aa3499f515c52ea42" exitCode=0 Mar 13 14:25:58 crc kubenswrapper[4898]: I0313 14:25:58.996510 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5f97b49ff6-67dbr" event={"ID":"0a9180e2-91e9-4063-83a5-5b4ba75ca011","Type":"ContainerDied","Data":"357d95fc5d8e0d7678f3ea6e54b764f618b159ca6eaa129aa3499f515c52ea42"} Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.133488 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.333858 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-config-data\") pod \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.334012 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-combined-ca-bundle\") pod \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.334041 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-public-tls-certs\") pod \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.334094 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-config-data-custom\") pod \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.334118 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bz9q2\" (UniqueName: \"kubernetes.io/projected/0a9180e2-91e9-4063-83a5-5b4ba75ca011-kube-api-access-bz9q2\") pod \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.334230 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-internal-tls-certs\") pod \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\" (UID: \"0a9180e2-91e9-4063-83a5-5b4ba75ca011\") " Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.340462 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a9180e2-91e9-4063-83a5-5b4ba75ca011-kube-api-access-bz9q2" (OuterVolumeSpecName: "kube-api-access-bz9q2") pod "0a9180e2-91e9-4063-83a5-5b4ba75ca011" (UID: "0a9180e2-91e9-4063-83a5-5b4ba75ca011"). InnerVolumeSpecName "kube-api-access-bz9q2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.370026 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0a9180e2-91e9-4063-83a5-5b4ba75ca011" (UID: "0a9180e2-91e9-4063-83a5-5b4ba75ca011"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.436931 4898 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.436999 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bz9q2\" (UniqueName: \"kubernetes.io/projected/0a9180e2-91e9-4063-83a5-5b4ba75ca011-kube-api-access-bz9q2\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.472704 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-config-data" (OuterVolumeSpecName: "config-data") pod "0a9180e2-91e9-4063-83a5-5b4ba75ca011" (UID: "0a9180e2-91e9-4063-83a5-5b4ba75ca011"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.474756 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a9180e2-91e9-4063-83a5-5b4ba75ca011" (UID: "0a9180e2-91e9-4063-83a5-5b4ba75ca011"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.503529 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0a9180e2-91e9-4063-83a5-5b4ba75ca011" (UID: "0a9180e2-91e9-4063-83a5-5b4ba75ca011"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.511682 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0a9180e2-91e9-4063-83a5-5b4ba75ca011" (UID: "0a9180e2-91e9-4063-83a5-5b4ba75ca011"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.539052 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.539074 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.539085 4898 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.539094 4898 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a9180e2-91e9-4063-83a5-5b4ba75ca011-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.742735 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" Mar 13 14:25:59 crc kubenswrapper[4898]: E0313 14:25:59.743109 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.754065 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.784251 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ed6478d-e1a3-4587-813f-222e6c4e54d7" path="/var/lib/kubelet/pods/6ed6478d-e1a3-4587-813f-222e6c4e54d7/volumes" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.845565 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-config-data-custom\") pod \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.845671 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-public-tls-certs\") pod \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.846099 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkpj4\" (UniqueName: \"kubernetes.io/projected/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-kube-api-access-tkpj4\") pod \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.846378 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-config-data\") pod \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.846805 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-internal-tls-certs\") pod \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.846885 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-combined-ca-bundle\") pod \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\" (UID: \"bd18ec2e-1196-4e66-a1c5-9e3daefd7171\") " Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.851235 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "bd18ec2e-1196-4e66-a1c5-9e3daefd7171" (UID: "bd18ec2e-1196-4e66-a1c5-9e3daefd7171"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.851429 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-kube-api-access-tkpj4" (OuterVolumeSpecName: "kube-api-access-tkpj4") pod "bd18ec2e-1196-4e66-a1c5-9e3daefd7171" (UID: "bd18ec2e-1196-4e66-a1c5-9e3daefd7171"). InnerVolumeSpecName "kube-api-access-tkpj4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.902703 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd18ec2e-1196-4e66-a1c5-9e3daefd7171" (UID: "bd18ec2e-1196-4e66-a1c5-9e3daefd7171"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.906495 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "bd18ec2e-1196-4e66-a1c5-9e3daefd7171" (UID: "bd18ec2e-1196-4e66-a1c5-9e3daefd7171"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.925880 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "bd18ec2e-1196-4e66-a1c5-9e3daefd7171" (UID: "bd18ec2e-1196-4e66-a1c5-9e3daefd7171"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.937105 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-config-data" (OuterVolumeSpecName: "config-data") pod "bd18ec2e-1196-4e66-a1c5-9e3daefd7171" (UID: "bd18ec2e-1196-4e66-a1c5-9e3daefd7171"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.951848 4898 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.951884 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.951992 4898 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.952011 4898 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.952023 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkpj4\" (UniqueName: \"kubernetes.io/projected/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-kube-api-access-tkpj4\") on node \"crc\" DevicePath \"\"" Mar 13 14:25:59 crc kubenswrapper[4898]: I0313 14:25:59.952036 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd18ec2e-1196-4e66-a1c5-9e3daefd7171-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.010030 4898 generic.go:334] "Generic (PLEG): container finished" podID="6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835" containerID="6ac994a64cbced8d5ed2ad37e427a3eeb5d4669d67bcb7a943f6946233be58c4" exitCode=0 Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 
14:26:00.010095 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835","Type":"ContainerDied","Data":"6ac994a64cbced8d5ed2ad37e427a3eeb5d4669d67bcb7a943f6946233be58c4"} Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.022491 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5f97b49ff6-67dbr" Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.022486 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5f97b49ff6-67dbr" event={"ID":"0a9180e2-91e9-4063-83a5-5b4ba75ca011","Type":"ContainerDied","Data":"f8532eef577636e4f0bae3c5d04fb7af834ed690d3b4263f9713e1e00c75cbcb"} Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.023034 4898 scope.go:117] "RemoveContainer" containerID="357d95fc5d8e0d7678f3ea6e54b764f618b159ca6eaa129aa3499f515c52ea42" Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.028527 4898 generic.go:334] "Generic (PLEG): container finished" podID="8d188301-848c-4cf6-a204-e1110714c1be" containerID="0d8797262833812626f4d3e0e1db3d064a9feac6dd8c4aab149c760269a9a573" exitCode=0 Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.028595 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"8d188301-848c-4cf6-a204-e1110714c1be","Type":"ContainerDied","Data":"0d8797262833812626f4d3e0e1db3d064a9feac6dd8c4aab149c760269a9a573"} Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.033879 4898 generic.go:334] "Generic (PLEG): container finished" podID="bd18ec2e-1196-4e66-a1c5-9e3daefd7171" containerID="458d14b015605aca08fe7cf01f1621cdf8583aae425994a9a4ecae32ee37064e" exitCode=0 Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.033964 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-76b5758c54-vpp67" 
event={"ID":"bd18ec2e-1196-4e66-a1c5-9e3daefd7171","Type":"ContainerDied","Data":"458d14b015605aca08fe7cf01f1621cdf8583aae425994a9a4ecae32ee37064e"} Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.034000 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-76b5758c54-vpp67" event={"ID":"bd18ec2e-1196-4e66-a1c5-9e3daefd7171","Type":"ContainerDied","Data":"910b4f0096d337a99b11342bf407ba83fa521765a628b5f816cab829a8a6b279"} Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.034065 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-76b5758c54-vpp67" Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.078321 4898 scope.go:117] "RemoveContainer" containerID="458d14b015605aca08fe7cf01f1621cdf8583aae425994a9a4ecae32ee37064e" Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.086225 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5f97b49ff6-67dbr"] Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.106705 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-5f97b49ff6-67dbr"] Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.132300 4898 scope.go:117] "RemoveContainer" containerID="458d14b015605aca08fe7cf01f1621cdf8583aae425994a9a4ecae32ee37064e" Mar 13 14:26:00 crc kubenswrapper[4898]: E0313 14:26:00.151794 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"458d14b015605aca08fe7cf01f1621cdf8583aae425994a9a4ecae32ee37064e\": container with ID starting with 458d14b015605aca08fe7cf01f1621cdf8583aae425994a9a4ecae32ee37064e not found: ID does not exist" containerID="458d14b015605aca08fe7cf01f1621cdf8583aae425994a9a4ecae32ee37064e" Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.151852 4898 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"458d14b015605aca08fe7cf01f1621cdf8583aae425994a9a4ecae32ee37064e"} err="failed to get container status \"458d14b015605aca08fe7cf01f1621cdf8583aae425994a9a4ecae32ee37064e\": rpc error: code = NotFound desc = could not find container \"458d14b015605aca08fe7cf01f1621cdf8583aae425994a9a4ecae32ee37064e\": container with ID starting with 458d14b015605aca08fe7cf01f1621cdf8583aae425994a9a4ecae32ee37064e not found: ID does not exist" Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.164545 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-76b5758c54-vpp67"] Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.188002 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-76b5758c54-vpp67"] Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.285035 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556866-wnmcp"] Mar 13 14:26:00 crc kubenswrapper[4898]: E0313 14:26:00.285692 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a9180e2-91e9-4063-83a5-5b4ba75ca011" containerName="heat-api" Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.285711 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a9180e2-91e9-4063-83a5-5b4ba75ca011" containerName="heat-api" Mar 13 14:26:00 crc kubenswrapper[4898]: E0313 14:26:00.285760 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd18ec2e-1196-4e66-a1c5-9e3daefd7171" containerName="heat-cfnapi" Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.285770 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd18ec2e-1196-4e66-a1c5-9e3daefd7171" containerName="heat-cfnapi" Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.286188 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a9180e2-91e9-4063-83a5-5b4ba75ca011" containerName="heat-api" Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.286242 4898 
memory_manager.go:354] "RemoveStaleState removing state" podUID="bd18ec2e-1196-4e66-a1c5-9e3daefd7171" containerName="heat-cfnapi" Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.287664 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556866-wnmcp" Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.328710 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.328965 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.329156 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.349961 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556866-wnmcp"] Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.472277 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs9zh\" (UniqueName: \"kubernetes.io/projected/45988deb-1057-4d89-a977-35978404b407-kube-api-access-xs9zh\") pod \"auto-csr-approver-29556866-wnmcp\" (UID: \"45988deb-1057-4d89-a977-35978404b407\") " pod="openshift-infra/auto-csr-approver-29556866-wnmcp" Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.578545 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs9zh\" (UniqueName: \"kubernetes.io/projected/45988deb-1057-4d89-a977-35978404b407-kube-api-access-xs9zh\") pod \"auto-csr-approver-29556866-wnmcp\" (UID: \"45988deb-1057-4d89-a977-35978404b407\") " pod="openshift-infra/auto-csr-approver-29556866-wnmcp" Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.597516 4898 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-xs9zh\" (UniqueName: \"kubernetes.io/projected/45988deb-1057-4d89-a977-35978404b407-kube-api-access-xs9zh\") pod \"auto-csr-approver-29556866-wnmcp\" (UID: \"45988deb-1057-4d89-a977-35978404b407\") " pod="openshift-infra/auto-csr-approver-29556866-wnmcp" Mar 13 14:26:00 crc kubenswrapper[4898]: I0313 14:26:00.677000 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556866-wnmcp" Mar 13 14:26:01 crc kubenswrapper[4898]: I0313 14:26:01.049424 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835","Type":"ContainerStarted","Data":"27bc08fa0deb28c03994a83019ce147192bbbaed31f89a254828eb70eb13ce4c"} Mar 13 14:26:01 crc kubenswrapper[4898]: I0313 14:26:01.052040 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:26:01 crc kubenswrapper[4898]: I0313 14:26:01.057111 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"8d188301-848c-4cf6-a204-e1110714c1be","Type":"ContainerStarted","Data":"a7d95af5df861d956495ba689443c8235e4cfd127a9bbf700f87d52e9cd44b0f"} Mar 13 14:26:01 crc kubenswrapper[4898]: I0313 14:26:01.058540 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-2" Mar 13 14:26:01 crc kubenswrapper[4898]: I0313 14:26:01.103981 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.103963255 podStartE2EDuration="37.103963255s" podCreationTimestamp="2026-03-13 14:25:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:26:01.079222395 +0000 UTC m=+1796.080810644" watchObservedRunningTime="2026-03-13 14:26:01.103963255 +0000 UTC 
m=+1796.105551494" Mar 13 14:26:01 crc kubenswrapper[4898]: I0313 14:26:01.112048 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-2" podStartSLOduration=44.112001707 podStartE2EDuration="44.112001707s" podCreationTimestamp="2026-03-13 14:25:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:26:01.102360313 +0000 UTC m=+1796.103948552" watchObservedRunningTime="2026-03-13 14:26:01.112001707 +0000 UTC m=+1796.113589946" Mar 13 14:26:01 crc kubenswrapper[4898]: I0313 14:26:01.218075 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556866-wnmcp"] Mar 13 14:26:01 crc kubenswrapper[4898]: I0313 14:26:01.752124 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a9180e2-91e9-4063-83a5-5b4ba75ca011" path="/var/lib/kubelet/pods/0a9180e2-91e9-4063-83a5-5b4ba75ca011/volumes" Mar 13 14:26:01 crc kubenswrapper[4898]: I0313 14:26:01.752685 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd18ec2e-1196-4e66-a1c5-9e3daefd7171" path="/var/lib/kubelet/pods/bd18ec2e-1196-4e66-a1c5-9e3daefd7171/volumes" Mar 13 14:26:02 crc kubenswrapper[4898]: I0313 14:26:02.106503 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556866-wnmcp" event={"ID":"45988deb-1057-4d89-a977-35978404b407","Type":"ContainerStarted","Data":"6d6ef88327f35d5663c9790d3655f3042fd1192ba9cb3ae98574961b538d2fe6"} Mar 13 14:26:06 crc kubenswrapper[4898]: I0313 14:26:06.462849 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 13 14:26:06 crc kubenswrapper[4898]: E0313 14:26:06.813436 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit 
code -1" containerID="75f3f93f088797c8af38650d54ae73681b337d83c946b53b20ea72e33b4509c0" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 13 14:26:06 crc kubenswrapper[4898]: E0313 14:26:06.815039 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="75f3f93f088797c8af38650d54ae73681b337d83c946b53b20ea72e33b4509c0" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 13 14:26:06 crc kubenswrapper[4898]: E0313 14:26:06.818500 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="75f3f93f088797c8af38650d54ae73681b337d83c946b53b20ea72e33b4509c0" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 13 14:26:06 crc kubenswrapper[4898]: E0313 14:26:06.818539 4898 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-5b6c75676b-jx6kl" podUID="ad94280e-6f02-4129-9cdc-c35499f5d5e4" containerName="heat-engine" Mar 13 14:26:12 crc kubenswrapper[4898]: I0313 14:26:12.271466 4898 generic.go:334] "Generic (PLEG): container finished" podID="ad94280e-6f02-4129-9cdc-c35499f5d5e4" containerID="75f3f93f088797c8af38650d54ae73681b337d83c946b53b20ea72e33b4509c0" exitCode=0 Mar 13 14:26:12 crc kubenswrapper[4898]: I0313 14:26:12.271697 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5b6c75676b-jx6kl" event={"ID":"ad94280e-6f02-4129-9cdc-c35499f5d5e4","Type":"ContainerDied","Data":"75f3f93f088797c8af38650d54ae73681b337d83c946b53b20ea72e33b4509c0"} Mar 13 14:26:12 crc kubenswrapper[4898]: I0313 14:26:12.740668 4898 scope.go:117] "RemoveContainer" 
containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" Mar 13 14:26:12 crc kubenswrapper[4898]: E0313 14:26:12.741734 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:26:14 crc kubenswrapper[4898]: E0313 14:26:14.561211 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest" Mar 13 14:26:14 crc kubenswrapper[4898]: E0313 14:26:14.563757 4898 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 13 14:26:14 crc kubenswrapper[4898]: container &Container{Name:repo-setup-edpm-deployment-openstack-edpm-ipam,Image:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,Command:[],Args:[ansible-runner run /runner -p playbook.yaml -i repo-setup-edpm-deployment-openstack-edpm-ipam],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ANSIBLE_VERBOSITY,Value:2,ValueFrom:nil,},EnvVar{Name:RUNNER_PLAYBOOK,Value: Mar 13 14:26:14 crc kubenswrapper[4898]: - hosts: all Mar 13 14:26:14 crc kubenswrapper[4898]: strategy: linear Mar 13 14:26:14 crc kubenswrapper[4898]: tasks: Mar 13 14:26:14 crc kubenswrapper[4898]: - name: Enable podified-repos Mar 13 14:26:14 crc kubenswrapper[4898]: become: true Mar 13 14:26:14 crc kubenswrapper[4898]: ansible.builtin.shell: | Mar 13 14:26:14 crc kubenswrapper[4898]: set -euxo pipefail Mar 13 14:26:14 crc kubenswrapper[4898]: pushd /var/tmp Mar 13 14:26:14 crc kubenswrapper[4898]: curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | 
tar -xz Mar 13 14:26:14 crc kubenswrapper[4898]: pushd repo-setup-main Mar 13 14:26:14 crc kubenswrapper[4898]: python3 -m venv ./venv Mar 13 14:26:14 crc kubenswrapper[4898]: PBR_VERSION=0.0.0 ./venv/bin/pip install ./ Mar 13 14:26:14 crc kubenswrapper[4898]: ./venv/bin/repo-setup current-podified -b antelope Mar 13 14:26:14 crc kubenswrapper[4898]: popd Mar 13 14:26:14 crc kubenswrapper[4898]: rm -rf repo-setup-main Mar 13 14:26:14 crc kubenswrapper[4898]: Mar 13 14:26:14 crc kubenswrapper[4898]: Mar 13 14:26:14 crc kubenswrapper[4898]: ,ValueFrom:nil,},EnvVar{Name:RUNNER_EXTRA_VARS,Value: Mar 13 14:26:14 crc kubenswrapper[4898]: edpm_override_hosts: openstack-edpm-ipam Mar 13 14:26:14 crc kubenswrapper[4898]: edpm_service_type: repo-setup Mar 13 14:26:14 crc kubenswrapper[4898]: Mar 13 14:26:14 crc kubenswrapper[4898]: Mar 13 14:26:14 crc kubenswrapper[4898]: ,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:repo-setup-combined-ca-bundle,ReadOnly:false,MountPath:/var/lib/openstack/cacerts/repo-setup,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key-openstack-edpm-ipam,ReadOnly:false,MountPath:/runner/env/ssh_key/ssh_key_openstack-edpm-ipam,SubPath:ssh_key_openstack-edpm-ipam,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:inventory,ReadOnly:false,MountPath:/runner/inventory/hosts,SubPath:inventory,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7ff55,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,Ru
nAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:openstack-aee-default-env,},Optional:*true,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs_openstack(98336335-4b60-4ddf-8fe8-4ea6b69d47ef): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled Mar 13 14:26:14 crc kubenswrapper[4898]: > logger="UnhandledError" Mar 13 14:26:14 crc kubenswrapper[4898]: E0313 14:26:14.567249 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"repo-setup-edpm-deployment-openstack-edpm-ipam\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs" podUID="98336335-4b60-4ddf-8fe8-4ea6b69d47ef" Mar 13 14:26:14 crc kubenswrapper[4898]: I0313 14:26:14.600261 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 13 14:26:14 crc kubenswrapper[4898]: I0313 14:26:14.869324 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.021122 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-5b6c75676b-jx6kl" Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.128935 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad94280e-6f02-4129-9cdc-c35499f5d5e4-config-data\") pod \"ad94280e-6f02-4129-9cdc-c35499f5d5e4\" (UID: \"ad94280e-6f02-4129-9cdc-c35499f5d5e4\") " Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.129290 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kskc6\" (UniqueName: \"kubernetes.io/projected/ad94280e-6f02-4129-9cdc-c35499f5d5e4-kube-api-access-kskc6\") pod \"ad94280e-6f02-4129-9cdc-c35499f5d5e4\" (UID: \"ad94280e-6f02-4129-9cdc-c35499f5d5e4\") " Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.129498 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad94280e-6f02-4129-9cdc-c35499f5d5e4-config-data-custom\") pod \"ad94280e-6f02-4129-9cdc-c35499f5d5e4\" (UID: \"ad94280e-6f02-4129-9cdc-c35499f5d5e4\") " Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.129569 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad94280e-6f02-4129-9cdc-c35499f5d5e4-combined-ca-bundle\") pod \"ad94280e-6f02-4129-9cdc-c35499f5d5e4\" (UID: \"ad94280e-6f02-4129-9cdc-c35499f5d5e4\") " Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.135620 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad94280e-6f02-4129-9cdc-c35499f5d5e4-kube-api-access-kskc6" (OuterVolumeSpecName: "kube-api-access-kskc6") pod "ad94280e-6f02-4129-9cdc-c35499f5d5e4" (UID: "ad94280e-6f02-4129-9cdc-c35499f5d5e4"). InnerVolumeSpecName "kube-api-access-kskc6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.136297 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad94280e-6f02-4129-9cdc-c35499f5d5e4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ad94280e-6f02-4129-9cdc-c35499f5d5e4" (UID: "ad94280e-6f02-4129-9cdc-c35499f5d5e4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.192345 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad94280e-6f02-4129-9cdc-c35499f5d5e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad94280e-6f02-4129-9cdc-c35499f5d5e4" (UID: "ad94280e-6f02-4129-9cdc-c35499f5d5e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.206692 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad94280e-6f02-4129-9cdc-c35499f5d5e4-config-data" (OuterVolumeSpecName: "config-data") pod "ad94280e-6f02-4129-9cdc-c35499f5d5e4" (UID: "ad94280e-6f02-4129-9cdc-c35499f5d5e4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.233643 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad94280e-6f02-4129-9cdc-c35499f5d5e4-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.233678 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kskc6\" (UniqueName: \"kubernetes.io/projected/ad94280e-6f02-4129-9cdc-c35499f5d5e4-kube-api-access-kskc6\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.233688 4898 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad94280e-6f02-4129-9cdc-c35499f5d5e4-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.233696 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad94280e-6f02-4129-9cdc-c35499f5d5e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.327769 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5b6c75676b-jx6kl" event={"ID":"ad94280e-6f02-4129-9cdc-c35499f5d5e4","Type":"ContainerDied","Data":"49d0d8e35c38306e9d9d2a68f113990c44c74cb3b5a7d200ee672f1ee07d5629"} Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.327833 4898 scope.go:117] "RemoveContainer" containerID="75f3f93f088797c8af38650d54ae73681b337d83c946b53b20ea72e33b4509c0" Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.327792 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-5b6c75676b-jx6kl" Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.330708 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556866-wnmcp" event={"ID":"45988deb-1057-4d89-a977-35978404b407","Type":"ContainerStarted","Data":"81721ca65e97448cfc7621215d98c3ea95987d960df48d9fdaef1b644754dcaf"} Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.333968 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-rzjjp" event={"ID":"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe","Type":"ContainerStarted","Data":"86e66a360586b19f53ca12cefc2c560bd5016283db35bb1e56ee1d68892fd634"} Mar 13 14:26:15 crc kubenswrapper[4898]: E0313 14:26:15.338041 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"repo-setup-edpm-deployment-openstack-edpm-ipam\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest\\\"\"" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs" podUID="98336335-4b60-4ddf-8fe8-4ea6b69d47ef" Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.356567 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556866-wnmcp" podStartSLOduration=1.957105782 podStartE2EDuration="15.356548787s" podCreationTimestamp="2026-03-13 14:26:00 +0000 UTC" firstStartedPulling="2026-03-13 14:26:01.22816168 +0000 UTC m=+1796.229749919" lastFinishedPulling="2026-03-13 14:26:14.627604675 +0000 UTC m=+1809.629192924" observedRunningTime="2026-03-13 14:26:15.345344962 +0000 UTC m=+1810.346933221" watchObservedRunningTime="2026-03-13 14:26:15.356548787 +0000 UTC m=+1810.358137016" Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.374769 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-rzjjp" podStartSLOduration=2.72011789 
podStartE2EDuration="18.374749435s" podCreationTimestamp="2026-03-13 14:25:57 +0000 UTC" firstStartedPulling="2026-03-13 14:25:58.941710288 +0000 UTC m=+1793.943298527" lastFinishedPulling="2026-03-13 14:26:14.596341833 +0000 UTC m=+1809.597930072" observedRunningTime="2026-03-13 14:26:15.362844692 +0000 UTC m=+1810.364432931" watchObservedRunningTime="2026-03-13 14:26:15.374749435 +0000 UTC m=+1810.376337674" Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.404840 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-5b6c75676b-jx6kl"] Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.415655 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-5b6c75676b-jx6kl"] Mar 13 14:26:15 crc kubenswrapper[4898]: I0313 14:26:15.755484 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad94280e-6f02-4129-9cdc-c35499f5d5e4" path="/var/lib/kubelet/pods/ad94280e-6f02-4129-9cdc-c35499f5d5e4/volumes" Mar 13 14:26:16 crc kubenswrapper[4898]: I0313 14:26:16.345217 4898 generic.go:334] "Generic (PLEG): container finished" podID="45988deb-1057-4d89-a977-35978404b407" containerID="81721ca65e97448cfc7621215d98c3ea95987d960df48d9fdaef1b644754dcaf" exitCode=0 Mar 13 14:26:16 crc kubenswrapper[4898]: I0313 14:26:16.345317 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556866-wnmcp" event={"ID":"45988deb-1057-4d89-a977-35978404b407","Type":"ContainerDied","Data":"81721ca65e97448cfc7621215d98c3ea95987d960df48d9fdaef1b644754dcaf"} Mar 13 14:26:17 crc kubenswrapper[4898]: I0313 14:26:17.658072 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-2" Mar 13 14:26:17 crc kubenswrapper[4898]: I0313 14:26:17.728288 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 13 14:26:17 crc kubenswrapper[4898]: I0313 14:26:17.873314 4898 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556866-wnmcp" Mar 13 14:26:18 crc kubenswrapper[4898]: I0313 14:26:18.013646 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xs9zh\" (UniqueName: \"kubernetes.io/projected/45988deb-1057-4d89-a977-35978404b407-kube-api-access-xs9zh\") pod \"45988deb-1057-4d89-a977-35978404b407\" (UID: \"45988deb-1057-4d89-a977-35978404b407\") " Mar 13 14:26:18 crc kubenswrapper[4898]: I0313 14:26:18.021122 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45988deb-1057-4d89-a977-35978404b407-kube-api-access-xs9zh" (OuterVolumeSpecName: "kube-api-access-xs9zh") pod "45988deb-1057-4d89-a977-35978404b407" (UID: "45988deb-1057-4d89-a977-35978404b407"). InnerVolumeSpecName "kube-api-access-xs9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:26:18 crc kubenswrapper[4898]: I0313 14:26:18.117112 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xs9zh\" (UniqueName: \"kubernetes.io/projected/45988deb-1057-4d89-a977-35978404b407-kube-api-access-xs9zh\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:18 crc kubenswrapper[4898]: I0313 14:26:18.381109 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556866-wnmcp" event={"ID":"45988deb-1057-4d89-a977-35978404b407","Type":"ContainerDied","Data":"6d6ef88327f35d5663c9790d3655f3042fd1192ba9cb3ae98574961b538d2fe6"} Mar 13 14:26:18 crc kubenswrapper[4898]: I0313 14:26:18.381174 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d6ef88327f35d5663c9790d3655f3042fd1192ba9cb3ae98574961b538d2fe6" Mar 13 14:26:18 crc kubenswrapper[4898]: I0313 14:26:18.381176 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556866-wnmcp" Mar 13 14:26:18 crc kubenswrapper[4898]: I0313 14:26:18.435362 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556860-xbwgj"] Mar 13 14:26:18 crc kubenswrapper[4898]: I0313 14:26:18.450481 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556860-xbwgj"] Mar 13 14:26:19 crc kubenswrapper[4898]: I0313 14:26:19.392263 4898 generic.go:334] "Generic (PLEG): container finished" podID="c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe" containerID="86e66a360586b19f53ca12cefc2c560bd5016283db35bb1e56ee1d68892fd634" exitCode=0 Mar 13 14:26:19 crc kubenswrapper[4898]: I0313 14:26:19.393316 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-rzjjp" event={"ID":"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe","Type":"ContainerDied","Data":"86e66a360586b19f53ca12cefc2c560bd5016283db35bb1e56ee1d68892fd634"} Mar 13 14:26:19 crc kubenswrapper[4898]: I0313 14:26:19.760130 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02521dff-1dee-4839-ab35-a4bfa82bc405" path="/var/lib/kubelet/pods/02521dff-1dee-4839-ab35-a4bfa82bc405/volumes" Mar 13 14:26:20 crc kubenswrapper[4898]: I0313 14:26:20.873073 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-rzjjp" Mar 13 14:26:20 crc kubenswrapper[4898]: I0313 14:26:20.983927 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-config-data\") pod \"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe\" (UID: \"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe\") " Mar 13 14:26:20 crc kubenswrapper[4898]: I0313 14:26:20.984311 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdshh\" (UniqueName: \"kubernetes.io/projected/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-kube-api-access-kdshh\") pod \"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe\" (UID: \"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe\") " Mar 13 14:26:20 crc kubenswrapper[4898]: I0313 14:26:20.984373 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-scripts\") pod \"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe\" (UID: \"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe\") " Mar 13 14:26:20 crc kubenswrapper[4898]: I0313 14:26:20.984581 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-combined-ca-bundle\") pod \"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe\" (UID: \"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe\") " Mar 13 14:26:20 crc kubenswrapper[4898]: I0313 14:26:20.989401 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-scripts" (OuterVolumeSpecName: "scripts") pod "c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe" (UID: "c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:26:20 crc kubenswrapper[4898]: I0313 14:26:20.989469 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-kube-api-access-kdshh" (OuterVolumeSpecName: "kube-api-access-kdshh") pod "c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe" (UID: "c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe"). InnerVolumeSpecName "kube-api-access-kdshh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:26:21 crc kubenswrapper[4898]: I0313 14:26:21.019940 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe" (UID: "c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:26:21 crc kubenswrapper[4898]: I0313 14:26:21.034131 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-config-data" (OuterVolumeSpecName: "config-data") pod "c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe" (UID: "c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:26:21 crc kubenswrapper[4898]: I0313 14:26:21.087511 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:21 crc kubenswrapper[4898]: I0313 14:26:21.087543 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:21 crc kubenswrapper[4898]: I0313 14:26:21.087555 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:21 crc kubenswrapper[4898]: I0313 14:26:21.087564 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdshh\" (UniqueName: \"kubernetes.io/projected/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe-kube-api-access-kdshh\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:21 crc kubenswrapper[4898]: I0313 14:26:21.416738 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-rzjjp" event={"ID":"c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe","Type":"ContainerDied","Data":"f822b9751765ca4646d812007e541e13e282235ebbcdb7da0060744bf5f1941d"} Mar 13 14:26:21 crc kubenswrapper[4898]: I0313 14:26:21.416777 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f822b9751765ca4646d812007e541e13e282235ebbcdb7da0060744bf5f1941d" Mar 13 14:26:21 crc kubenswrapper[4898]: I0313 14:26:21.416834 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-rzjjp" Mar 13 14:26:22 crc kubenswrapper[4898]: I0313 14:26:22.131163 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-1" podUID="818e3f41-30c4-4a49-b490-0d868fc2b2b8" containerName="rabbitmq" containerID="cri-o://122b0ca68068adfd3963153faa26d42ce9ae7ae836229a17a2096dab37be0af4" gracePeriod=604796 Mar 13 14:26:22 crc kubenswrapper[4898]: I0313 14:26:22.960456 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 13 14:26:22 crc kubenswrapper[4898]: I0313 14:26:22.961215 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="88246540-ca61-4fb0-8934-c8ebb4559860" containerName="aodh-api" containerID="cri-o://a274deee7baf2c38ad5a6692d8a099f9a97dda17b39d57d5a6fb5cd7aca71860" gracePeriod=30 Mar 13 14:26:22 crc kubenswrapper[4898]: I0313 14:26:22.961323 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="88246540-ca61-4fb0-8934-c8ebb4559860" containerName="aodh-notifier" containerID="cri-o://c83e4fda188ec43992d3ce1b3047566dea50f419ae2e6389d523891cfdc5bf75" gracePeriod=30 Mar 13 14:26:22 crc kubenswrapper[4898]: I0313 14:26:22.961322 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="88246540-ca61-4fb0-8934-c8ebb4559860" containerName="aodh-listener" containerID="cri-o://f1a7b03523d4185dcbadb339dd340f86f0fe7637d1feff68130acfc4930e6831" gracePeriod=30 Mar 13 14:26:22 crc kubenswrapper[4898]: I0313 14:26:22.961358 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="88246540-ca61-4fb0-8934-c8ebb4559860" containerName="aodh-evaluator" containerID="cri-o://8453994fd2156143da3704e7af63c854727a63f89845c2f4e51b2efe260b622e" gracePeriod=30 Mar 13 14:26:23 crc kubenswrapper[4898]: I0313 14:26:23.468963 4898 generic.go:334] "Generic 
(PLEG): container finished" podID="88246540-ca61-4fb0-8934-c8ebb4559860" containerID="8453994fd2156143da3704e7af63c854727a63f89845c2f4e51b2efe260b622e" exitCode=0 Mar 13 14:26:23 crc kubenswrapper[4898]: I0313 14:26:23.469006 4898 generic.go:334] "Generic (PLEG): container finished" podID="88246540-ca61-4fb0-8934-c8ebb4559860" containerID="a274deee7baf2c38ad5a6692d8a099f9a97dda17b39d57d5a6fb5cd7aca71860" exitCode=0 Mar 13 14:26:23 crc kubenswrapper[4898]: I0313 14:26:23.469038 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"88246540-ca61-4fb0-8934-c8ebb4559860","Type":"ContainerDied","Data":"8453994fd2156143da3704e7af63c854727a63f89845c2f4e51b2efe260b622e"} Mar 13 14:26:23 crc kubenswrapper[4898]: I0313 14:26:23.469077 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"88246540-ca61-4fb0-8934-c8ebb4559860","Type":"ContainerDied","Data":"a274deee7baf2c38ad5a6692d8a099f9a97dda17b39d57d5a6fb5cd7aca71860"} Mar 13 14:26:24 crc kubenswrapper[4898]: I0313 14:26:24.739748 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" Mar 13 14:26:24 crc kubenswrapper[4898]: E0313 14:26:24.741471 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:26:26 crc kubenswrapper[4898]: I0313 14:26:26.509927 4898 generic.go:334] "Generic (PLEG): container finished" podID="88246540-ca61-4fb0-8934-c8ebb4559860" containerID="f1a7b03523d4185dcbadb339dd340f86f0fe7637d1feff68130acfc4930e6831" exitCode=0 Mar 13 14:26:26 crc kubenswrapper[4898]: I0313 14:26:26.510489 4898 
generic.go:334] "Generic (PLEG): container finished" podID="88246540-ca61-4fb0-8934-c8ebb4559860" containerID="c83e4fda188ec43992d3ce1b3047566dea50f419ae2e6389d523891cfdc5bf75" exitCode=0 Mar 13 14:26:26 crc kubenswrapper[4898]: I0313 14:26:26.510517 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"88246540-ca61-4fb0-8934-c8ebb4559860","Type":"ContainerDied","Data":"f1a7b03523d4185dcbadb339dd340f86f0fe7637d1feff68130acfc4930e6831"} Mar 13 14:26:26 crc kubenswrapper[4898]: I0313 14:26:26.510550 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"88246540-ca61-4fb0-8934-c8ebb4559860","Type":"ContainerDied","Data":"c83e4fda188ec43992d3ce1b3047566dea50f419ae2e6389d523891cfdc5bf75"} Mar 13 14:26:26 crc kubenswrapper[4898]: I0313 14:26:26.933588 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.032236 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-config-data\") pod \"88246540-ca61-4fb0-8934-c8ebb4559860\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.032371 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-internal-tls-certs\") pod \"88246540-ca61-4fb0-8934-c8ebb4559860\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.032407 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-public-tls-certs\") pod \"88246540-ca61-4fb0-8934-c8ebb4559860\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " Mar 
13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.032454 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxbj9\" (UniqueName: \"kubernetes.io/projected/88246540-ca61-4fb0-8934-c8ebb4559860-kube-api-access-sxbj9\") pod \"88246540-ca61-4fb0-8934-c8ebb4559860\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.032544 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-combined-ca-bundle\") pod \"88246540-ca61-4fb0-8934-c8ebb4559860\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.032738 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-scripts\") pod \"88246540-ca61-4fb0-8934-c8ebb4559860\" (UID: \"88246540-ca61-4fb0-8934-c8ebb4559860\") " Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.039609 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-scripts" (OuterVolumeSpecName: "scripts") pod "88246540-ca61-4fb0-8934-c8ebb4559860" (UID: "88246540-ca61-4fb0-8934-c8ebb4559860"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.070361 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88246540-ca61-4fb0-8934-c8ebb4559860-kube-api-access-sxbj9" (OuterVolumeSpecName: "kube-api-access-sxbj9") pod "88246540-ca61-4fb0-8934-c8ebb4559860" (UID: "88246540-ca61-4fb0-8934-c8ebb4559860"). InnerVolumeSpecName "kube-api-access-sxbj9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.112256 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "88246540-ca61-4fb0-8934-c8ebb4559860" (UID: "88246540-ca61-4fb0-8934-c8ebb4559860"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.136602 4898 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.136632 4898 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.136643 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxbj9\" (UniqueName: \"kubernetes.io/projected/88246540-ca61-4fb0-8934-c8ebb4559860-kube-api-access-sxbj9\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.188083 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "88246540-ca61-4fb0-8934-c8ebb4559860" (UID: "88246540-ca61-4fb0-8934-c8ebb4559860"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.215609 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88246540-ca61-4fb0-8934-c8ebb4559860" (UID: "88246540-ca61-4fb0-8934-c8ebb4559860"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.238773 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.238807 4898 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.272765 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-1" podUID="818e3f41-30c4-4a49-b490-0d868fc2b2b8" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.136:5671: connect: connection refused" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.276758 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-config-data" (OuterVolumeSpecName: "config-data") pod "88246540-ca61-4fb0-8934-c8ebb4559860" (UID: "88246540-ca61-4fb0-8934-c8ebb4559860"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.340616 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88246540-ca61-4fb0-8934-c8ebb4559860-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.524650 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"88246540-ca61-4fb0-8934-c8ebb4559860","Type":"ContainerDied","Data":"4f9554aea31a54e9ad03a3bc5d51fd2b9355c4b2f2434a00fdafabcd84f13b07"} Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.525715 4898 scope.go:117] "RemoveContainer" containerID="f1a7b03523d4185dcbadb339dd340f86f0fe7637d1feff68130acfc4930e6831" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.524711 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.564288 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.566272 4898 scope.go:117] "RemoveContainer" containerID="c83e4fda188ec43992d3ce1b3047566dea50f419ae2e6389d523891cfdc5bf75" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.582855 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.595520 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 13 14:26:27 crc kubenswrapper[4898]: E0313 14:26:27.596082 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88246540-ca61-4fb0-8934-c8ebb4559860" containerName="aodh-listener" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.596102 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="88246540-ca61-4fb0-8934-c8ebb4559860" containerName="aodh-listener" Mar 13 14:26:27 crc kubenswrapper[4898]: E0313 
14:26:27.596116 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45988deb-1057-4d89-a977-35978404b407" containerName="oc" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.596124 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="45988deb-1057-4d89-a977-35978404b407" containerName="oc" Mar 13 14:26:27 crc kubenswrapper[4898]: E0313 14:26:27.596141 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad94280e-6f02-4129-9cdc-c35499f5d5e4" containerName="heat-engine" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.596147 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad94280e-6f02-4129-9cdc-c35499f5d5e4" containerName="heat-engine" Mar 13 14:26:27 crc kubenswrapper[4898]: E0313 14:26:27.596158 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88246540-ca61-4fb0-8934-c8ebb4559860" containerName="aodh-notifier" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.596164 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="88246540-ca61-4fb0-8934-c8ebb4559860" containerName="aodh-notifier" Mar 13 14:26:27 crc kubenswrapper[4898]: E0313 14:26:27.596174 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88246540-ca61-4fb0-8934-c8ebb4559860" containerName="aodh-evaluator" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.596180 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="88246540-ca61-4fb0-8934-c8ebb4559860" containerName="aodh-evaluator" Mar 13 14:26:27 crc kubenswrapper[4898]: E0313 14:26:27.596196 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88246540-ca61-4fb0-8934-c8ebb4559860" containerName="aodh-api" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.596203 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="88246540-ca61-4fb0-8934-c8ebb4559860" containerName="aodh-api" Mar 13 14:26:27 crc kubenswrapper[4898]: E0313 14:26:27.596226 4898 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe" containerName="aodh-db-sync" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.596233 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe" containerName="aodh-db-sync" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.596463 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="45988deb-1057-4d89-a977-35978404b407" containerName="oc" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.596477 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="88246540-ca61-4fb0-8934-c8ebb4559860" containerName="aodh-api" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.596516 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="88246540-ca61-4fb0-8934-c8ebb4559860" containerName="aodh-evaluator" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.596533 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad94280e-6f02-4129-9cdc-c35499f5d5e4" containerName="heat-engine" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.596545 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="88246540-ca61-4fb0-8934-c8ebb4559860" containerName="aodh-listener" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.596557 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe" containerName="aodh-db-sync" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.596583 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="88246540-ca61-4fb0-8934-c8ebb4559860" containerName="aodh-notifier" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.599137 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.602502 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.602608 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.602831 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.603008 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-tnpwg" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.605795 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.609359 4898 scope.go:117] "RemoveContainer" containerID="8453994fd2156143da3704e7af63c854727a63f89845c2f4e51b2efe260b622e" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.613974 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.655917 4898 scope.go:117] "RemoveContainer" containerID="a274deee7baf2c38ad5a6692d8a099f9a97dda17b39d57d5a6fb5cd7aca71860" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.759441 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a27645af-4d4a-4a73-ba8a-488a9ae199ac-combined-ca-bundle\") pod \"aodh-0\" (UID: \"a27645af-4d4a-4a73-ba8a-488a9ae199ac\") " pod="openstack/aodh-0" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.759761 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a27645af-4d4a-4a73-ba8a-488a9ae199ac-public-tls-certs\") pod \"aodh-0\" (UID: \"a27645af-4d4a-4a73-ba8a-488a9ae199ac\") " pod="openstack/aodh-0" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.759800 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a27645af-4d4a-4a73-ba8a-488a9ae199ac-config-data\") pod \"aodh-0\" (UID: \"a27645af-4d4a-4a73-ba8a-488a9ae199ac\") " pod="openstack/aodh-0" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.759832 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a27645af-4d4a-4a73-ba8a-488a9ae199ac-scripts\") pod \"aodh-0\" (UID: \"a27645af-4d4a-4a73-ba8a-488a9ae199ac\") " pod="openstack/aodh-0" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.760095 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a27645af-4d4a-4a73-ba8a-488a9ae199ac-internal-tls-certs\") pod \"aodh-0\" (UID: \"a27645af-4d4a-4a73-ba8a-488a9ae199ac\") " pod="openstack/aodh-0" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.760169 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2vv6\" (UniqueName: \"kubernetes.io/projected/a27645af-4d4a-4a73-ba8a-488a9ae199ac-kube-api-access-q2vv6\") pod \"aodh-0\" (UID: \"a27645af-4d4a-4a73-ba8a-488a9ae199ac\") " pod="openstack/aodh-0" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.774088 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88246540-ca61-4fb0-8934-c8ebb4559860" path="/var/lib/kubelet/pods/88246540-ca61-4fb0-8934-c8ebb4559860/volumes" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.862606 4898 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a27645af-4d4a-4a73-ba8a-488a9ae199ac-internal-tls-certs\") pod \"aodh-0\" (UID: \"a27645af-4d4a-4a73-ba8a-488a9ae199ac\") " pod="openstack/aodh-0" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.862695 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2vv6\" (UniqueName: \"kubernetes.io/projected/a27645af-4d4a-4a73-ba8a-488a9ae199ac-kube-api-access-q2vv6\") pod \"aodh-0\" (UID: \"a27645af-4d4a-4a73-ba8a-488a9ae199ac\") " pod="openstack/aodh-0" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.862767 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a27645af-4d4a-4a73-ba8a-488a9ae199ac-combined-ca-bundle\") pod \"aodh-0\" (UID: \"a27645af-4d4a-4a73-ba8a-488a9ae199ac\") " pod="openstack/aodh-0" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.862789 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a27645af-4d4a-4a73-ba8a-488a9ae199ac-public-tls-certs\") pod \"aodh-0\" (UID: \"a27645af-4d4a-4a73-ba8a-488a9ae199ac\") " pod="openstack/aodh-0" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.862812 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a27645af-4d4a-4a73-ba8a-488a9ae199ac-config-data\") pod \"aodh-0\" (UID: \"a27645af-4d4a-4a73-ba8a-488a9ae199ac\") " pod="openstack/aodh-0" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.862828 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a27645af-4d4a-4a73-ba8a-488a9ae199ac-scripts\") pod \"aodh-0\" (UID: \"a27645af-4d4a-4a73-ba8a-488a9ae199ac\") " pod="openstack/aodh-0" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.866486 
4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a27645af-4d4a-4a73-ba8a-488a9ae199ac-public-tls-certs\") pod \"aodh-0\" (UID: \"a27645af-4d4a-4a73-ba8a-488a9ae199ac\") " pod="openstack/aodh-0" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.867608 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a27645af-4d4a-4a73-ba8a-488a9ae199ac-scripts\") pod \"aodh-0\" (UID: \"a27645af-4d4a-4a73-ba8a-488a9ae199ac\") " pod="openstack/aodh-0" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.867682 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a27645af-4d4a-4a73-ba8a-488a9ae199ac-config-data\") pod \"aodh-0\" (UID: \"a27645af-4d4a-4a73-ba8a-488a9ae199ac\") " pod="openstack/aodh-0" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.866510 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a27645af-4d4a-4a73-ba8a-488a9ae199ac-combined-ca-bundle\") pod \"aodh-0\" (UID: \"a27645af-4d4a-4a73-ba8a-488a9ae199ac\") " pod="openstack/aodh-0" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.868587 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a27645af-4d4a-4a73-ba8a-488a9ae199ac-internal-tls-certs\") pod \"aodh-0\" (UID: \"a27645af-4d4a-4a73-ba8a-488a9ae199ac\") " pod="openstack/aodh-0" Mar 13 14:26:27 crc kubenswrapper[4898]: I0313 14:26:27.879471 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2vv6\" (UniqueName: \"kubernetes.io/projected/a27645af-4d4a-4a73-ba8a-488a9ae199ac-kube-api-access-q2vv6\") pod \"aodh-0\" (UID: \"a27645af-4d4a-4a73-ba8a-488a9ae199ac\") " pod="openstack/aodh-0" Mar 13 14:26:27 crc kubenswrapper[4898]: 
I0313 14:26:27.935026 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.541079 4898 generic.go:334] "Generic (PLEG): container finished" podID="818e3f41-30c4-4a49-b490-0d868fc2b2b8" containerID="122b0ca68068adfd3963153faa26d42ce9ae7ae836229a17a2096dab37be0af4" exitCode=0 Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.541154 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"818e3f41-30c4-4a49-b490-0d868fc2b2b8","Type":"ContainerDied","Data":"122b0ca68068adfd3963153faa26d42ce9ae7ae836229a17a2096dab37be0af4"} Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.571674 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 13 14:26:28 crc kubenswrapper[4898]: W0313 14:26:28.578268 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda27645af_4d4a_4a73_ba8a_488a9ae199ac.slice/crio-069caa01f29af2cd1825eb69d7e0a639ef9e88fb94efb0e624df33b776125cc4 WatchSource:0}: Error finding container 069caa01f29af2cd1825eb69d7e0a639ef9e88fb94efb0e624df33b776125cc4: Status 404 returned error can't find the container with id 069caa01f29af2cd1825eb69d7e0a639ef9e88fb94efb0e624df33b776125cc4 Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.802214 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.891137 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd\") pod \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.891202 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47mbl\" (UniqueName: \"kubernetes.io/projected/818e3f41-30c4-4a49-b490-0d868fc2b2b8-kube-api-access-47mbl\") pod \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.891228 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-confd\") pod \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.891288 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/818e3f41-30c4-4a49-b490-0d868fc2b2b8-plugins-conf\") pod \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.891363 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-erlang-cookie\") pod \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.891398 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-tls\") pod \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.891425 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/818e3f41-30c4-4a49-b490-0d868fc2b2b8-erlang-cookie-secret\") pod \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.891530 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/818e3f41-30c4-4a49-b490-0d868fc2b2b8-pod-info\") pod \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.891585 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/818e3f41-30c4-4a49-b490-0d868fc2b2b8-config-data\") pod \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.891655 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/818e3f41-30c4-4a49-b490-0d868fc2b2b8-server-conf\") pod \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.891725 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-plugins\") pod \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\" (UID: \"818e3f41-30c4-4a49-b490-0d868fc2b2b8\") " Mar 13 
14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.898702 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "818e3f41-30c4-4a49-b490-0d868fc2b2b8" (UID: "818e3f41-30c4-4a49-b490-0d868fc2b2b8"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.898772 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/818e3f41-30c4-4a49-b490-0d868fc2b2b8-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "818e3f41-30c4-4a49-b490-0d868fc2b2b8" (UID: "818e3f41-30c4-4a49-b490-0d868fc2b2b8"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.899150 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "818e3f41-30c4-4a49-b490-0d868fc2b2b8" (UID: "818e3f41-30c4-4a49-b490-0d868fc2b2b8"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.900513 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/818e3f41-30c4-4a49-b490-0d868fc2b2b8-kube-api-access-47mbl" (OuterVolumeSpecName: "kube-api-access-47mbl") pod "818e3f41-30c4-4a49-b490-0d868fc2b2b8" (UID: "818e3f41-30c4-4a49-b490-0d868fc2b2b8"). InnerVolumeSpecName "kube-api-access-47mbl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.909000 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "818e3f41-30c4-4a49-b490-0d868fc2b2b8" (UID: "818e3f41-30c4-4a49-b490-0d868fc2b2b8"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.921513 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/818e3f41-30c4-4a49-b490-0d868fc2b2b8-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "818e3f41-30c4-4a49-b490-0d868fc2b2b8" (UID: "818e3f41-30c4-4a49-b490-0d868fc2b2b8"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.923090 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/818e3f41-30c4-4a49-b490-0d868fc2b2b8-pod-info" (OuterVolumeSpecName: "pod-info") pod "818e3f41-30c4-4a49-b490-0d868fc2b2b8" (UID: "818e3f41-30c4-4a49-b490-0d868fc2b2b8"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.994738 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47mbl\" (UniqueName: \"kubernetes.io/projected/818e3f41-30c4-4a49-b490-0d868fc2b2b8-kube-api-access-47mbl\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.994765 4898 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/818e3f41-30c4-4a49-b490-0d868fc2b2b8-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.994775 4898 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.994784 4898 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.994791 4898 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/818e3f41-30c4-4a49-b490-0d868fc2b2b8-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.994799 4898 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/818e3f41-30c4-4a49-b490-0d868fc2b2b8-pod-info\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:28 crc kubenswrapper[4898]: I0313 14:26:28.994809 4898 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:29 crc 
kubenswrapper[4898]: I0313 14:26:29.000295 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd" (OuterVolumeSpecName: "persistence") pod "818e3f41-30c4-4a49-b490-0d868fc2b2b8" (UID: "818e3f41-30c4-4a49-b490-0d868fc2b2b8"). InnerVolumeSpecName "pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.035372 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/818e3f41-30c4-4a49-b490-0d868fc2b2b8-config-data" (OuterVolumeSpecName: "config-data") pod "818e3f41-30c4-4a49-b490-0d868fc2b2b8" (UID: "818e3f41-30c4-4a49-b490-0d868fc2b2b8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.060370 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/818e3f41-30c4-4a49-b490-0d868fc2b2b8-server-conf" (OuterVolumeSpecName: "server-conf") pod "818e3f41-30c4-4a49-b490-0d868fc2b2b8" (UID: "818e3f41-30c4-4a49-b490-0d868fc2b2b8"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.097025 4898 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd\") on node \"crc\" " Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.097058 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/818e3f41-30c4-4a49-b490-0d868fc2b2b8-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.097068 4898 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/818e3f41-30c4-4a49-b490-0d868fc2b2b8-server-conf\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.132052 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "818e3f41-30c4-4a49-b490-0d868fc2b2b8" (UID: "818e3f41-30c4-4a49-b490-0d868fc2b2b8"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.136212 4898 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.136378 4898 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd") on node "crc" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.198922 4898 reconciler_common.go:293] "Volume detached for volume \"pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.199045 4898 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/818e3f41-30c4-4a49-b490-0d868fc2b2b8-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.245304 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.557188 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs" event={"ID":"98336335-4b60-4ddf-8fe8-4ea6b69d47ef","Type":"ContainerStarted","Data":"f46cdf1ffe4a1df3b8f85aa80dd08fcc8e1c7e3fe707ecfa106895fc9d2db9c6"} Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.561167 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a27645af-4d4a-4a73-ba8a-488a9ae199ac","Type":"ContainerStarted","Data":"7f9b37fdf2eb88e248e4ae72f997700dd163658a6cd5a8a4795733e36e8a3376"} Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.561237 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a27645af-4d4a-4a73-ba8a-488a9ae199ac","Type":"ContainerStarted","Data":"069caa01f29af2cd1825eb69d7e0a639ef9e88fb94efb0e624df33b776125cc4"} Mar 13 14:26:29 crc 
kubenswrapper[4898]: I0313 14:26:29.568986 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"818e3f41-30c4-4a49-b490-0d868fc2b2b8","Type":"ContainerDied","Data":"6fe4bdbf2db945955ec1dd2e86e519172f05f6c43d7d6ac216668fd59e9bda42"} Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.569049 4898 scope.go:117] "RemoveContainer" containerID="122b0ca68068adfd3963153faa26d42ce9ae7ae836229a17a2096dab37be0af4" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.569059 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.587949 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs" podStartSLOduration=1.606927483 podStartE2EDuration="34.587922692s" podCreationTimestamp="2026-03-13 14:25:55 +0000 UTC" firstStartedPulling="2026-03-13 14:25:56.258951796 +0000 UTC m=+1791.260540045" lastFinishedPulling="2026-03-13 14:26:29.239947015 +0000 UTC m=+1824.241535254" observedRunningTime="2026-03-13 14:26:29.576836001 +0000 UTC m=+1824.578424240" watchObservedRunningTime="2026-03-13 14:26:29.587922692 +0000 UTC m=+1824.589510931" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.620633 4898 scope.go:117] "RemoveContainer" containerID="6ac94c751f27a4d12d02923377c883f4669b7b2f835e8c6d8eb98e37f2b620ef" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.636405 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.661922 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.681203 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-1"] Mar 13 14:26:29 crc kubenswrapper[4898]: E0313 14:26:29.682013 4898 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="818e3f41-30c4-4a49-b490-0d868fc2b2b8" containerName="rabbitmq" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.682124 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="818e3f41-30c4-4a49-b490-0d868fc2b2b8" containerName="rabbitmq" Mar 13 14:26:29 crc kubenswrapper[4898]: E0313 14:26:29.682199 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="818e3f41-30c4-4a49-b490-0d868fc2b2b8" containerName="setup-container" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.682258 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="818e3f41-30c4-4a49-b490-0d868fc2b2b8" containerName="setup-container" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.682670 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="818e3f41-30c4-4a49-b490-0d868fc2b2b8" containerName="rabbitmq" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.684161 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.705720 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.766309 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="818e3f41-30c4-4a49-b490-0d868fc2b2b8" path="/var/lib/kubelet/pods/818e3f41-30c4-4a49-b490-0d868fc2b2b8/volumes" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.812556 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ec19264c-1313-492d-b59b-4e5916b988f5-config-data\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.812596 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drcbz\" (UniqueName: \"kubernetes.io/projected/ec19264c-1313-492d-b59b-4e5916b988f5-kube-api-access-drcbz\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.812629 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ec19264c-1313-492d-b59b-4e5916b988f5-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.812711 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ec19264c-1313-492d-b59b-4e5916b988f5-server-conf\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " 
pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.812762 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ec19264c-1313-492d-b59b-4e5916b988f5-pod-info\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.812779 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ec19264c-1313-492d-b59b-4e5916b988f5-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.812813 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ec19264c-1313-492d-b59b-4e5916b988f5-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.812866 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ec19264c-1313-492d-b59b-4e5916b988f5-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.812908 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ec19264c-1313-492d-b59b-4e5916b988f5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc 
kubenswrapper[4898]: I0313 14:26:29.812925 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ec19264c-1313-492d-b59b-4e5916b988f5-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.812947 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.916978 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ec19264c-1313-492d-b59b-4e5916b988f5-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.917107 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ec19264c-1313-492d-b59b-4e5916b988f5-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.917165 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ec19264c-1313-492d-b59b-4e5916b988f5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.917193 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ec19264c-1313-492d-b59b-4e5916b988f5-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.917228 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.917352 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ec19264c-1313-492d-b59b-4e5916b988f5-config-data\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.917377 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drcbz\" (UniqueName: \"kubernetes.io/projected/ec19264c-1313-492d-b59b-4e5916b988f5-kube-api-access-drcbz\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.917416 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ec19264c-1313-492d-b59b-4e5916b988f5-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.917504 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/ec19264c-1313-492d-b59b-4e5916b988f5-server-conf\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.917589 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ec19264c-1313-492d-b59b-4e5916b988f5-pod-info\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.917614 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ec19264c-1313-492d-b59b-4e5916b988f5-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.918106 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ec19264c-1313-492d-b59b-4e5916b988f5-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.918852 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ec19264c-1313-492d-b59b-4e5916b988f5-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.919966 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ec19264c-1313-492d-b59b-4e5916b988f5-server-conf\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 
14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.921777 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ec19264c-1313-492d-b59b-4e5916b988f5-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.921851 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ec19264c-1313-492d-b59b-4e5916b988f5-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.924473 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ec19264c-1313-492d-b59b-4e5916b988f5-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.928764 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ec19264c-1313-492d-b59b-4e5916b988f5-pod-info\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.930091 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ec19264c-1313-492d-b59b-4e5916b988f5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.930619 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/ec19264c-1313-492d-b59b-4e5916b988f5-config-data\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.937172 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.937201 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/33865dbdc5fe61694c30892e6300309b59f04bdd0b35aa3fd0f17da3ba922194/globalmount\"" pod="openstack/rabbitmq-server-1" Mar 13 14:26:29 crc kubenswrapper[4898]: I0313 14:26:29.938002 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drcbz\" (UniqueName: \"kubernetes.io/projected/ec19264c-1313-492d-b59b-4e5916b988f5-kube-api-access-drcbz\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:30 crc kubenswrapper[4898]: I0313 14:26:30.017222 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a6f1f63f-28ac-4fb1-bd87-ea037a28d6cd\") pod \"rabbitmq-server-1\" (UID: \"ec19264c-1313-492d-b59b-4e5916b988f5\") " pod="openstack/rabbitmq-server-1" Mar 13 14:26:30 crc kubenswrapper[4898]: I0313 14:26:30.033234 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 13 14:26:30 crc kubenswrapper[4898]: I0313 14:26:30.817572 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 13 14:26:31 crc kubenswrapper[4898]: I0313 14:26:31.604512 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a27645af-4d4a-4a73-ba8a-488a9ae199ac","Type":"ContainerStarted","Data":"1d5290d82c438f5f18f9057d3b331c8974b7eacb9a57fbdafce15f3e7ff99476"} Mar 13 14:26:31 crc kubenswrapper[4898]: I0313 14:26:31.607504 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"ec19264c-1313-492d-b59b-4e5916b988f5","Type":"ContainerStarted","Data":"d99fbf55656e2524d19167512de0b192f6409ba22350297a7849a541960322f8"} Mar 13 14:26:32 crc kubenswrapper[4898]: I0313 14:26:32.622716 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a27645af-4d4a-4a73-ba8a-488a9ae199ac","Type":"ContainerStarted","Data":"d0820d7a1af073092965965e086611242e5d1849b11f6c9bdac24f9d8c8f5a45"} Mar 13 14:26:33 crc kubenswrapper[4898]: I0313 14:26:33.639876 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"ec19264c-1313-492d-b59b-4e5916b988f5","Type":"ContainerStarted","Data":"20f2f7b753b1aa62a0a0192986eb7c604c4b52e002e15ffc2518d76b86a4ad34"} Mar 13 14:26:33 crc kubenswrapper[4898]: I0313 14:26:33.649430 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a27645af-4d4a-4a73-ba8a-488a9ae199ac","Type":"ContainerStarted","Data":"bf5cc7bd1cfdfd09db9bfd9a3a36f84d9ea243b38d5957b5e6359606cf63cae2"} Mar 13 14:26:33 crc kubenswrapper[4898]: I0313 14:26:33.690983 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.169570725 podStartE2EDuration="6.690878715s" podCreationTimestamp="2026-03-13 14:26:27 +0000 UTC" 
firstStartedPulling="2026-03-13 14:26:28.583080888 +0000 UTC m=+1823.584669127" lastFinishedPulling="2026-03-13 14:26:33.104388878 +0000 UTC m=+1828.105977117" observedRunningTime="2026-03-13 14:26:33.690096414 +0000 UTC m=+1828.691684673" watchObservedRunningTime="2026-03-13 14:26:33.690878715 +0000 UTC m=+1828.692466954" Mar 13 14:26:35 crc kubenswrapper[4898]: I0313 14:26:35.139809 4898 scope.go:117] "RemoveContainer" containerID="c91cc1f40aaa9775749654fff4fe79567271db256673b0ded8ef7bcbbed0be52" Mar 13 14:26:35 crc kubenswrapper[4898]: I0313 14:26:35.290161 4898 scope.go:117] "RemoveContainer" containerID="2dec706ec3d47e7f4d03ac7b64859e218da33cd45852b8891c58c2ae0bd97657" Mar 13 14:26:35 crc kubenswrapper[4898]: I0313 14:26:35.405301 4898 scope.go:117] "RemoveContainer" containerID="ae2833d21b214266cf912f562e6ce0013ae2d27327887e2c914fb19e35f69b2e" Mar 13 14:26:35 crc kubenswrapper[4898]: I0313 14:26:35.486008 4898 scope.go:117] "RemoveContainer" containerID="33dd2d6e0ac7d2f137fe32246deb8758bfab8a7e6e24808a6205586e1001969c" Mar 13 14:26:35 crc kubenswrapper[4898]: I0313 14:26:35.527437 4898 scope.go:117] "RemoveContainer" containerID="36c5ec42fae468ab48602c979ddc403d899fdca66a51026d30aea73028cc2339" Mar 13 14:26:35 crc kubenswrapper[4898]: I0313 14:26:35.606179 4898 scope.go:117] "RemoveContainer" containerID="41174a57eeced7f987ec1ea67f79e818399b7ca23c9e404468462af2e7f7d393" Mar 13 14:26:38 crc kubenswrapper[4898]: I0313 14:26:38.740211 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" Mar 13 14:26:38 crc kubenswrapper[4898]: E0313 14:26:38.741506 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:26:42 crc kubenswrapper[4898]: I0313 14:26:42.776801 4898 generic.go:334] "Generic (PLEG): container finished" podID="98336335-4b60-4ddf-8fe8-4ea6b69d47ef" containerID="f46cdf1ffe4a1df3b8f85aa80dd08fcc8e1c7e3fe707ecfa106895fc9d2db9c6" exitCode=0 Mar 13 14:26:42 crc kubenswrapper[4898]: I0313 14:26:42.776886 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs" event={"ID":"98336335-4b60-4ddf-8fe8-4ea6b69d47ef","Type":"ContainerDied","Data":"f46cdf1ffe4a1df3b8f85aa80dd08fcc8e1c7e3fe707ecfa106895fc9d2db9c6"} Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.406023 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs" Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.500868 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-repo-setup-combined-ca-bundle\") pod \"98336335-4b60-4ddf-8fe8-4ea6b69d47ef\" (UID: \"98336335-4b60-4ddf-8fe8-4ea6b69d47ef\") " Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.500964 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-inventory\") pod \"98336335-4b60-4ddf-8fe8-4ea6b69d47ef\" (UID: \"98336335-4b60-4ddf-8fe8-4ea6b69d47ef\") " Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.500987 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-ssh-key-openstack-edpm-ipam\") pod \"98336335-4b60-4ddf-8fe8-4ea6b69d47ef\" (UID: 
\"98336335-4b60-4ddf-8fe8-4ea6b69d47ef\") " Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.501013 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ff55\" (UniqueName: \"kubernetes.io/projected/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-kube-api-access-7ff55\") pod \"98336335-4b60-4ddf-8fe8-4ea6b69d47ef\" (UID: \"98336335-4b60-4ddf-8fe8-4ea6b69d47ef\") " Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.508032 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-kube-api-access-7ff55" (OuterVolumeSpecName: "kube-api-access-7ff55") pod "98336335-4b60-4ddf-8fe8-4ea6b69d47ef" (UID: "98336335-4b60-4ddf-8fe8-4ea6b69d47ef"). InnerVolumeSpecName "kube-api-access-7ff55". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.508266 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "98336335-4b60-4ddf-8fe8-4ea6b69d47ef" (UID: "98336335-4b60-4ddf-8fe8-4ea6b69d47ef"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.544258 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-inventory" (OuterVolumeSpecName: "inventory") pod "98336335-4b60-4ddf-8fe8-4ea6b69d47ef" (UID: "98336335-4b60-4ddf-8fe8-4ea6b69d47ef"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.552375 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "98336335-4b60-4ddf-8fe8-4ea6b69d47ef" (UID: "98336335-4b60-4ddf-8fe8-4ea6b69d47ef"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.604994 4898 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.605045 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.605061 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.605077 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ff55\" (UniqueName: \"kubernetes.io/projected/98336335-4b60-4ddf-8fe8-4ea6b69d47ef-kube-api-access-7ff55\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.805553 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs" event={"ID":"98336335-4b60-4ddf-8fe8-4ea6b69d47ef","Type":"ContainerDied","Data":"6844f40cb1716e51fac4b5bf0efd698139e685b8b7f6778ffb18ee6dd869ba69"} Mar 13 
14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.805618 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6844f40cb1716e51fac4b5bf0efd698139e685b8b7f6778ffb18ee6dd869ba69" Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.806130 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs" Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.897625 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-g6vnq"] Mar 13 14:26:44 crc kubenswrapper[4898]: E0313 14:26:44.898441 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98336335-4b60-4ddf-8fe8-4ea6b69d47ef" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.898475 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="98336335-4b60-4ddf-8fe8-4ea6b69d47ef" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.899011 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="98336335-4b60-4ddf-8fe8-4ea6b69d47ef" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.900311 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g6vnq" Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.902675 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.903557 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.903785 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zsddr" Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.903861 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 14:26:44 crc kubenswrapper[4898]: I0313 14:26:44.914999 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-g6vnq"] Mar 13 14:26:45 crc kubenswrapper[4898]: I0313 14:26:45.018280 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrmdq\" (UniqueName: \"kubernetes.io/projected/6329b434-b1be-4490-9a50-351366b18d79-kube-api-access-nrmdq\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-g6vnq\" (UID: \"6329b434-b1be-4490-9a50-351366b18d79\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g6vnq" Mar 13 14:26:45 crc kubenswrapper[4898]: I0313 14:26:45.018371 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6329b434-b1be-4490-9a50-351366b18d79-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-g6vnq\" (UID: \"6329b434-b1be-4490-9a50-351366b18d79\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g6vnq" Mar 13 14:26:45 crc kubenswrapper[4898]: I0313 14:26:45.018977 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6329b434-b1be-4490-9a50-351366b18d79-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-g6vnq\" (UID: \"6329b434-b1be-4490-9a50-351366b18d79\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g6vnq" Mar 13 14:26:45 crc kubenswrapper[4898]: I0313 14:26:45.122376 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6329b434-b1be-4490-9a50-351366b18d79-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-g6vnq\" (UID: \"6329b434-b1be-4490-9a50-351366b18d79\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g6vnq" Mar 13 14:26:45 crc kubenswrapper[4898]: I0313 14:26:45.122650 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrmdq\" (UniqueName: \"kubernetes.io/projected/6329b434-b1be-4490-9a50-351366b18d79-kube-api-access-nrmdq\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-g6vnq\" (UID: \"6329b434-b1be-4490-9a50-351366b18d79\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g6vnq" Mar 13 14:26:45 crc kubenswrapper[4898]: I0313 14:26:45.122775 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6329b434-b1be-4490-9a50-351366b18d79-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-g6vnq\" (UID: \"6329b434-b1be-4490-9a50-351366b18d79\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g6vnq" Mar 13 14:26:45 crc kubenswrapper[4898]: I0313 14:26:45.127104 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6329b434-b1be-4490-9a50-351366b18d79-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-g6vnq\" (UID: 
\"6329b434-b1be-4490-9a50-351366b18d79\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g6vnq" Mar 13 14:26:45 crc kubenswrapper[4898]: I0313 14:26:45.127550 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6329b434-b1be-4490-9a50-351366b18d79-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-g6vnq\" (UID: \"6329b434-b1be-4490-9a50-351366b18d79\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g6vnq" Mar 13 14:26:45 crc kubenswrapper[4898]: I0313 14:26:45.153500 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrmdq\" (UniqueName: \"kubernetes.io/projected/6329b434-b1be-4490-9a50-351366b18d79-kube-api-access-nrmdq\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-g6vnq\" (UID: \"6329b434-b1be-4490-9a50-351366b18d79\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g6vnq" Mar 13 14:26:45 crc kubenswrapper[4898]: I0313 14:26:45.253082 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g6vnq" Mar 13 14:26:45 crc kubenswrapper[4898]: I0313 14:26:45.870284 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-g6vnq"] Mar 13 14:26:46 crc kubenswrapper[4898]: I0313 14:26:46.836363 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g6vnq" event={"ID":"6329b434-b1be-4490-9a50-351366b18d79","Type":"ContainerStarted","Data":"aa01dc9810d534e87b07f8107f67665e2f7fd6f556aa8ce55e8be5e8b95511ec"} Mar 13 14:26:46 crc kubenswrapper[4898]: I0313 14:26:46.836819 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g6vnq" event={"ID":"6329b434-b1be-4490-9a50-351366b18d79","Type":"ContainerStarted","Data":"493fe35f64e9adc97e419785daa4c7d010345822b39f4ae16c4767f16e2efad0"} Mar 13 14:26:46 crc kubenswrapper[4898]: I0313 14:26:46.857338 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g6vnq" podStartSLOduration=2.3750855189999998 podStartE2EDuration="2.857317846s" podCreationTimestamp="2026-03-13 14:26:44 +0000 UTC" firstStartedPulling="2026-03-13 14:26:45.882757237 +0000 UTC m=+1840.884345476" lastFinishedPulling="2026-03-13 14:26:46.364989554 +0000 UTC m=+1841.366577803" observedRunningTime="2026-03-13 14:26:46.852648143 +0000 UTC m=+1841.854236402" watchObservedRunningTime="2026-03-13 14:26:46.857317846 +0000 UTC m=+1841.858906085" Mar 13 14:26:49 crc kubenswrapper[4898]: I0313 14:26:49.887427 4898 generic.go:334] "Generic (PLEG): container finished" podID="6329b434-b1be-4490-9a50-351366b18d79" containerID="aa01dc9810d534e87b07f8107f67665e2f7fd6f556aa8ce55e8be5e8b95511ec" exitCode=0 Mar 13 14:26:49 crc kubenswrapper[4898]: I0313 14:26:49.888073 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g6vnq" event={"ID":"6329b434-b1be-4490-9a50-351366b18d79","Type":"ContainerDied","Data":"aa01dc9810d534e87b07f8107f67665e2f7fd6f556aa8ce55e8be5e8b95511ec"} Mar 13 14:26:51 crc kubenswrapper[4898]: I0313 14:26:51.477959 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g6vnq" Mar 13 14:26:51 crc kubenswrapper[4898]: I0313 14:26:51.623756 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6329b434-b1be-4490-9a50-351366b18d79-ssh-key-openstack-edpm-ipam\") pod \"6329b434-b1be-4490-9a50-351366b18d79\" (UID: \"6329b434-b1be-4490-9a50-351366b18d79\") " Mar 13 14:26:51 crc kubenswrapper[4898]: I0313 14:26:51.623928 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6329b434-b1be-4490-9a50-351366b18d79-inventory\") pod \"6329b434-b1be-4490-9a50-351366b18d79\" (UID: \"6329b434-b1be-4490-9a50-351366b18d79\") " Mar 13 14:26:51 crc kubenswrapper[4898]: I0313 14:26:51.624000 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrmdq\" (UniqueName: \"kubernetes.io/projected/6329b434-b1be-4490-9a50-351366b18d79-kube-api-access-nrmdq\") pod \"6329b434-b1be-4490-9a50-351366b18d79\" (UID: \"6329b434-b1be-4490-9a50-351366b18d79\") " Mar 13 14:26:51 crc kubenswrapper[4898]: I0313 14:26:51.630438 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6329b434-b1be-4490-9a50-351366b18d79-kube-api-access-nrmdq" (OuterVolumeSpecName: "kube-api-access-nrmdq") pod "6329b434-b1be-4490-9a50-351366b18d79" (UID: "6329b434-b1be-4490-9a50-351366b18d79"). InnerVolumeSpecName "kube-api-access-nrmdq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:26:51 crc kubenswrapper[4898]: I0313 14:26:51.655853 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6329b434-b1be-4490-9a50-351366b18d79-inventory" (OuterVolumeSpecName: "inventory") pod "6329b434-b1be-4490-9a50-351366b18d79" (UID: "6329b434-b1be-4490-9a50-351366b18d79"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:26:51 crc kubenswrapper[4898]: I0313 14:26:51.656244 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6329b434-b1be-4490-9a50-351366b18d79-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6329b434-b1be-4490-9a50-351366b18d79" (UID: "6329b434-b1be-4490-9a50-351366b18d79"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:26:51 crc kubenswrapper[4898]: I0313 14:26:51.728077 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6329b434-b1be-4490-9a50-351366b18d79-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:51 crc kubenswrapper[4898]: I0313 14:26:51.728140 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6329b434-b1be-4490-9a50-351366b18d79-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:51 crc kubenswrapper[4898]: I0313 14:26:51.728159 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrmdq\" (UniqueName: \"kubernetes.io/projected/6329b434-b1be-4490-9a50-351366b18d79-kube-api-access-nrmdq\") on node \"crc\" DevicePath \"\"" Mar 13 14:26:51 crc kubenswrapper[4898]: I0313 14:26:51.740209 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" Mar 13 14:26:51 crc kubenswrapper[4898]: 
E0313 14:26:51.740883 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:26:51 crc kubenswrapper[4898]: I0313 14:26:51.912455 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g6vnq" event={"ID":"6329b434-b1be-4490-9a50-351366b18d79","Type":"ContainerDied","Data":"493fe35f64e9adc97e419785daa4c7d010345822b39f4ae16c4767f16e2efad0"} Mar 13 14:26:51 crc kubenswrapper[4898]: I0313 14:26:51.912979 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="493fe35f64e9adc97e419785daa4c7d010345822b39f4ae16c4767f16e2efad0" Mar 13 14:26:51 crc kubenswrapper[4898]: I0313 14:26:51.912541 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-g6vnq" Mar 13 14:26:51 crc kubenswrapper[4898]: I0313 14:26:51.996889 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf"] Mar 13 14:26:51 crc kubenswrapper[4898]: E0313 14:26:51.997775 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6329b434-b1be-4490-9a50-351366b18d79" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 13 14:26:51 crc kubenswrapper[4898]: I0313 14:26:51.997927 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="6329b434-b1be-4490-9a50-351366b18d79" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 13 14:26:51 crc kubenswrapper[4898]: I0313 14:26:51.998367 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="6329b434-b1be-4490-9a50-351366b18d79" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 13 14:26:51 crc kubenswrapper[4898]: I0313 14:26:51.999503 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf" Mar 13 14:26:52 crc kubenswrapper[4898]: I0313 14:26:52.002806 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 14:26:52 crc kubenswrapper[4898]: I0313 14:26:52.003121 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 14:26:52 crc kubenswrapper[4898]: I0313 14:26:52.004201 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zsddr" Mar 13 14:26:52 crc kubenswrapper[4898]: I0313 14:26:52.004463 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 14:26:52 crc kubenswrapper[4898]: I0313 14:26:52.014773 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf"] Mar 13 14:26:52 crc kubenswrapper[4898]: I0313 14:26:52.145516 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs68d\" (UniqueName: \"kubernetes.io/projected/6d8bbc5a-39da-48b8-82d1-6df496fda612-kube-api-access-cs68d\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf\" (UID: \"6d8bbc5a-39da-48b8-82d1-6df496fda612\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf" Mar 13 14:26:52 crc kubenswrapper[4898]: I0313 14:26:52.145605 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6d8bbc5a-39da-48b8-82d1-6df496fda612-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf\" (UID: \"6d8bbc5a-39da-48b8-82d1-6df496fda612\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf" Mar 13 14:26:52 crc kubenswrapper[4898]: I0313 
14:26:52.145751 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d8bbc5a-39da-48b8-82d1-6df496fda612-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf\" (UID: \"6d8bbc5a-39da-48b8-82d1-6df496fda612\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf" Mar 13 14:26:52 crc kubenswrapper[4898]: I0313 14:26:52.145775 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d8bbc5a-39da-48b8-82d1-6df496fda612-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf\" (UID: \"6d8bbc5a-39da-48b8-82d1-6df496fda612\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf" Mar 13 14:26:52 crc kubenswrapper[4898]: I0313 14:26:52.249198 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6d8bbc5a-39da-48b8-82d1-6df496fda612-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf\" (UID: \"6d8bbc5a-39da-48b8-82d1-6df496fda612\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf" Mar 13 14:26:52 crc kubenswrapper[4898]: I0313 14:26:52.249363 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d8bbc5a-39da-48b8-82d1-6df496fda612-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf\" (UID: \"6d8bbc5a-39da-48b8-82d1-6df496fda612\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf" Mar 13 14:26:52 crc kubenswrapper[4898]: I0313 14:26:52.249391 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6d8bbc5a-39da-48b8-82d1-6df496fda612-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf\" (UID: \"6d8bbc5a-39da-48b8-82d1-6df496fda612\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf" Mar 13 14:26:52 crc kubenswrapper[4898]: I0313 14:26:52.249486 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cs68d\" (UniqueName: \"kubernetes.io/projected/6d8bbc5a-39da-48b8-82d1-6df496fda612-kube-api-access-cs68d\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf\" (UID: \"6d8bbc5a-39da-48b8-82d1-6df496fda612\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf" Mar 13 14:26:52 crc kubenswrapper[4898]: I0313 14:26:52.256073 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d8bbc5a-39da-48b8-82d1-6df496fda612-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf\" (UID: \"6d8bbc5a-39da-48b8-82d1-6df496fda612\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf" Mar 13 14:26:52 crc kubenswrapper[4898]: I0313 14:26:52.258732 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6d8bbc5a-39da-48b8-82d1-6df496fda612-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf\" (UID: \"6d8bbc5a-39da-48b8-82d1-6df496fda612\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf" Mar 13 14:26:52 crc kubenswrapper[4898]: I0313 14:26:52.264023 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d8bbc5a-39da-48b8-82d1-6df496fda612-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf\" (UID: \"6d8bbc5a-39da-48b8-82d1-6df496fda612\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf" Mar 13 14:26:52 crc kubenswrapper[4898]: I0313 14:26:52.269256 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cs68d\" (UniqueName: \"kubernetes.io/projected/6d8bbc5a-39da-48b8-82d1-6df496fda612-kube-api-access-cs68d\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf\" (UID: \"6d8bbc5a-39da-48b8-82d1-6df496fda612\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf" Mar 13 14:26:52 crc kubenswrapper[4898]: I0313 14:26:52.330360 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf" Mar 13 14:26:52 crc kubenswrapper[4898]: I0313 14:26:52.912458 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf"] Mar 13 14:26:52 crc kubenswrapper[4898]: W0313 14:26:52.923177 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d8bbc5a_39da_48b8_82d1_6df496fda612.slice/crio-8b67963947b518457f04c307420f088501a95221935f61209e26215c56765170 WatchSource:0}: Error finding container 8b67963947b518457f04c307420f088501a95221935f61209e26215c56765170: Status 404 returned error can't find the container with id 8b67963947b518457f04c307420f088501a95221935f61209e26215c56765170 Mar 13 14:26:53 crc kubenswrapper[4898]: I0313 14:26:53.937384 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf" event={"ID":"6d8bbc5a-39da-48b8-82d1-6df496fda612","Type":"ContainerStarted","Data":"c31d294d1bee94e3955b57cdf27c7683e29803f38822e314dcaef8a3865b3cc6"} Mar 13 14:26:53 crc kubenswrapper[4898]: I0313 14:26:53.937918 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf" 
event={"ID":"6d8bbc5a-39da-48b8-82d1-6df496fda612","Type":"ContainerStarted","Data":"8b67963947b518457f04c307420f088501a95221935f61209e26215c56765170"} Mar 13 14:26:53 crc kubenswrapper[4898]: I0313 14:26:53.961402 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf" podStartSLOduration=2.508694848 podStartE2EDuration="2.961378787s" podCreationTimestamp="2026-03-13 14:26:51 +0000 UTC" firstStartedPulling="2026-03-13 14:26:52.929967245 +0000 UTC m=+1847.931555484" lastFinishedPulling="2026-03-13 14:26:53.382651174 +0000 UTC m=+1848.384239423" observedRunningTime="2026-03-13 14:26:53.951864887 +0000 UTC m=+1848.953453126" watchObservedRunningTime="2026-03-13 14:26:53.961378787 +0000 UTC m=+1848.962967026" Mar 13 14:27:05 crc kubenswrapper[4898]: I0313 14:27:05.098336 4898 generic.go:334] "Generic (PLEG): container finished" podID="ec19264c-1313-492d-b59b-4e5916b988f5" containerID="20f2f7b753b1aa62a0a0192986eb7c604c4b52e002e15ffc2518d76b86a4ad34" exitCode=0 Mar 13 14:27:05 crc kubenswrapper[4898]: I0313 14:27:05.098465 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"ec19264c-1313-492d-b59b-4e5916b988f5","Type":"ContainerDied","Data":"20f2f7b753b1aa62a0a0192986eb7c604c4b52e002e15ffc2518d76b86a4ad34"} Mar 13 14:27:06 crc kubenswrapper[4898]: I0313 14:27:06.114644 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"ec19264c-1313-492d-b59b-4e5916b988f5","Type":"ContainerStarted","Data":"65f80466add157bce9629bf6cef532d4e04d34dc2f2d0e6d38d22a1f8fc585ec"} Mar 13 14:27:06 crc kubenswrapper[4898]: I0313 14:27:06.115178 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-1" Mar 13 14:27:06 crc kubenswrapper[4898]: I0313 14:27:06.151056 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-1" 
podStartSLOduration=37.151034723 podStartE2EDuration="37.151034723s" podCreationTimestamp="2026-03-13 14:26:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:27:06.142885468 +0000 UTC m=+1861.144473737" watchObservedRunningTime="2026-03-13 14:27:06.151034723 +0000 UTC m=+1861.152622962" Mar 13 14:27:06 crc kubenswrapper[4898]: I0313 14:27:06.740107 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" Mar 13 14:27:06 crc kubenswrapper[4898]: E0313 14:27:06.740765 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:27:20 crc kubenswrapper[4898]: I0313 14:27:20.037190 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-1" Mar 13 14:27:20 crc kubenswrapper[4898]: I0313 14:27:20.147042 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 14:27:20 crc kubenswrapper[4898]: I0313 14:27:20.739305 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" Mar 13 14:27:21 crc kubenswrapper[4898]: I0313 14:27:21.335384 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"544b53b5cdd0293005863b343628de53b83869ce5cd2c798b19c01abba2b5bc8"} Mar 13 14:27:24 crc kubenswrapper[4898]: I0313 14:27:24.695294 4898 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" containerName="rabbitmq" containerID="cri-o://1483af6a8a31cc8d457a629a4f59e975d8c4af4c9fe53636f469b1c72aabc655" gracePeriod=604796 Mar 13 14:27:27 crc kubenswrapper[4898]: I0313 14:27:27.130264 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.134:5671: connect: connection refused" Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.463192 4898 generic.go:334] "Generic (PLEG): container finished" podID="0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" containerID="1483af6a8a31cc8d457a629a4f59e975d8c4af4c9fe53636f469b1c72aabc655" exitCode=0 Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.463779 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b","Type":"ContainerDied","Data":"1483af6a8a31cc8d457a629a4f59e975d8c4af4c9fe53636f469b1c72aabc655"} Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.463856 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b","Type":"ContainerDied","Data":"6d987febc3dce3076fcb022da351b81dbf39766acfe2182896067f106a438fc2"} Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.463867 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d987febc3dce3076fcb022da351b81dbf39766acfe2182896067f106a438fc2" Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.466531 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.511889 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-config-data\") pod \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.513114 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-96b86561-77fa-478a-bf61-f7beca9d80fe\") pod \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.513270 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-plugins\") pod \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.513302 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8xbw\" (UniqueName: \"kubernetes.io/projected/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-kube-api-access-q8xbw\") pod \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.513335 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-erlang-cookie-secret\") pod \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.513399 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-tls\") pod \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.513459 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-erlang-cookie\") pod \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.513499 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-plugins-conf\") pod \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.513532 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-pod-info\") pod \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.513590 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-confd\") pod \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.513734 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-server-conf\") pod \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\" (UID: \"0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b\") " Mar 13 14:27:31 
crc kubenswrapper[4898]: I0313 14:27:31.514655 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" (UID: "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.514940 4898 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.515300 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" (UID: "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.516501 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" (UID: "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.523080 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-pod-info" (OuterVolumeSpecName: "pod-info") pod "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" (UID: "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.529664 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" (UID: "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.544069 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-kube-api-access-q8xbw" (OuterVolumeSpecName: "kube-api-access-q8xbw") pod "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" (UID: "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b"). InnerVolumeSpecName "kube-api-access-q8xbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.544747 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" (UID: "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.566706 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-96b86561-77fa-478a-bf61-f7beca9d80fe" (OuterVolumeSpecName: "persistence") pod "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" (UID: "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b"). InnerVolumeSpecName "pvc-96b86561-77fa-478a-bf61-f7beca9d80fe". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.591469 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-config-data" (OuterVolumeSpecName: "config-data") pod "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" (UID: "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.617163 4898 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.617197 4898 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-pod-info\") on node \"crc\" DevicePath \"\"" Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.617208 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.617238 4898 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-96b86561-77fa-478a-bf61-f7beca9d80fe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-96b86561-77fa-478a-bf61-f7beca9d80fe\") on node \"crc\" " Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.617249 4898 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.617260 4898 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-q8xbw\" (UniqueName: \"kubernetes.io/projected/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-kube-api-access-q8xbw\") on node \"crc\" DevicePath \"\"" Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.617269 4898 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.617278 4898 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.633560 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-server-conf" (OuterVolumeSpecName: "server-conf") pod "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" (UID: "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.664183 4898 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.664369 4898 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-96b86561-77fa-478a-bf61-f7beca9d80fe" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-96b86561-77fa-478a-bf61-f7beca9d80fe") on node "crc" Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.694858 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" (UID: "0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b"). 
InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.719780 4898 reconciler_common.go:293] "Volume detached for volume \"pvc-96b86561-77fa-478a-bf61-f7beca9d80fe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-96b86561-77fa-478a-bf61-f7beca9d80fe\") on node \"crc\" DevicePath \"\"" Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.719813 4898 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 13 14:27:31 crc kubenswrapper[4898]: I0313 14:27:31.719823 4898 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b-server-conf\") on node \"crc\" DevicePath \"\"" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.475129 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.569003 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.601511 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.619934 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 14:27:32 crc kubenswrapper[4898]: E0313 14:27:32.620653 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" containerName="rabbitmq" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.620677 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" containerName="rabbitmq" Mar 13 14:27:32 crc kubenswrapper[4898]: E0313 14:27:32.620726 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" containerName="setup-container" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.620736 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" containerName="setup-container" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.621047 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" containerName="rabbitmq" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.622590 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.661640 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.740579 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-config-data\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.740652 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.740686 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-96b86561-77fa-478a-bf61-f7beca9d80fe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-96b86561-77fa-478a-bf61-f7beca9d80fe\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.740727 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.740762 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.740864 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.740954 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.740978 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.740994 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.741010 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.741030 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km2c4\" (UniqueName: \"kubernetes.io/projected/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-kube-api-access-km2c4\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.842590 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.842699 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.842738 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.842753 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: 
\"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.842774 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.842790 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km2c4\" (UniqueName: \"kubernetes.io/projected/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-kube-api-access-km2c4\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.842885 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-config-data\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.843038 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.843067 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-96b86561-77fa-478a-bf61-f7beca9d80fe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-96b86561-77fa-478a-bf61-f7beca9d80fe\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc 
kubenswrapper[4898]: I0313 14:27:32.843090 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.843128 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.846987 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.847283 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-config-data\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.848062 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.848704 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.850710 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.851021 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.852029 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.853792 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.854836 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc 
kubenswrapper[4898]: I0313 14:27:32.855839 4898 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.855866 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-96b86561-77fa-478a-bf61-f7beca9d80fe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-96b86561-77fa-478a-bf61-f7beca9d80fe\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/235b7df56c251cb078c850d3b743a7085fdda6b090aa4cee8a1308b947278440/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.870626 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km2c4\" (UniqueName: \"kubernetes.io/projected/10c321a0-5ea5-4b5c-8695-1f7b2dcad32b-kube-api-access-km2c4\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.925025 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-96b86561-77fa-478a-bf61-f7beca9d80fe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-96b86561-77fa-478a-bf61-f7beca9d80fe\") pod \"rabbitmq-server-0\" (UID: \"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b\") " pod="openstack/rabbitmq-server-0" Mar 13 14:27:32 crc kubenswrapper[4898]: I0313 14:27:32.948500 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 14:27:33 crc kubenswrapper[4898]: I0313 14:27:33.536338 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 14:27:33 crc kubenswrapper[4898]: I0313 14:27:33.753351 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b" path="/var/lib/kubelet/pods/0fa7fb2f-de19-48b1-8226-d7f85a5f8f2b/volumes" Mar 13 14:27:34 crc kubenswrapper[4898]: I0313 14:27:34.507329 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b","Type":"ContainerStarted","Data":"05b22667549dcdbfdb3e670ac4505629368dda11f5bddf5b6fe16358c1ffbb17"} Mar 13 14:27:35 crc kubenswrapper[4898]: I0313 14:27:35.901140 4898 scope.go:117] "RemoveContainer" containerID="213db1fb491a2ed6d8dc3d15759978456b725ffef29e1be38661bf279db1daf8" Mar 13 14:27:35 crc kubenswrapper[4898]: I0313 14:27:35.938036 4898 scope.go:117] "RemoveContainer" containerID="6584acdbfa3b269b10be5eacbee652dc5b87853d5dd4647683e70850466d55d5" Mar 13 14:27:36 crc kubenswrapper[4898]: I0313 14:27:36.024112 4898 scope.go:117] "RemoveContainer" containerID="686c9d5260a554140660fb899d995f27d4b2bd420d76c710ada3057e3122cfaf" Mar 13 14:27:36 crc kubenswrapper[4898]: I0313 14:27:36.053451 4898 scope.go:117] "RemoveContainer" containerID="54431181e3eb3b359d0852277dd7b5d798e5ebd64616fc6928567613ce28f709" Mar 13 14:27:36 crc kubenswrapper[4898]: I0313 14:27:36.105818 4898 scope.go:117] "RemoveContainer" containerID="8e18090ad1757c0b15ba6a519121358ec8fea5c9816c6426d3dd165832b431af" Mar 13 14:27:36 crc kubenswrapper[4898]: I0313 14:27:36.139721 4898 scope.go:117] "RemoveContainer" containerID="e1e9eae3fcad1899b5e4ca6e0e525d3dd9661ab59290e1ba7355e6176cbc69f0" Mar 13 14:27:36 crc kubenswrapper[4898]: I0313 14:27:36.186861 4898 scope.go:117] "RemoveContainer" 
containerID="9dee453ab58c346884f20f75be928701596b9ae0bdbbb025979e1e2daaa907c1" Mar 13 14:27:36 crc kubenswrapper[4898]: I0313 14:27:36.232625 4898 scope.go:117] "RemoveContainer" containerID="1483af6a8a31cc8d457a629a4f59e975d8c4af4c9fe53636f469b1c72aabc655" Mar 13 14:27:36 crc kubenswrapper[4898]: I0313 14:27:36.256078 4898 scope.go:117] "RemoveContainer" containerID="8699a974ed046c0e546ec06ad74b2baeebb668b44968353911ae1775a60f7c87" Mar 13 14:27:36 crc kubenswrapper[4898]: I0313 14:27:36.538211 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b","Type":"ContainerStarted","Data":"816bc1a0bbcb90a875482c4dd5f17ddca4bce3587b6ba90f30bd3e5b12de16d2"} Mar 13 14:28:00 crc kubenswrapper[4898]: I0313 14:28:00.160888 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556868-9gt46"] Mar 13 14:28:00 crc kubenswrapper[4898]: I0313 14:28:00.164257 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556868-9gt46" Mar 13 14:28:00 crc kubenswrapper[4898]: I0313 14:28:00.170296 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:28:00 crc kubenswrapper[4898]: I0313 14:28:00.170395 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:28:00 crc kubenswrapper[4898]: I0313 14:28:00.170550 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:28:00 crc kubenswrapper[4898]: I0313 14:28:00.177549 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556868-9gt46"] Mar 13 14:28:00 crc kubenswrapper[4898]: I0313 14:28:00.201764 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkd24\" (UniqueName: \"kubernetes.io/projected/9565fbbb-2765-4ffb-a934-e5ddf9be1d17-kube-api-access-gkd24\") pod \"auto-csr-approver-29556868-9gt46\" (UID: \"9565fbbb-2765-4ffb-a934-e5ddf9be1d17\") " pod="openshift-infra/auto-csr-approver-29556868-9gt46" Mar 13 14:28:00 crc kubenswrapper[4898]: I0313 14:28:00.304295 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkd24\" (UniqueName: \"kubernetes.io/projected/9565fbbb-2765-4ffb-a934-e5ddf9be1d17-kube-api-access-gkd24\") pod \"auto-csr-approver-29556868-9gt46\" (UID: \"9565fbbb-2765-4ffb-a934-e5ddf9be1d17\") " pod="openshift-infra/auto-csr-approver-29556868-9gt46" Mar 13 14:28:00 crc kubenswrapper[4898]: I0313 14:28:00.330628 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkd24\" (UniqueName: \"kubernetes.io/projected/9565fbbb-2765-4ffb-a934-e5ddf9be1d17-kube-api-access-gkd24\") pod \"auto-csr-approver-29556868-9gt46\" (UID: \"9565fbbb-2765-4ffb-a934-e5ddf9be1d17\") " 
pod="openshift-infra/auto-csr-approver-29556868-9gt46" Mar 13 14:28:00 crc kubenswrapper[4898]: I0313 14:28:00.505362 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556868-9gt46" Mar 13 14:28:00 crc kubenswrapper[4898]: I0313 14:28:00.996596 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556868-9gt46"] Mar 13 14:28:00 crc kubenswrapper[4898]: W0313 14:28:00.999634 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9565fbbb_2765_4ffb_a934_e5ddf9be1d17.slice/crio-b291bcca631f9ab111749ea520821bf3bb3a186499771a638e9ab9007b740104 WatchSource:0}: Error finding container b291bcca631f9ab111749ea520821bf3bb3a186499771a638e9ab9007b740104: Status 404 returned error can't find the container with id b291bcca631f9ab111749ea520821bf3bb3a186499771a638e9ab9007b740104 Mar 13 14:28:01 crc kubenswrapper[4898]: I0313 14:28:01.903193 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556868-9gt46" event={"ID":"9565fbbb-2765-4ffb-a934-e5ddf9be1d17","Type":"ContainerStarted","Data":"b291bcca631f9ab111749ea520821bf3bb3a186499771a638e9ab9007b740104"} Mar 13 14:28:02 crc kubenswrapper[4898]: I0313 14:28:02.948316 4898 generic.go:334] "Generic (PLEG): container finished" podID="9565fbbb-2765-4ffb-a934-e5ddf9be1d17" containerID="23e456f4a6227ca0f6e6f99f4c35a21b09d57519ec2a733d94a113420fb1a340" exitCode=0 Mar 13 14:28:02 crc kubenswrapper[4898]: I0313 14:28:02.948371 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556868-9gt46" event={"ID":"9565fbbb-2765-4ffb-a934-e5ddf9be1d17","Type":"ContainerDied","Data":"23e456f4a6227ca0f6e6f99f4c35a21b09d57519ec2a733d94a113420fb1a340"} Mar 13 14:28:04 crc kubenswrapper[4898]: I0313 14:28:04.480288 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556868-9gt46" Mar 13 14:28:04 crc kubenswrapper[4898]: I0313 14:28:04.531563 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkd24\" (UniqueName: \"kubernetes.io/projected/9565fbbb-2765-4ffb-a934-e5ddf9be1d17-kube-api-access-gkd24\") pod \"9565fbbb-2765-4ffb-a934-e5ddf9be1d17\" (UID: \"9565fbbb-2765-4ffb-a934-e5ddf9be1d17\") " Mar 13 14:28:04 crc kubenswrapper[4898]: I0313 14:28:04.555624 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9565fbbb-2765-4ffb-a934-e5ddf9be1d17-kube-api-access-gkd24" (OuterVolumeSpecName: "kube-api-access-gkd24") pod "9565fbbb-2765-4ffb-a934-e5ddf9be1d17" (UID: "9565fbbb-2765-4ffb-a934-e5ddf9be1d17"). InnerVolumeSpecName "kube-api-access-gkd24". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:28:04 crc kubenswrapper[4898]: I0313 14:28:04.636396 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkd24\" (UniqueName: \"kubernetes.io/projected/9565fbbb-2765-4ffb-a934-e5ddf9be1d17-kube-api-access-gkd24\") on node \"crc\" DevicePath \"\"" Mar 13 14:28:04 crc kubenswrapper[4898]: I0313 14:28:04.981665 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556868-9gt46" event={"ID":"9565fbbb-2765-4ffb-a934-e5ddf9be1d17","Type":"ContainerDied","Data":"b291bcca631f9ab111749ea520821bf3bb3a186499771a638e9ab9007b740104"} Mar 13 14:28:04 crc kubenswrapper[4898]: I0313 14:28:04.982019 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b291bcca631f9ab111749ea520821bf3bb3a186499771a638e9ab9007b740104" Mar 13 14:28:04 crc kubenswrapper[4898]: I0313 14:28:04.981725 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556868-9gt46" Mar 13 14:28:05 crc kubenswrapper[4898]: I0313 14:28:05.583918 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556862-mpx4w"] Mar 13 14:28:05 crc kubenswrapper[4898]: I0313 14:28:05.599757 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556862-mpx4w"] Mar 13 14:28:05 crc kubenswrapper[4898]: I0313 14:28:05.756921 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c85cb04-363e-45d6-a14b-79c249e8f469" path="/var/lib/kubelet/pods/3c85cb04-363e-45d6-a14b-79c249e8f469/volumes" Mar 13 14:28:09 crc kubenswrapper[4898]: I0313 14:28:09.037645 4898 generic.go:334] "Generic (PLEG): container finished" podID="10c321a0-5ea5-4b5c-8695-1f7b2dcad32b" containerID="816bc1a0bbcb90a875482c4dd5f17ddca4bce3587b6ba90f30bd3e5b12de16d2" exitCode=0 Mar 13 14:28:09 crc kubenswrapper[4898]: I0313 14:28:09.037703 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b","Type":"ContainerDied","Data":"816bc1a0bbcb90a875482c4dd5f17ddca4bce3587b6ba90f30bd3e5b12de16d2"} Mar 13 14:28:10 crc kubenswrapper[4898]: I0313 14:28:10.053456 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"10c321a0-5ea5-4b5c-8695-1f7b2dcad32b","Type":"ContainerStarted","Data":"1975e8515d40d0e0882f9032c3cf285be83ffad9d05ae4d3245e76efc73f0dda"} Mar 13 14:28:10 crc kubenswrapper[4898]: I0313 14:28:10.054263 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 13 14:28:10 crc kubenswrapper[4898]: I0313 14:28:10.109028 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.109003836 podStartE2EDuration="38.109003836s" podCreationTimestamp="2026-03-13 14:27:32 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:28:10.086340291 +0000 UTC m=+1925.087928560" watchObservedRunningTime="2026-03-13 14:28:10.109003836 +0000 UTC m=+1925.110592075" Mar 13 14:28:22 crc kubenswrapper[4898]: I0313 14:28:22.952119 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 13 14:28:36 crc kubenswrapper[4898]: I0313 14:28:36.419432 4898 scope.go:117] "RemoveContainer" containerID="b62d7bd0ca3497c43d915b6212935946bc82ac1a3defe8c89eeb3779d6ce9770" Mar 13 14:28:36 crc kubenswrapper[4898]: I0313 14:28:36.455968 4898 scope.go:117] "RemoveContainer" containerID="ba94a825cfb36ee16c3e15907274f9276083ba448d310d471374f19c54cc116c" Mar 13 14:28:36 crc kubenswrapper[4898]: I0313 14:28:36.487311 4898 scope.go:117] "RemoveContainer" containerID="0d9a86d7e906b1015484fb8d8fc360af30fd3aeeed5cfca12314a74be70b110a" Mar 13 14:28:36 crc kubenswrapper[4898]: I0313 14:28:36.547459 4898 scope.go:117] "RemoveContainer" containerID="cea1936f2758016544cbefa24e4ca686c3e33acfdaf019898c501a90320d0242" Mar 13 14:28:36 crc kubenswrapper[4898]: I0313 14:28:36.582165 4898 scope.go:117] "RemoveContainer" containerID="a55576c9a44e83505bf8757afc0e1e19424b4717e80f08c508a794c81f2cfdb0" Mar 13 14:28:36 crc kubenswrapper[4898]: I0313 14:28:36.608478 4898 scope.go:117] "RemoveContainer" containerID="c6cbd243a4ab0ae3ee88d2b14b07e0b9c8fda594949edb73fc92248b8f25ddf9" Mar 13 14:29:36 crc kubenswrapper[4898]: I0313 14:29:36.729997 4898 scope.go:117] "RemoveContainer" containerID="c0fcc6916c9c7951ac6f57b54e64b861fe8be03a65443f0a0008c4f458405d78" Mar 13 14:29:36 crc kubenswrapper[4898]: I0313 14:29:36.756091 4898 scope.go:117] "RemoveContainer" containerID="9bd9f3f02e15571b11f72778527812906d168be88196cc4314aa88f5c276ac6c" Mar 13 14:29:36 crc kubenswrapper[4898]: I0313 14:29:36.784231 4898 scope.go:117] "RemoveContainer" 
containerID="ceacbe8778fcc11e62876d98d598259e725fa8302adf85af1d1ddc9df4d62ff6" Mar 13 14:29:36 crc kubenswrapper[4898]: I0313 14:29:36.823200 4898 scope.go:117] "RemoveContainer" containerID="a0be402bfe00c68e23ab47a73d2d201566aad9d451ecaff23e8fc2d99923064b" Mar 13 14:29:36 crc kubenswrapper[4898]: I0313 14:29:36.908416 4898 scope.go:117] "RemoveContainer" containerID="7d83563b664dd524f060763bd5deadd7009b47fedbc88f53844517f4c00a64ea" Mar 13 14:29:36 crc kubenswrapper[4898]: I0313 14:29:36.957241 4898 scope.go:117] "RemoveContainer" containerID="4d8bf97f6df1c8578abf9d8b2ff9a16d6f36d0a198628e241eb7ec672c4d77c5" Mar 13 14:29:37 crc kubenswrapper[4898]: I0313 14:29:37.008257 4898 scope.go:117] "RemoveContainer" containerID="b97169c17bbe4e26153e2ab8b910eb60fbafe00f2669ecd0f29ce4eb8dba08e0" Mar 13 14:29:37 crc kubenswrapper[4898]: I0313 14:29:37.070398 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-baf6-account-create-update-xhptm"] Mar 13 14:29:37 crc kubenswrapper[4898]: I0313 14:29:37.082795 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-n7qmc"] Mar 13 14:29:37 crc kubenswrapper[4898]: I0313 14:29:37.102825 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-n7qmc"] Mar 13 14:29:37 crc kubenswrapper[4898]: I0313 14:29:37.121599 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-baf6-account-create-update-xhptm"] Mar 13 14:29:37 crc kubenswrapper[4898]: I0313 14:29:37.765399 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45215dff-dfeb-4b68-bc5c-d36aba0ea6b8" path="/var/lib/kubelet/pods/45215dff-dfeb-4b68-bc5c-d36aba0ea6b8/volumes" Mar 13 14:29:37 crc kubenswrapper[4898]: I0313 14:29:37.770060 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81f4ee1a-c4d2-415d-9021-6503f03f8441" 
path="/var/lib/kubelet/pods/81f4ee1a-c4d2-415d-9021-6503f03f8441/volumes" Mar 13 14:29:40 crc kubenswrapper[4898]: I0313 14:29:40.037472 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-621e-account-create-update-dksd9"] Mar 13 14:29:40 crc kubenswrapper[4898]: I0313 14:29:40.054932 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-cdnq7"] Mar 13 14:29:40 crc kubenswrapper[4898]: I0313 14:29:40.067711 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-621e-account-create-update-dksd9"] Mar 13 14:29:40 crc kubenswrapper[4898]: I0313 14:29:40.079985 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-cdnq7"] Mar 13 14:29:41 crc kubenswrapper[4898]: I0313 14:29:41.308632 4898 generic.go:334] "Generic (PLEG): container finished" podID="6d8bbc5a-39da-48b8-82d1-6df496fda612" containerID="c31d294d1bee94e3955b57cdf27c7683e29803f38822e314dcaef8a3865b3cc6" exitCode=0 Mar 13 14:29:41 crc kubenswrapper[4898]: I0313 14:29:41.309006 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf" event={"ID":"6d8bbc5a-39da-48b8-82d1-6df496fda612","Type":"ContainerDied","Data":"c31d294d1bee94e3955b57cdf27c7683e29803f38822e314dcaef8a3865b3cc6"} Mar 13 14:29:41 crc kubenswrapper[4898]: I0313 14:29:41.764652 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59bdafe7-9c43-4acc-a212-864bdf38d5b4" path="/var/lib/kubelet/pods/59bdafe7-9c43-4acc-a212-864bdf38d5b4/volumes" Mar 13 14:29:41 crc kubenswrapper[4898]: I0313 14:29:41.767708 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8c46fcc-fd9b-4073-99e6-28aadcdd823e" path="/var/lib/kubelet/pods/a8c46fcc-fd9b-4073-99e6-28aadcdd823e/volumes" Mar 13 14:29:42 crc kubenswrapper[4898]: I0313 14:29:42.938316 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf" Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.032855 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d8bbc5a-39da-48b8-82d1-6df496fda612-inventory\") pod \"6d8bbc5a-39da-48b8-82d1-6df496fda612\" (UID: \"6d8bbc5a-39da-48b8-82d1-6df496fda612\") " Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.033208 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6d8bbc5a-39da-48b8-82d1-6df496fda612-ssh-key-openstack-edpm-ipam\") pod \"6d8bbc5a-39da-48b8-82d1-6df496fda612\" (UID: \"6d8bbc5a-39da-48b8-82d1-6df496fda612\") " Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.033837 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d8bbc5a-39da-48b8-82d1-6df496fda612-bootstrap-combined-ca-bundle\") pod \"6d8bbc5a-39da-48b8-82d1-6df496fda612\" (UID: \"6d8bbc5a-39da-48b8-82d1-6df496fda612\") " Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.034025 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cs68d\" (UniqueName: \"kubernetes.io/projected/6d8bbc5a-39da-48b8-82d1-6df496fda612-kube-api-access-cs68d\") pod \"6d8bbc5a-39da-48b8-82d1-6df496fda612\" (UID: \"6d8bbc5a-39da-48b8-82d1-6df496fda612\") " Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.039275 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d8bbc5a-39da-48b8-82d1-6df496fda612-kube-api-access-cs68d" (OuterVolumeSpecName: "kube-api-access-cs68d") pod "6d8bbc5a-39da-48b8-82d1-6df496fda612" (UID: "6d8bbc5a-39da-48b8-82d1-6df496fda612"). InnerVolumeSpecName "kube-api-access-cs68d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.041079 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d8bbc5a-39da-48b8-82d1-6df496fda612-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "6d8bbc5a-39da-48b8-82d1-6df496fda612" (UID: "6d8bbc5a-39da-48b8-82d1-6df496fda612"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.068199 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d8bbc5a-39da-48b8-82d1-6df496fda612-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6d8bbc5a-39da-48b8-82d1-6df496fda612" (UID: "6d8bbc5a-39da-48b8-82d1-6df496fda612"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.081767 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d8bbc5a-39da-48b8-82d1-6df496fda612-inventory" (OuterVolumeSpecName: "inventory") pod "6d8bbc5a-39da-48b8-82d1-6df496fda612" (UID: "6d8bbc5a-39da-48b8-82d1-6df496fda612"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.136288 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6d8bbc5a-39da-48b8-82d1-6df496fda612-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.136323 4898 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d8bbc5a-39da-48b8-82d1-6df496fda612-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.136338 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cs68d\" (UniqueName: \"kubernetes.io/projected/6d8bbc5a-39da-48b8-82d1-6df496fda612-kube-api-access-cs68d\") on node \"crc\" DevicePath \"\"" Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.136351 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d8bbc5a-39da-48b8-82d1-6df496fda612-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.337414 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf" event={"ID":"6d8bbc5a-39da-48b8-82d1-6df496fda612","Type":"ContainerDied","Data":"8b67963947b518457f04c307420f088501a95221935f61209e26215c56765170"} Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.337724 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b67963947b518457f04c307420f088501a95221935f61209e26215c56765170" Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.337921 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf" Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.457961 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rbttg"] Mar 13 14:29:43 crc kubenswrapper[4898]: E0313 14:29:43.458613 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9565fbbb-2765-4ffb-a934-e5ddf9be1d17" containerName="oc" Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.458634 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="9565fbbb-2765-4ffb-a934-e5ddf9be1d17" containerName="oc" Mar 13 14:29:43 crc kubenswrapper[4898]: E0313 14:29:43.458702 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d8bbc5a-39da-48b8-82d1-6df496fda612" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.458712 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d8bbc5a-39da-48b8-82d1-6df496fda612" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.459032 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="9565fbbb-2765-4ffb-a934-e5ddf9be1d17" containerName="oc" Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.459058 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d8bbc5a-39da-48b8-82d1-6df496fda612" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.460937 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rbttg" Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.463832 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.464059 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zsddr" Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.464356 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.464764 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.470933 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rbttg"] Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.646667 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v49j\" (UniqueName: \"kubernetes.io/projected/05e315eb-34b1-4099-b676-b0238f3cb5c5-kube-api-access-2v49j\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rbttg\" (UID: \"05e315eb-34b1-4099-b676-b0238f3cb5c5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rbttg" Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.646768 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05e315eb-34b1-4099-b676-b0238f3cb5c5-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rbttg\" (UID: \"05e315eb-34b1-4099-b676-b0238f3cb5c5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rbttg" Mar 13 14:29:43 crc 
kubenswrapper[4898]: I0313 14:29:43.646864 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05e315eb-34b1-4099-b676-b0238f3cb5c5-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rbttg\" (UID: \"05e315eb-34b1-4099-b676-b0238f3cb5c5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rbttg" Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.748889 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2v49j\" (UniqueName: \"kubernetes.io/projected/05e315eb-34b1-4099-b676-b0238f3cb5c5-kube-api-access-2v49j\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rbttg\" (UID: \"05e315eb-34b1-4099-b676-b0238f3cb5c5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rbttg" Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.749699 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05e315eb-34b1-4099-b676-b0238f3cb5c5-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rbttg\" (UID: \"05e315eb-34b1-4099-b676-b0238f3cb5c5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rbttg" Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.749973 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05e315eb-34b1-4099-b676-b0238f3cb5c5-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rbttg\" (UID: \"05e315eb-34b1-4099-b676-b0238f3cb5c5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rbttg" Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.756119 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/05e315eb-34b1-4099-b676-b0238f3cb5c5-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rbttg\" (UID: \"05e315eb-34b1-4099-b676-b0238f3cb5c5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rbttg" Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.757230 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05e315eb-34b1-4099-b676-b0238f3cb5c5-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rbttg\" (UID: \"05e315eb-34b1-4099-b676-b0238f3cb5c5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rbttg" Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.779349 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v49j\" (UniqueName: \"kubernetes.io/projected/05e315eb-34b1-4099-b676-b0238f3cb5c5-kube-api-access-2v49j\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-rbttg\" (UID: \"05e315eb-34b1-4099-b676-b0238f3cb5c5\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rbttg" Mar 13 14:29:43 crc kubenswrapper[4898]: I0313 14:29:43.789309 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rbttg" Mar 13 14:29:44 crc kubenswrapper[4898]: W0313 14:29:44.465509 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05e315eb_34b1_4099_b676_b0238f3cb5c5.slice/crio-56c75088356ea1f877eddaa52133cb1069449c5912814655f61681146f527468 WatchSource:0}: Error finding container 56c75088356ea1f877eddaa52133cb1069449c5912814655f61681146f527468: Status 404 returned error can't find the container with id 56c75088356ea1f877eddaa52133cb1069449c5912814655f61681146f527468 Mar 13 14:29:44 crc kubenswrapper[4898]: I0313 14:29:44.474333 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rbttg"] Mar 13 14:29:45 crc kubenswrapper[4898]: I0313 14:29:45.369842 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rbttg" event={"ID":"05e315eb-34b1-4099-b676-b0238f3cb5c5","Type":"ContainerStarted","Data":"56c75088356ea1f877eddaa52133cb1069449c5912814655f61681146f527468"} Mar 13 14:29:46 crc kubenswrapper[4898]: I0313 14:29:46.050139 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-d2b7-account-create-update-ggzw8"] Mar 13 14:29:46 crc kubenswrapper[4898]: I0313 14:29:46.065374 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-ppqg7"] Mar 13 14:29:46 crc kubenswrapper[4898]: I0313 14:29:46.078716 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-d2b7-account-create-update-ggzw8"] Mar 13 14:29:46 crc kubenswrapper[4898]: I0313 14:29:46.090727 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-ppqg7"] Mar 13 14:29:46 crc kubenswrapper[4898]: I0313 14:29:46.403719 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rbttg" event={"ID":"05e315eb-34b1-4099-b676-b0238f3cb5c5","Type":"ContainerStarted","Data":"19401614e37155924147d2b68ea4d922bec3bbc6d22b8dd77602f823a33552a5"} Mar 13 14:29:46 crc kubenswrapper[4898]: I0313 14:29:46.433867 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rbttg" podStartSLOduration=2.937411699 podStartE2EDuration="3.433837683s" podCreationTimestamp="2026-03-13 14:29:43 +0000 UTC" firstStartedPulling="2026-03-13 14:29:44.46841864 +0000 UTC m=+2019.470006889" lastFinishedPulling="2026-03-13 14:29:44.964844584 +0000 UTC m=+2019.966432873" observedRunningTime="2026-03-13 14:29:46.429708066 +0000 UTC m=+2021.431296345" watchObservedRunningTime="2026-03-13 14:29:46.433837683 +0000 UTC m=+2021.435425952" Mar 13 14:29:47 crc kubenswrapper[4898]: I0313 14:29:47.051421 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-35b4-account-create-update-7rdfs"] Mar 13 14:29:47 crc kubenswrapper[4898]: I0313 14:29:47.068235 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-35b4-account-create-update-7rdfs"] Mar 13 14:29:47 crc kubenswrapper[4898]: I0313 14:29:47.083320 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-zzflk"] Mar 13 14:29:47 crc kubenswrapper[4898]: I0313 14:29:47.097877 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-zzflk"] Mar 13 14:29:47 crc kubenswrapper[4898]: I0313 14:29:47.753886 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bf799d3-e4d4-439d-b3da-d5467064f6f1" path="/var/lib/kubelet/pods/0bf799d3-e4d4-439d-b3da-d5467064f6f1/volumes" Mar 13 14:29:47 crc kubenswrapper[4898]: I0313 14:29:47.754824 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32a060a9-dd52-4192-bc48-b9ea7a918458" 
path="/var/lib/kubelet/pods/32a060a9-dd52-4192-bc48-b9ea7a918458/volumes" Mar 13 14:29:47 crc kubenswrapper[4898]: I0313 14:29:47.755785 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc61df36-ac68-4cf0-9456-140bccb5435c" path="/var/lib/kubelet/pods/bc61df36-ac68-4cf0-9456-140bccb5435c/volumes" Mar 13 14:29:47 crc kubenswrapper[4898]: I0313 14:29:47.757309 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f58c984f-f43f-42dc-90a5-aebbe79a47a5" path="/var/lib/kubelet/pods/f58c984f-f43f-42dc-90a5-aebbe79a47a5/volumes" Mar 13 14:29:48 crc kubenswrapper[4898]: I0313 14:29:48.053772 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-ce03-account-create-update-425qr"] Mar 13 14:29:48 crc kubenswrapper[4898]: I0313 14:29:48.062933 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-n82b8"] Mar 13 14:29:48 crc kubenswrapper[4898]: I0313 14:29:48.071806 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-ce03-account-create-update-425qr"] Mar 13 14:29:48 crc kubenswrapper[4898]: I0313 14:29:48.106207 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-n82b8"] Mar 13 14:29:49 crc kubenswrapper[4898]: I0313 14:29:49.134226 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:29:49 crc kubenswrapper[4898]: I0313 14:29:49.134511 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:29:49 crc kubenswrapper[4898]: I0313 14:29:49.762645 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="586ccc66-1989-46e5-98ad-b70c7e88e6bc" path="/var/lib/kubelet/pods/586ccc66-1989-46e5-98ad-b70c7e88e6bc/volumes" Mar 13 14:29:49 crc kubenswrapper[4898]: I0313 14:29:49.764133 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba5ed93a-91b4-4942-a32c-ab02a536e3d4" path="/var/lib/kubelet/pods/ba5ed93a-91b4-4942-a32c-ab02a536e3d4/volumes" Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.173785 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p"] Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.181814 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p" Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.188498 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.188736 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.195601 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556870-ndlzh"] Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.198077 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556870-ndlzh" Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.207546 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.207606 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.207543 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.233415 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p"] Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.238266 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rf6q\" (UniqueName: \"kubernetes.io/projected/924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19-kube-api-access-5rf6q\") pod \"collect-profiles-29556870-v7r2p\" (UID: \"924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p" Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.238356 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19-secret-volume\") pod \"collect-profiles-29556870-v7r2p\" (UID: \"924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p" Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.238381 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19-config-volume\") pod 
\"collect-profiles-29556870-v7r2p\" (UID: \"924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p" Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.244652 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556870-ndlzh"] Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.340268 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpjbp\" (UniqueName: \"kubernetes.io/projected/c4f21c0b-a6a1-4b44-ae38-4a382569154e-kube-api-access-wpjbp\") pod \"auto-csr-approver-29556870-ndlzh\" (UID: \"c4f21c0b-a6a1-4b44-ae38-4a382569154e\") " pod="openshift-infra/auto-csr-approver-29556870-ndlzh" Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.340374 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rf6q\" (UniqueName: \"kubernetes.io/projected/924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19-kube-api-access-5rf6q\") pod \"collect-profiles-29556870-v7r2p\" (UID: \"924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p" Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.340456 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19-secret-volume\") pod \"collect-profiles-29556870-v7r2p\" (UID: \"924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p" Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.340483 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19-config-volume\") pod \"collect-profiles-29556870-v7r2p\" (UID: \"924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p" Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.341386 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19-config-volume\") pod \"collect-profiles-29556870-v7r2p\" (UID: \"924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p" Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.347500 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19-secret-volume\") pod \"collect-profiles-29556870-v7r2p\" (UID: \"924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p" Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.370563 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rf6q\" (UniqueName: \"kubernetes.io/projected/924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19-kube-api-access-5rf6q\") pod \"collect-profiles-29556870-v7r2p\" (UID: \"924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p" Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.444338 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpjbp\" (UniqueName: \"kubernetes.io/projected/c4f21c0b-a6a1-4b44-ae38-4a382569154e-kube-api-access-wpjbp\") pod \"auto-csr-approver-29556870-ndlzh\" (UID: \"c4f21c0b-a6a1-4b44-ae38-4a382569154e\") " pod="openshift-infra/auto-csr-approver-29556870-ndlzh" Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.485873 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpjbp\" (UniqueName: 
\"kubernetes.io/projected/c4f21c0b-a6a1-4b44-ae38-4a382569154e-kube-api-access-wpjbp\") pod \"auto-csr-approver-29556870-ndlzh\" (UID: \"c4f21c0b-a6a1-4b44-ae38-4a382569154e\") " pod="openshift-infra/auto-csr-approver-29556870-ndlzh" Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.534972 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p" Mar 13 14:30:00 crc kubenswrapper[4898]: I0313 14:30:00.551485 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556870-ndlzh" Mar 13 14:30:01 crc kubenswrapper[4898]: I0313 14:30:01.097154 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p"] Mar 13 14:30:01 crc kubenswrapper[4898]: W0313 14:30:01.105021 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod924ed9fd_c9e5_4462_9b97_6d6cd1e8ea19.slice/crio-c035e8c8fdb2f9bf143370be15075f63ee2778c239b799bd630e23b7df793c32 WatchSource:0}: Error finding container c035e8c8fdb2f9bf143370be15075f63ee2778c239b799bd630e23b7df793c32: Status 404 returned error can't find the container with id c035e8c8fdb2f9bf143370be15075f63ee2778c239b799bd630e23b7df793c32 Mar 13 14:30:01 crc kubenswrapper[4898]: I0313 14:30:01.154789 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556870-ndlzh"] Mar 13 14:30:01 crc kubenswrapper[4898]: W0313 14:30:01.163138 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4f21c0b_a6a1_4b44_ae38_4a382569154e.slice/crio-1309792d992296d5da9b2b9feb7fd0fd0cb4a0e583dd0d5108ab2a673b6d9c0f WatchSource:0}: Error finding container 1309792d992296d5da9b2b9feb7fd0fd0cb4a0e583dd0d5108ab2a673b6d9c0f: Status 404 returned error can't 
find the container with id 1309792d992296d5da9b2b9feb7fd0fd0cb4a0e583dd0d5108ab2a673b6d9c0f Mar 13 14:30:01 crc kubenswrapper[4898]: I0313 14:30:01.626159 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556870-ndlzh" event={"ID":"c4f21c0b-a6a1-4b44-ae38-4a382569154e","Type":"ContainerStarted","Data":"1309792d992296d5da9b2b9feb7fd0fd0cb4a0e583dd0d5108ab2a673b6d9c0f"} Mar 13 14:30:01 crc kubenswrapper[4898]: I0313 14:30:01.628694 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p" event={"ID":"924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19","Type":"ContainerStarted","Data":"64e5f9a40c7865406038ca1466eaa2acb4b0149c1cbab8a03ef694cc78da5ccc"} Mar 13 14:30:01 crc kubenswrapper[4898]: I0313 14:30:01.628750 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p" event={"ID":"924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19","Type":"ContainerStarted","Data":"c035e8c8fdb2f9bf143370be15075f63ee2778c239b799bd630e23b7df793c32"} Mar 13 14:30:01 crc kubenswrapper[4898]: I0313 14:30:01.665828 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p" podStartSLOduration=1.665798295 podStartE2EDuration="1.665798295s" podCreationTimestamp="2026-03-13 14:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 14:30:01.649976394 +0000 UTC m=+2036.651564663" watchObservedRunningTime="2026-03-13 14:30:01.665798295 +0000 UTC m=+2036.667386574" Mar 13 14:30:02 crc kubenswrapper[4898]: I0313 14:30:02.648489 4898 generic.go:334] "Generic (PLEG): container finished" podID="924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19" containerID="64e5f9a40c7865406038ca1466eaa2acb4b0149c1cbab8a03ef694cc78da5ccc" exitCode=0 Mar 13 14:30:02 crc 
kubenswrapper[4898]: I0313 14:30:02.648840 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p" event={"ID":"924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19","Type":"ContainerDied","Data":"64e5f9a40c7865406038ca1466eaa2acb4b0149c1cbab8a03ef694cc78da5ccc"} Mar 13 14:30:03 crc kubenswrapper[4898]: I0313 14:30:03.662987 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556870-ndlzh" event={"ID":"c4f21c0b-a6a1-4b44-ae38-4a382569154e","Type":"ContainerStarted","Data":"b1aa895f0022b3e7758a22e19e58e18bbca3560c415ba389688c7c2191911abd"} Mar 13 14:30:03 crc kubenswrapper[4898]: I0313 14:30:03.682822 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556870-ndlzh" podStartSLOduration=2.218463383 podStartE2EDuration="3.682795319s" podCreationTimestamp="2026-03-13 14:30:00 +0000 UTC" firstStartedPulling="2026-03-13 14:30:01.165091079 +0000 UTC m=+2036.166679338" lastFinishedPulling="2026-03-13 14:30:02.629423015 +0000 UTC m=+2037.631011274" observedRunningTime="2026-03-13 14:30:03.680074368 +0000 UTC m=+2038.681662607" watchObservedRunningTime="2026-03-13 14:30:03.682795319 +0000 UTC m=+2038.684383598" Mar 13 14:30:04 crc kubenswrapper[4898]: I0313 14:30:04.005803 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p" Mar 13 14:30:04 crc kubenswrapper[4898]: I0313 14:30:04.146308 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19-secret-volume\") pod \"924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19\" (UID: \"924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19\") " Mar 13 14:30:04 crc kubenswrapper[4898]: I0313 14:30:04.146689 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rf6q\" (UniqueName: \"kubernetes.io/projected/924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19-kube-api-access-5rf6q\") pod \"924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19\" (UID: \"924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19\") " Mar 13 14:30:04 crc kubenswrapper[4898]: I0313 14:30:04.146758 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19-config-volume\") pod \"924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19\" (UID: \"924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19\") " Mar 13 14:30:04 crc kubenswrapper[4898]: I0313 14:30:04.147966 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19-config-volume" (OuterVolumeSpecName: "config-volume") pod "924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19" (UID: "924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:30:04 crc kubenswrapper[4898]: I0313 14:30:04.152357 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19" (UID: "924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:30:04 crc kubenswrapper[4898]: I0313 14:30:04.152508 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19-kube-api-access-5rf6q" (OuterVolumeSpecName: "kube-api-access-5rf6q") pod "924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19" (UID: "924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19"). InnerVolumeSpecName "kube-api-access-5rf6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:30:04 crc kubenswrapper[4898]: I0313 14:30:04.250157 4898 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 14:30:04 crc kubenswrapper[4898]: I0313 14:30:04.250202 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rf6q\" (UniqueName: \"kubernetes.io/projected/924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19-kube-api-access-5rf6q\") on node \"crc\" DevicePath \"\"" Mar 13 14:30:04 crc kubenswrapper[4898]: I0313 14:30:04.250215 4898 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 14:30:04 crc kubenswrapper[4898]: I0313 14:30:04.687505 4898 generic.go:334] "Generic (PLEG): container finished" podID="c4f21c0b-a6a1-4b44-ae38-4a382569154e" containerID="b1aa895f0022b3e7758a22e19e58e18bbca3560c415ba389688c7c2191911abd" exitCode=0 Mar 13 14:30:04 crc kubenswrapper[4898]: I0313 14:30:04.688120 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556870-ndlzh" event={"ID":"c4f21c0b-a6a1-4b44-ae38-4a382569154e","Type":"ContainerDied","Data":"b1aa895f0022b3e7758a22e19e58e18bbca3560c415ba389688c7c2191911abd"} Mar 13 14:30:04 crc kubenswrapper[4898]: I0313 14:30:04.693239 4898 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p" event={"ID":"924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19","Type":"ContainerDied","Data":"c035e8c8fdb2f9bf143370be15075f63ee2778c239b799bd630e23b7df793c32"} Mar 13 14:30:04 crc kubenswrapper[4898]: I0313 14:30:04.693303 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c035e8c8fdb2f9bf143370be15075f63ee2778c239b799bd630e23b7df793c32" Mar 13 14:30:04 crc kubenswrapper[4898]: I0313 14:30:04.693413 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p" Mar 13 14:30:06 crc kubenswrapper[4898]: I0313 14:30:06.109508 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556870-ndlzh" Mar 13 14:30:06 crc kubenswrapper[4898]: I0313 14:30:06.204886 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpjbp\" (UniqueName: \"kubernetes.io/projected/c4f21c0b-a6a1-4b44-ae38-4a382569154e-kube-api-access-wpjbp\") pod \"c4f21c0b-a6a1-4b44-ae38-4a382569154e\" (UID: \"c4f21c0b-a6a1-4b44-ae38-4a382569154e\") " Mar 13 14:30:06 crc kubenswrapper[4898]: I0313 14:30:06.210826 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4f21c0b-a6a1-4b44-ae38-4a382569154e-kube-api-access-wpjbp" (OuterVolumeSpecName: "kube-api-access-wpjbp") pod "c4f21c0b-a6a1-4b44-ae38-4a382569154e" (UID: "c4f21c0b-a6a1-4b44-ae38-4a382569154e"). InnerVolumeSpecName "kube-api-access-wpjbp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:30:06 crc kubenswrapper[4898]: I0313 14:30:06.308023 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpjbp\" (UniqueName: \"kubernetes.io/projected/c4f21c0b-a6a1-4b44-ae38-4a382569154e-kube-api-access-wpjbp\") on node \"crc\" DevicePath \"\"" Mar 13 14:30:06 crc kubenswrapper[4898]: I0313 14:30:06.717810 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556870-ndlzh" event={"ID":"c4f21c0b-a6a1-4b44-ae38-4a382569154e","Type":"ContainerDied","Data":"1309792d992296d5da9b2b9feb7fd0fd0cb4a0e583dd0d5108ab2a673b6d9c0f"} Mar 13 14:30:06 crc kubenswrapper[4898]: I0313 14:30:06.717856 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1309792d992296d5da9b2b9feb7fd0fd0cb4a0e583dd0d5108ab2a673b6d9c0f" Mar 13 14:30:06 crc kubenswrapper[4898]: I0313 14:30:06.717943 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556870-ndlzh" Mar 13 14:30:06 crc kubenswrapper[4898]: I0313 14:30:06.759226 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556864-bvpnd"] Mar 13 14:30:06 crc kubenswrapper[4898]: I0313 14:30:06.774167 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556864-bvpnd"] Mar 13 14:30:07 crc kubenswrapper[4898]: I0313 14:30:07.756179 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e5f381c-bbd8-40d9-8c76-efee5fb7023a" path="/var/lib/kubelet/pods/4e5f381c-bbd8-40d9-8c76-efee5fb7023a/volumes" Mar 13 14:30:14 crc kubenswrapper[4898]: I0313 14:30:14.032170 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-x75zk"] Mar 13 14:30:14 crc kubenswrapper[4898]: I0313 14:30:14.042783 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-x75zk"] Mar 13 
14:30:15 crc kubenswrapper[4898]: I0313 14:30:15.756393 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74d4aeca-15ec-4f63-87e0-20daa6f3e70f" path="/var/lib/kubelet/pods/74d4aeca-15ec-4f63-87e0-20daa6f3e70f/volumes" Mar 13 14:30:19 crc kubenswrapper[4898]: I0313 14:30:19.135213 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:30:19 crc kubenswrapper[4898]: I0313 14:30:19.135888 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:30:25 crc kubenswrapper[4898]: I0313 14:30:25.070064 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-4e00-account-create-update-92bgz"] Mar 13 14:30:25 crc kubenswrapper[4898]: I0313 14:30:25.094971 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-88gdv"] Mar 13 14:30:25 crc kubenswrapper[4898]: I0313 14:30:25.106974 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-275nk"] Mar 13 14:30:25 crc kubenswrapper[4898]: I0313 14:30:25.126054 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-88gdv"] Mar 13 14:30:25 crc kubenswrapper[4898]: I0313 14:30:25.141103 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-4e00-account-create-update-92bgz"] Mar 13 14:30:25 crc kubenswrapper[4898]: I0313 14:30:25.163726 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-275nk"] Mar 13 14:30:25 crc 
kubenswrapper[4898]: I0313 14:30:25.764802 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fe3416e-f08a-43c9-8e12-a89c1e849208" path="/var/lib/kubelet/pods/4fe3416e-f08a-43c9-8e12-a89c1e849208/volumes" Mar 13 14:30:25 crc kubenswrapper[4898]: I0313 14:30:25.767705 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b04d3edd-a550-465a-9ef2-2cbea4126ceb" path="/var/lib/kubelet/pods/b04d3edd-a550-465a-9ef2-2cbea4126ceb/volumes" Mar 13 14:30:25 crc kubenswrapper[4898]: I0313 14:30:25.771410 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8a8516c-5aee-4eae-a59b-498f97c1b92b" path="/var/lib/kubelet/pods/f8a8516c-5aee-4eae-a59b-498f97c1b92b/volumes" Mar 13 14:30:29 crc kubenswrapper[4898]: I0313 14:30:29.065241 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-23f7-account-create-update-z479t"] Mar 13 14:30:29 crc kubenswrapper[4898]: I0313 14:30:29.083194 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-23f7-account-create-update-z479t"] Mar 13 14:30:29 crc kubenswrapper[4898]: I0313 14:30:29.095668 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-9d98-account-create-update-t77sf"] Mar 13 14:30:29 crc kubenswrapper[4898]: I0313 14:30:29.104938 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-9d98-account-create-update-t77sf"] Mar 13 14:30:29 crc kubenswrapper[4898]: I0313 14:30:29.115827 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-452kj"] Mar 13 14:30:29 crc kubenswrapper[4898]: I0313 14:30:29.124825 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-1082-account-create-update-2jjkd"] Mar 13 14:30:29 crc kubenswrapper[4898]: I0313 14:30:29.133734 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-95vbj"] Mar 13 14:30:29 crc kubenswrapper[4898]: I0313 14:30:29.143504 4898 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-95vbj"] Mar 13 14:30:29 crc kubenswrapper[4898]: I0313 14:30:29.153847 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-1082-account-create-update-2jjkd"] Mar 13 14:30:29 crc kubenswrapper[4898]: I0313 14:30:29.163248 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-452kj"] Mar 13 14:30:29 crc kubenswrapper[4898]: I0313 14:30:29.757480 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1aa06f21-2d35-4d03-86b9-01d9354826da" path="/var/lib/kubelet/pods/1aa06f21-2d35-4d03-86b9-01d9354826da/volumes" Mar 13 14:30:29 crc kubenswrapper[4898]: I0313 14:30:29.760650 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71459d1c-2acb-4e15-a30d-09dd0f7f7951" path="/var/lib/kubelet/pods/71459d1c-2acb-4e15-a30d-09dd0f7f7951/volumes" Mar 13 14:30:29 crc kubenswrapper[4898]: I0313 14:30:29.763688 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b83b860f-ed6c-46b2-862a-fbda9af7dc89" path="/var/lib/kubelet/pods/b83b860f-ed6c-46b2-862a-fbda9af7dc89/volumes" Mar 13 14:30:29 crc kubenswrapper[4898]: I0313 14:30:29.767591 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bea88065-1eff-42e2-809a-443c15bda0ac" path="/var/lib/kubelet/pods/bea88065-1eff-42e2-809a-443c15bda0ac/volumes" Mar 13 14:30:29 crc kubenswrapper[4898]: I0313 14:30:29.770847 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d" path="/var/lib/kubelet/pods/f9e4fe10-07d1-4f4e-bb04-5ab11b8ad56d/volumes" Mar 13 14:30:32 crc kubenswrapper[4898]: I0313 14:30:32.035703 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-pxtss"] Mar 13 14:30:32 crc kubenswrapper[4898]: I0313 14:30:32.048037 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/root-account-create-update-pxtss"] Mar 13 14:30:33 crc kubenswrapper[4898]: I0313 14:30:33.762484 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74eb351d-364c-4564-8f8b-67ac844a6abc" path="/var/lib/kubelet/pods/74eb351d-364c-4564-8f8b-67ac844a6abc/volumes" Mar 13 14:30:34 crc kubenswrapper[4898]: I0313 14:30:34.046697 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-dxpl9"] Mar 13 14:30:34 crc kubenswrapper[4898]: I0313 14:30:34.066729 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-dxpl9"] Mar 13 14:30:35 crc kubenswrapper[4898]: I0313 14:30:35.767553 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fdab36c-41db-4a9c-9cbe-47e1761c6df5" path="/var/lib/kubelet/pods/8fdab36c-41db-4a9c-9cbe-47e1761c6df5/volumes" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 14:30:37.160394 4898 scope.go:117] "RemoveContainer" containerID="f8a3e423bebf88995b6d32dd30a81abd86b4e9eab9359f63d75400abe5906505" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 14:30:37.242057 4898 scope.go:117] "RemoveContainer" containerID="66d662834083b3a8826084dc54618cf384de1f9336d6d06012f43689e8e15545" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 14:30:37.286623 4898 scope.go:117] "RemoveContainer" containerID="860f247abf99986a680fa9cbd71b3ddb7e0e1a4bc671f3a1ca2277312ff69005" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 14:30:37.347367 4898 scope.go:117] "RemoveContainer" containerID="4e72b0c7dc05dd72f43622aacb47475aacbf01f3e30ece85d27e66c47011712e" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 14:30:37.409671 4898 scope.go:117] "RemoveContainer" containerID="190741a9e70699bd53ad4219ca7d5f504afce181f13fef8f81fc17d9e1a70095" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 14:30:37.470070 4898 scope.go:117] "RemoveContainer" containerID="58affd3294e9aec78373844bf6912651079de0e76c0d060a1cf7a048a7bc787d" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 
14:30:37.517410 4898 scope.go:117] "RemoveContainer" containerID="7890927e1d3da3f2b5ae266b74631a44cf3eea829b7cc9b79f5ffe9476b7f6a0" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 14:30:37.539382 4898 scope.go:117] "RemoveContainer" containerID="80f19b54532d031429e005ad3c0cf3c32cf378c8ddab795769c6087368772b42" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 14:30:37.562278 4898 scope.go:117] "RemoveContainer" containerID="df7deb06b863e0990f2d81a2c25739f29fd7b5122c0d32c2314963f2204b89f3" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 14:30:37.591401 4898 scope.go:117] "RemoveContainer" containerID="89083a9a998b87f99f34c5645b2e669a886f2f7b20e68795829134c5acb24ac6" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 14:30:37.620840 4898 scope.go:117] "RemoveContainer" containerID="f93c4099691f41a073e964731ace4e3b38e62bd41c90cb8a30591394175252ce" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 14:30:37.665202 4898 scope.go:117] "RemoveContainer" containerID="f7ee82c0bf1917642e2854f8aafb4c28fc338ae0b377633b5ccd89fd1a0294f4" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 14:30:37.695608 4898 scope.go:117] "RemoveContainer" containerID="9199e9b5bfad44aa55bebdeb17820a815d88bcb2d174de10793bf2e6e2845fc2" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 14:30:37.727762 4898 scope.go:117] "RemoveContainer" containerID="f40862770a13b51268c0d1e7b9c2896a6335c6f3b3801074579c313c3d13577e" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 14:30:37.752046 4898 scope.go:117] "RemoveContainer" containerID="592e2145b8382848d105acb5e5275b8dd688df9ab9d3d5caa34a389dd2742086" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 14:30:37.794843 4898 scope.go:117] "RemoveContainer" containerID="81635cb2b5569daf242649d4672547e48f7dd06f334c45c3f33ff7ef44a8ec5f" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 14:30:37.825014 4898 scope.go:117] "RemoveContainer" containerID="e22a5e1114b923439d9f143c3b64b35dbca324bd3cf616ce49fd0caf5c66d873" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 14:30:37.848604 4898 
scope.go:117] "RemoveContainer" containerID="e6b8ff442a61a4fcf1565b308b6597fe09fb7264763f211d7540bb2a249e6c54" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 14:30:37.900018 4898 scope.go:117] "RemoveContainer" containerID="a88dfb7f27e61d13c4de942fec09301aec287d6967f6ca991e690f9c9c77a8e1" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 14:30:37.920795 4898 scope.go:117] "RemoveContainer" containerID="4839c26bbb3360becfb71db51ead56738498d1508d530ccce036f032e975f9b4" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 14:30:37.965153 4898 scope.go:117] "RemoveContainer" containerID="3c9e54e03326bbf4751dd95fa7e9b1825e9af82b1eefdf09759081304dd57de9" Mar 13 14:30:37 crc kubenswrapper[4898]: I0313 14:30:37.994331 4898 scope.go:117] "RemoveContainer" containerID="9d4412724b9aeab2fb38e3b120d6b80b5959d7a8a33631247f92a023f5b56a70" Mar 13 14:30:38 crc kubenswrapper[4898]: I0313 14:30:38.022303 4898 scope.go:117] "RemoveContainer" containerID="b9cd470c8d031dbf9ec18b998e1bb21c765853e9617c7315b08f8356dad9258f" Mar 13 14:30:38 crc kubenswrapper[4898]: I0313 14:30:38.050319 4898 scope.go:117] "RemoveContainer" containerID="1b7f5b79b9cbc006ae8ac33cf3d709c72ff92b310ff2867c772246e7be6d5aff" Mar 13 14:30:38 crc kubenswrapper[4898]: I0313 14:30:38.075041 4898 scope.go:117] "RemoveContainer" containerID="5f41cf46e25c4bfb4c3cb12738ce8eaa95c275888249cc79416494629ec3b64b" Mar 13 14:30:38 crc kubenswrapper[4898]: I0313 14:30:38.102562 4898 scope.go:117] "RemoveContainer" containerID="c12b1a7614a831f1604c6feaf4bf42c52496fffc128594a26f7d1a8c71f58636" Mar 13 14:30:38 crc kubenswrapper[4898]: I0313 14:30:38.144862 4898 scope.go:117] "RemoveContainer" containerID="72a159d4ac7d8e712d9bffb75add791bdb6f8bee323ccb72d06431a42be68d77" Mar 13 14:30:49 crc kubenswrapper[4898]: I0313 14:30:49.135124 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:30:49 crc kubenswrapper[4898]: I0313 14:30:49.135700 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:30:49 crc kubenswrapper[4898]: I0313 14:30:49.135765 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 14:30:49 crc kubenswrapper[4898]: I0313 14:30:49.136745 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"544b53b5cdd0293005863b343628de53b83869ce5cd2c798b19c01abba2b5bc8"} pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 14:30:49 crc kubenswrapper[4898]: I0313 14:30:49.136831 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" containerID="cri-o://544b53b5cdd0293005863b343628de53b83869ce5cd2c798b19c01abba2b5bc8" gracePeriod=600 Mar 13 14:30:49 crc kubenswrapper[4898]: I0313 14:30:49.412374 4898 generic.go:334] "Generic (PLEG): container finished" podID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerID="544b53b5cdd0293005863b343628de53b83869ce5cd2c798b19c01abba2b5bc8" exitCode=0 Mar 13 14:30:49 crc kubenswrapper[4898]: I0313 14:30:49.412572 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" 
event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerDied","Data":"544b53b5cdd0293005863b343628de53b83869ce5cd2c798b19c01abba2b5bc8"} Mar 13 14:30:49 crc kubenswrapper[4898]: I0313 14:30:49.412666 4898 scope.go:117] "RemoveContainer" containerID="31ae8593afc62c01205d22940ebe7136407da3a6c051010917ba949bb52866cc" Mar 13 14:30:50 crc kubenswrapper[4898]: I0313 14:30:50.448126 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db"} Mar 13 14:31:06 crc kubenswrapper[4898]: I0313 14:31:06.051008 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-hm77q"] Mar 13 14:31:06 crc kubenswrapper[4898]: I0313 14:31:06.064262 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-hm77q"] Mar 13 14:31:07 crc kubenswrapper[4898]: I0313 14:31:07.759197 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="664deedc-3946-4205-98ad-21759d35d952" path="/var/lib/kubelet/pods/664deedc-3946-4205-98ad-21759d35d952/volumes" Mar 13 14:31:13 crc kubenswrapper[4898]: I0313 14:31:13.041464 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-dddqm"] Mar 13 14:31:13 crc kubenswrapper[4898]: I0313 14:31:13.059487 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-dddqm"] Mar 13 14:31:13 crc kubenswrapper[4898]: I0313 14:31:13.758943 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51a3e0c5-0084-4216-a162-3614eafcc162" path="/var/lib/kubelet/pods/51a3e0c5-0084-4216-a162-3614eafcc162/volumes" Mar 13 14:31:16 crc kubenswrapper[4898]: I0313 14:31:16.055231 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-ljct7"] Mar 13 14:31:16 crc kubenswrapper[4898]: 
I0313 14:31:16.070417 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-ljct7"] Mar 13 14:31:17 crc kubenswrapper[4898]: I0313 14:31:17.066946 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-xq6ss"] Mar 13 14:31:17 crc kubenswrapper[4898]: I0313 14:31:17.080592 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-xq6ss"] Mar 13 14:31:17 crc kubenswrapper[4898]: I0313 14:31:17.759611 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f68a4dd-fec8-4e60-a89c-69ce09fc5700" path="/var/lib/kubelet/pods/0f68a4dd-fec8-4e60-a89c-69ce09fc5700/volumes" Mar 13 14:31:17 crc kubenswrapper[4898]: I0313 14:31:17.761725 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac704482-c7a4-471c-b3c1-d1fdd7e0eb83" path="/var/lib/kubelet/pods/ac704482-c7a4-471c-b3c1-d1fdd7e0eb83/volumes" Mar 13 14:31:30 crc kubenswrapper[4898]: I0313 14:31:30.202233 4898 generic.go:334] "Generic (PLEG): container finished" podID="05e315eb-34b1-4099-b676-b0238f3cb5c5" containerID="19401614e37155924147d2b68ea4d922bec3bbc6d22b8dd77602f823a33552a5" exitCode=0 Mar 13 14:31:30 crc kubenswrapper[4898]: I0313 14:31:30.202264 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rbttg" event={"ID":"05e315eb-34b1-4099-b676-b0238f3cb5c5","Type":"ContainerDied","Data":"19401614e37155924147d2b68ea4d922bec3bbc6d22b8dd77602f823a33552a5"} Mar 13 14:31:31 crc kubenswrapper[4898]: I0313 14:31:31.876234 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rbttg" Mar 13 14:31:31 crc kubenswrapper[4898]: I0313 14:31:31.939504 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2v49j\" (UniqueName: \"kubernetes.io/projected/05e315eb-34b1-4099-b676-b0238f3cb5c5-kube-api-access-2v49j\") pod \"05e315eb-34b1-4099-b676-b0238f3cb5c5\" (UID: \"05e315eb-34b1-4099-b676-b0238f3cb5c5\") " Mar 13 14:31:31 crc kubenswrapper[4898]: I0313 14:31:31.939920 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05e315eb-34b1-4099-b676-b0238f3cb5c5-inventory\") pod \"05e315eb-34b1-4099-b676-b0238f3cb5c5\" (UID: \"05e315eb-34b1-4099-b676-b0238f3cb5c5\") " Mar 13 14:31:31 crc kubenswrapper[4898]: I0313 14:31:31.940071 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05e315eb-34b1-4099-b676-b0238f3cb5c5-ssh-key-openstack-edpm-ipam\") pod \"05e315eb-34b1-4099-b676-b0238f3cb5c5\" (UID: \"05e315eb-34b1-4099-b676-b0238f3cb5c5\") " Mar 13 14:31:31 crc kubenswrapper[4898]: I0313 14:31:31.957500 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05e315eb-34b1-4099-b676-b0238f3cb5c5-kube-api-access-2v49j" (OuterVolumeSpecName: "kube-api-access-2v49j") pod "05e315eb-34b1-4099-b676-b0238f3cb5c5" (UID: "05e315eb-34b1-4099-b676-b0238f3cb5c5"). InnerVolumeSpecName "kube-api-access-2v49j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:31:31 crc kubenswrapper[4898]: I0313 14:31:31.975195 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05e315eb-34b1-4099-b676-b0238f3cb5c5-inventory" (OuterVolumeSpecName: "inventory") pod "05e315eb-34b1-4099-b676-b0238f3cb5c5" (UID: "05e315eb-34b1-4099-b676-b0238f3cb5c5"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:31:31 crc kubenswrapper[4898]: I0313 14:31:31.986228 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05e315eb-34b1-4099-b676-b0238f3cb5c5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "05e315eb-34b1-4099-b676-b0238f3cb5c5" (UID: "05e315eb-34b1-4099-b676-b0238f3cb5c5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.043571 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05e315eb-34b1-4099-b676-b0238f3cb5c5-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.043677 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05e315eb-34b1-4099-b676-b0238f3cb5c5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.047495 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2v49j\" (UniqueName: \"kubernetes.io/projected/05e315eb-34b1-4099-b676-b0238f3cb5c5-kube-api-access-2v49j\") on node \"crc\" DevicePath \"\"" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.228362 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rbttg" event={"ID":"05e315eb-34b1-4099-b676-b0238f3cb5c5","Type":"ContainerDied","Data":"56c75088356ea1f877eddaa52133cb1069449c5912814655f61681146f527468"} Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.228405 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56c75088356ea1f877eddaa52133cb1069449c5912814655f61681146f527468" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 
14:31:32.228463 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-rbttg" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.327977 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-57hwn"] Mar 13 14:31:32 crc kubenswrapper[4898]: E0313 14:31:32.328670 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05e315eb-34b1-4099-b676-b0238f3cb5c5" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.328700 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="05e315eb-34b1-4099-b676-b0238f3cb5c5" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 13 14:31:32 crc kubenswrapper[4898]: E0313 14:31:32.328721 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4f21c0b-a6a1-4b44-ae38-4a382569154e" containerName="oc" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.328732 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4f21c0b-a6a1-4b44-ae38-4a382569154e" containerName="oc" Mar 13 14:31:32 crc kubenswrapper[4898]: E0313 14:31:32.328811 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19" containerName="collect-profiles" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.328824 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19" containerName="collect-profiles" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.329204 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4f21c0b-a6a1-4b44-ae38-4a382569154e" containerName="oc" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.329241 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="05e315eb-34b1-4099-b676-b0238f3cb5c5" 
containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.329253 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19" containerName="collect-profiles" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.330387 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-57hwn" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.344681 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-57hwn"] Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.344780 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.344804 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.345003 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.345007 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zsddr" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.456386 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/295e7c32-75f1-4eee-a126-2d4547c56f24-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-57hwn\" (UID: \"295e7c32-75f1-4eee-a126-2d4547c56f24\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-57hwn" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.456481 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw4q2\" (UniqueName: \"kubernetes.io/projected/295e7c32-75f1-4eee-a126-2d4547c56f24-kube-api-access-pw4q2\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-57hwn\" (UID: \"295e7c32-75f1-4eee-a126-2d4547c56f24\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-57hwn" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.456543 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/295e7c32-75f1-4eee-a126-2d4547c56f24-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-57hwn\" (UID: \"295e7c32-75f1-4eee-a126-2d4547c56f24\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-57hwn" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.557995 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw4q2\" (UniqueName: \"kubernetes.io/projected/295e7c32-75f1-4eee-a126-2d4547c56f24-kube-api-access-pw4q2\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-57hwn\" (UID: \"295e7c32-75f1-4eee-a126-2d4547c56f24\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-57hwn" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.558077 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/295e7c32-75f1-4eee-a126-2d4547c56f24-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-57hwn\" (UID: \"295e7c32-75f1-4eee-a126-2d4547c56f24\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-57hwn" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.558244 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/295e7c32-75f1-4eee-a126-2d4547c56f24-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-57hwn\" (UID: \"295e7c32-75f1-4eee-a126-2d4547c56f24\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-57hwn" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.563545 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/295e7c32-75f1-4eee-a126-2d4547c56f24-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-57hwn\" (UID: \"295e7c32-75f1-4eee-a126-2d4547c56f24\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-57hwn" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.573539 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/295e7c32-75f1-4eee-a126-2d4547c56f24-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-57hwn\" (UID: \"295e7c32-75f1-4eee-a126-2d4547c56f24\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-57hwn" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.573587 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw4q2\" (UniqueName: \"kubernetes.io/projected/295e7c32-75f1-4eee-a126-2d4547c56f24-kube-api-access-pw4q2\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-57hwn\" (UID: \"295e7c32-75f1-4eee-a126-2d4547c56f24\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-57hwn" Mar 13 14:31:32 crc kubenswrapper[4898]: I0313 14:31:32.674082 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-57hwn" Mar 13 14:31:33 crc kubenswrapper[4898]: I0313 14:31:33.360221 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-57hwn"] Mar 13 14:31:33 crc kubenswrapper[4898]: I0313 14:31:33.365761 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 14:31:34 crc kubenswrapper[4898]: I0313 14:31:34.250890 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-57hwn" event={"ID":"295e7c32-75f1-4eee-a126-2d4547c56f24","Type":"ContainerStarted","Data":"dab0dfbc55bfc6f1df629f5c9477ac3d48d09670fceb17a5d6b9747c56387330"} Mar 13 14:31:34 crc kubenswrapper[4898]: I0313 14:31:34.251246 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-57hwn" event={"ID":"295e7c32-75f1-4eee-a126-2d4547c56f24","Type":"ContainerStarted","Data":"8f16b5e15ed9fbaac5ba0b0d72bb41f2c1ee0e47dc35b9e1544ffaf57639e658"} Mar 13 14:31:34 crc kubenswrapper[4898]: I0313 14:31:34.292569 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-57hwn" podStartSLOduration=1.868369886 podStartE2EDuration="2.292552014s" podCreationTimestamp="2026-03-13 14:31:32 +0000 UTC" firstStartedPulling="2026-03-13 14:31:33.365485973 +0000 UTC m=+2128.367074222" lastFinishedPulling="2026-03-13 14:31:33.789668111 +0000 UTC m=+2128.791256350" observedRunningTime="2026-03-13 14:31:34.285306285 +0000 UTC m=+2129.286894514" watchObservedRunningTime="2026-03-13 14:31:34.292552014 +0000 UTC m=+2129.294140253" Mar 13 14:31:38 crc kubenswrapper[4898]: I0313 14:31:38.737199 4898 scope.go:117] "RemoveContainer" containerID="dbe55f5873c440c467d8748cfaa995fee6ccd0abf9441bf1f70ed0dda90073d3" Mar 13 
14:31:38 crc kubenswrapper[4898]: I0313 14:31:38.769908 4898 scope.go:117] "RemoveContainer" containerID="c66d5e607033edc265c5c4c3b44b5d453515d5500b6db940b367950853043279" Mar 13 14:31:38 crc kubenswrapper[4898]: I0313 14:31:38.795628 4898 scope.go:117] "RemoveContainer" containerID="27760265b5d44dc57e3a3eecff9d010cc5fc5af8472653848b227f366d4e7a49" Mar 13 14:31:38 crc kubenswrapper[4898]: I0313 14:31:38.848190 4898 scope.go:117] "RemoveContainer" containerID="d2f87292a607e1fb76b72fbf0fd5fba62057ee2d194e12f77b3db9510fddedf2" Mar 13 14:31:38 crc kubenswrapper[4898]: I0313 14:31:38.887610 4898 scope.go:117] "RemoveContainer" containerID="d1d45dfadc513f7c3e24c556ad6a45d1d7e050a0da971045c5c1181fc58bc3d2" Mar 13 14:31:38 crc kubenswrapper[4898]: I0313 14:31:38.935747 4898 scope.go:117] "RemoveContainer" containerID="c0655b2adb5618887bc26f0a3bb0d551a636cca41a03c1baf5cc0685920b55bb" Mar 13 14:31:38 crc kubenswrapper[4898]: I0313 14:31:38.981953 4898 scope.go:117] "RemoveContainer" containerID="e195371c387c0ec69bbadee68addfd45715bbfc83433b2ba3c47b307af7325bd" Mar 13 14:31:39 crc kubenswrapper[4898]: I0313 14:31:39.012279 4898 scope.go:117] "RemoveContainer" containerID="ededb2c682deb7693ab4f5295aba59c30d96d3df7afffb98859fdbc9fc5b9e13" Mar 13 14:31:43 crc kubenswrapper[4898]: I0313 14:31:43.059388 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-ztp6c"] Mar 13 14:31:43 crc kubenswrapper[4898]: I0313 14:31:43.077734 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-ztp6c"] Mar 13 14:31:43 crc kubenswrapper[4898]: I0313 14:31:43.760258 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="193b05da-acb9-4512-a2ae-6c03450e6f05" path="/var/lib/kubelet/pods/193b05da-acb9-4512-a2ae-6c03450e6f05/volumes" Mar 13 14:32:00 crc kubenswrapper[4898]: I0313 14:32:00.139738 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556872-2tqn4"] Mar 13 14:32:00 crc 
kubenswrapper[4898]: I0313 14:32:00.142758 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556872-2tqn4" Mar 13 14:32:00 crc kubenswrapper[4898]: I0313 14:32:00.158557 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556872-2tqn4"] Mar 13 14:32:00 crc kubenswrapper[4898]: I0313 14:32:00.184022 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:32:00 crc kubenswrapper[4898]: I0313 14:32:00.184288 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:32:00 crc kubenswrapper[4898]: I0313 14:32:00.184483 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:32:00 crc kubenswrapper[4898]: I0313 14:32:00.327738 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq98k\" (UniqueName: \"kubernetes.io/projected/8627002c-751e-4168-b294-4a324890a996-kube-api-access-qq98k\") pod \"auto-csr-approver-29556872-2tqn4\" (UID: \"8627002c-751e-4168-b294-4a324890a996\") " pod="openshift-infra/auto-csr-approver-29556872-2tqn4" Mar 13 14:32:00 crc kubenswrapper[4898]: I0313 14:32:00.430982 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq98k\" (UniqueName: \"kubernetes.io/projected/8627002c-751e-4168-b294-4a324890a996-kube-api-access-qq98k\") pod \"auto-csr-approver-29556872-2tqn4\" (UID: \"8627002c-751e-4168-b294-4a324890a996\") " pod="openshift-infra/auto-csr-approver-29556872-2tqn4" Mar 13 14:32:00 crc kubenswrapper[4898]: I0313 14:32:00.465119 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq98k\" (UniqueName: \"kubernetes.io/projected/8627002c-751e-4168-b294-4a324890a996-kube-api-access-qq98k\") pod 
\"auto-csr-approver-29556872-2tqn4\" (UID: \"8627002c-751e-4168-b294-4a324890a996\") " pod="openshift-infra/auto-csr-approver-29556872-2tqn4" Mar 13 14:32:00 crc kubenswrapper[4898]: I0313 14:32:00.511953 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556872-2tqn4" Mar 13 14:32:01 crc kubenswrapper[4898]: I0313 14:32:01.051547 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556872-2tqn4"] Mar 13 14:32:01 crc kubenswrapper[4898]: W0313 14:32:01.058914 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8627002c_751e_4168_b294_4a324890a996.slice/crio-3e6c8b225663f1d3b600ffeac1d814f793ea23ad97ff301d21f7aa59626ffb92 WatchSource:0}: Error finding container 3e6c8b225663f1d3b600ffeac1d814f793ea23ad97ff301d21f7aa59626ffb92: Status 404 returned error can't find the container with id 3e6c8b225663f1d3b600ffeac1d814f793ea23ad97ff301d21f7aa59626ffb92 Mar 13 14:32:01 crc kubenswrapper[4898]: I0313 14:32:01.619071 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556872-2tqn4" event={"ID":"8627002c-751e-4168-b294-4a324890a996","Type":"ContainerStarted","Data":"3e6c8b225663f1d3b600ffeac1d814f793ea23ad97ff301d21f7aa59626ffb92"} Mar 13 14:32:02 crc kubenswrapper[4898]: I0313 14:32:02.635067 4898 generic.go:334] "Generic (PLEG): container finished" podID="8627002c-751e-4168-b294-4a324890a996" containerID="9918c054f17d0c467592f1c4b30fc11e333ea544aa38b84c1aab31d2beff7c97" exitCode=0 Mar 13 14:32:02 crc kubenswrapper[4898]: I0313 14:32:02.635130 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556872-2tqn4" event={"ID":"8627002c-751e-4168-b294-4a324890a996","Type":"ContainerDied","Data":"9918c054f17d0c467592f1c4b30fc11e333ea544aa38b84c1aab31d2beff7c97"} Mar 13 14:32:04 crc kubenswrapper[4898]: 
I0313 14:32:04.146134 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556872-2tqn4" Mar 13 14:32:04 crc kubenswrapper[4898]: I0313 14:32:04.341796 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qq98k\" (UniqueName: \"kubernetes.io/projected/8627002c-751e-4168-b294-4a324890a996-kube-api-access-qq98k\") pod \"8627002c-751e-4168-b294-4a324890a996\" (UID: \"8627002c-751e-4168-b294-4a324890a996\") " Mar 13 14:32:04 crc kubenswrapper[4898]: I0313 14:32:04.351255 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8627002c-751e-4168-b294-4a324890a996-kube-api-access-qq98k" (OuterVolumeSpecName: "kube-api-access-qq98k") pod "8627002c-751e-4168-b294-4a324890a996" (UID: "8627002c-751e-4168-b294-4a324890a996"). InnerVolumeSpecName "kube-api-access-qq98k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:32:04 crc kubenswrapper[4898]: I0313 14:32:04.445279 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qq98k\" (UniqueName: \"kubernetes.io/projected/8627002c-751e-4168-b294-4a324890a996-kube-api-access-qq98k\") on node \"crc\" DevicePath \"\"" Mar 13 14:32:04 crc kubenswrapper[4898]: I0313 14:32:04.675869 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556872-2tqn4" event={"ID":"8627002c-751e-4168-b294-4a324890a996","Type":"ContainerDied","Data":"3e6c8b225663f1d3b600ffeac1d814f793ea23ad97ff301d21f7aa59626ffb92"} Mar 13 14:32:04 crc kubenswrapper[4898]: I0313 14:32:04.676328 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e6c8b225663f1d3b600ffeac1d814f793ea23ad97ff301d21f7aa59626ffb92" Mar 13 14:32:04 crc kubenswrapper[4898]: I0313 14:32:04.675975 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556872-2tqn4" Mar 13 14:32:05 crc kubenswrapper[4898]: I0313 14:32:05.264176 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556866-wnmcp"] Mar 13 14:32:05 crc kubenswrapper[4898]: I0313 14:32:05.291484 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556866-wnmcp"] Mar 13 14:32:05 crc kubenswrapper[4898]: I0313 14:32:05.759845 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45988deb-1057-4d89-a977-35978404b407" path="/var/lib/kubelet/pods/45988deb-1057-4d89-a977-35978404b407/volumes" Mar 13 14:32:24 crc kubenswrapper[4898]: I0313 14:32:24.088056 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-b9c7-account-create-update-l6h97"] Mar 13 14:32:24 crc kubenswrapper[4898]: I0313 14:32:24.100703 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-978d-account-create-update-tkc22"] Mar 13 14:32:24 crc kubenswrapper[4898]: I0313 14:32:24.117281 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-qlgbl"] Mar 13 14:32:24 crc kubenswrapper[4898]: I0313 14:32:24.126036 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-sqtw8"] Mar 13 14:32:24 crc kubenswrapper[4898]: I0313 14:32:24.134873 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-b9c7-account-create-update-l6h97"] Mar 13 14:32:24 crc kubenswrapper[4898]: I0313 14:32:24.143946 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-978d-account-create-update-tkc22"] Mar 13 14:32:24 crc kubenswrapper[4898]: I0313 14:32:24.153286 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-qlgbl"] Mar 13 14:32:24 crc kubenswrapper[4898]: I0313 14:32:24.161769 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-api-db-create-sqtw8"] Mar 13 14:32:24 crc kubenswrapper[4898]: I0313 14:32:24.593098 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9wh62"] Mar 13 14:32:24 crc kubenswrapper[4898]: E0313 14:32:24.593639 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8627002c-751e-4168-b294-4a324890a996" containerName="oc" Mar 13 14:32:24 crc kubenswrapper[4898]: I0313 14:32:24.593659 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8627002c-751e-4168-b294-4a324890a996" containerName="oc" Mar 13 14:32:24 crc kubenswrapper[4898]: I0313 14:32:24.593953 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="8627002c-751e-4168-b294-4a324890a996" containerName="oc" Mar 13 14:32:24 crc kubenswrapper[4898]: I0313 14:32:24.596019 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9wh62" Mar 13 14:32:24 crc kubenswrapper[4898]: I0313 14:32:24.614655 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9wh62"] Mar 13 14:32:24 crc kubenswrapper[4898]: I0313 14:32:24.742957 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92zng\" (UniqueName: \"kubernetes.io/projected/3d80d972-f659-45e5-9f39-015848cc4031-kube-api-access-92zng\") pod \"community-operators-9wh62\" (UID: \"3d80d972-f659-45e5-9f39-015848cc4031\") " pod="openshift-marketplace/community-operators-9wh62" Mar 13 14:32:24 crc kubenswrapper[4898]: I0313 14:32:24.743030 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d80d972-f659-45e5-9f39-015848cc4031-utilities\") pod \"community-operators-9wh62\" (UID: \"3d80d972-f659-45e5-9f39-015848cc4031\") " pod="openshift-marketplace/community-operators-9wh62" Mar 13 14:32:24 crc 
kubenswrapper[4898]: I0313 14:32:24.743302 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d80d972-f659-45e5-9f39-015848cc4031-catalog-content\") pod \"community-operators-9wh62\" (UID: \"3d80d972-f659-45e5-9f39-015848cc4031\") " pod="openshift-marketplace/community-operators-9wh62" Mar 13 14:32:24 crc kubenswrapper[4898]: I0313 14:32:24.867545 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d80d972-f659-45e5-9f39-015848cc4031-catalog-content\") pod \"community-operators-9wh62\" (UID: \"3d80d972-f659-45e5-9f39-015848cc4031\") " pod="openshift-marketplace/community-operators-9wh62" Mar 13 14:32:24 crc kubenswrapper[4898]: I0313 14:32:24.867642 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92zng\" (UniqueName: \"kubernetes.io/projected/3d80d972-f659-45e5-9f39-015848cc4031-kube-api-access-92zng\") pod \"community-operators-9wh62\" (UID: \"3d80d972-f659-45e5-9f39-015848cc4031\") " pod="openshift-marketplace/community-operators-9wh62" Mar 13 14:32:24 crc kubenswrapper[4898]: I0313 14:32:24.867678 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d80d972-f659-45e5-9f39-015848cc4031-utilities\") pod \"community-operators-9wh62\" (UID: \"3d80d972-f659-45e5-9f39-015848cc4031\") " pod="openshift-marketplace/community-operators-9wh62" Mar 13 14:32:24 crc kubenswrapper[4898]: I0313 14:32:24.868197 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d80d972-f659-45e5-9f39-015848cc4031-catalog-content\") pod \"community-operators-9wh62\" (UID: \"3d80d972-f659-45e5-9f39-015848cc4031\") " pod="openshift-marketplace/community-operators-9wh62" Mar 13 14:32:24 crc 
kubenswrapper[4898]: I0313 14:32:24.868214 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d80d972-f659-45e5-9f39-015848cc4031-utilities\") pod \"community-operators-9wh62\" (UID: \"3d80d972-f659-45e5-9f39-015848cc4031\") " pod="openshift-marketplace/community-operators-9wh62" Mar 13 14:32:24 crc kubenswrapper[4898]: I0313 14:32:24.901247 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92zng\" (UniqueName: \"kubernetes.io/projected/3d80d972-f659-45e5-9f39-015848cc4031-kube-api-access-92zng\") pod \"community-operators-9wh62\" (UID: \"3d80d972-f659-45e5-9f39-015848cc4031\") " pod="openshift-marketplace/community-operators-9wh62" Mar 13 14:32:24 crc kubenswrapper[4898]: I0313 14:32:24.915846 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9wh62" Mar 13 14:32:25 crc kubenswrapper[4898]: I0313 14:32:25.049830 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-zhx84"] Mar 13 14:32:25 crc kubenswrapper[4898]: I0313 14:32:25.066952 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-zhx84"] Mar 13 14:32:25 crc kubenswrapper[4898]: I0313 14:32:25.086673 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-96c4-account-create-update-zsh5t"] Mar 13 14:32:25 crc kubenswrapper[4898]: I0313 14:32:25.095914 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-96c4-account-create-update-zsh5t"] Mar 13 14:32:25 crc kubenswrapper[4898]: I0313 14:32:25.515326 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9wh62"] Mar 13 14:32:25 crc kubenswrapper[4898]: I0313 14:32:25.766879 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="068b0856-126d-487c-9c1d-50299bf90d3a" 
path="/var/lib/kubelet/pods/068b0856-126d-487c-9c1d-50299bf90d3a/volumes" Mar 13 14:32:25 crc kubenswrapper[4898]: I0313 14:32:25.769070 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1abedb18-bf27-42d9-b809-f7226b603a0d" path="/var/lib/kubelet/pods/1abedb18-bf27-42d9-b809-f7226b603a0d/volumes" Mar 13 14:32:25 crc kubenswrapper[4898]: I0313 14:32:25.770758 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29dbeb8a-611d-4513-a063-06d8f865ea93" path="/var/lib/kubelet/pods/29dbeb8a-611d-4513-a063-06d8f865ea93/volumes" Mar 13 14:32:25 crc kubenswrapper[4898]: I0313 14:32:25.779718 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44f1f531-99d1-4b97-bd08-6bf94a7afd92" path="/var/lib/kubelet/pods/44f1f531-99d1-4b97-bd08-6bf94a7afd92/volumes" Mar 13 14:32:25 crc kubenswrapper[4898]: I0313 14:32:25.781773 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="485200a5-cd75-45ac-b93a-b003158132c4" path="/var/lib/kubelet/pods/485200a5-cd75-45ac-b93a-b003158132c4/volumes" Mar 13 14:32:25 crc kubenswrapper[4898]: I0313 14:32:25.783583 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e516311e-fb5c-4901-aaf7-67793ffb5fa2" path="/var/lib/kubelet/pods/e516311e-fb5c-4901-aaf7-67793ffb5fa2/volumes" Mar 13 14:32:25 crc kubenswrapper[4898]: I0313 14:32:25.974649 4898 generic.go:334] "Generic (PLEG): container finished" podID="3d80d972-f659-45e5-9f39-015848cc4031" containerID="3da8a8d11477b1924c1b6562d9af682ad56c935f3f8bbb3b3c39c2fc17706caa" exitCode=0 Mar 13 14:32:25 crc kubenswrapper[4898]: I0313 14:32:25.974708 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9wh62" event={"ID":"3d80d972-f659-45e5-9f39-015848cc4031","Type":"ContainerDied","Data":"3da8a8d11477b1924c1b6562d9af682ad56c935f3f8bbb3b3c39c2fc17706caa"} Mar 13 14:32:25 crc kubenswrapper[4898]: I0313 14:32:25.974756 4898 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-9wh62" event={"ID":"3d80d972-f659-45e5-9f39-015848cc4031","Type":"ContainerStarted","Data":"5b1d9d4e27dd0c848be0c41de4a99a6de77e56cfa4171f17318b70a2ad74984c"} Mar 13 14:32:26 crc kubenswrapper[4898]: I0313 14:32:26.989117 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9wh62" event={"ID":"3d80d972-f659-45e5-9f39-015848cc4031","Type":"ContainerStarted","Data":"ddb4d6866d3c013ad2c724465ffc1a52eb5ee01c575b64ad3b6b88024322b110"} Mar 13 14:32:29 crc kubenswrapper[4898]: I0313 14:32:29.016791 4898 generic.go:334] "Generic (PLEG): container finished" podID="3d80d972-f659-45e5-9f39-015848cc4031" containerID="ddb4d6866d3c013ad2c724465ffc1a52eb5ee01c575b64ad3b6b88024322b110" exitCode=0 Mar 13 14:32:29 crc kubenswrapper[4898]: I0313 14:32:29.016928 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9wh62" event={"ID":"3d80d972-f659-45e5-9f39-015848cc4031","Type":"ContainerDied","Data":"ddb4d6866d3c013ad2c724465ffc1a52eb5ee01c575b64ad3b6b88024322b110"} Mar 13 14:32:30 crc kubenswrapper[4898]: I0313 14:32:30.029355 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9wh62" event={"ID":"3d80d972-f659-45e5-9f39-015848cc4031","Type":"ContainerStarted","Data":"30db5e1021782902f185770668492e533f882be7ceee4ca587e31e93aa387af1"} Mar 13 14:32:30 crc kubenswrapper[4898]: I0313 14:32:30.059660 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9wh62" podStartSLOduration=2.575778135 podStartE2EDuration="6.059628511s" podCreationTimestamp="2026-03-13 14:32:24 +0000 UTC" firstStartedPulling="2026-03-13 14:32:25.977295276 +0000 UTC m=+2180.978883515" lastFinishedPulling="2026-03-13 14:32:29.461145642 +0000 UTC m=+2184.462733891" observedRunningTime="2026-03-13 14:32:30.051849259 +0000 UTC 
m=+2185.053437528" watchObservedRunningTime="2026-03-13 14:32:30.059628511 +0000 UTC m=+2185.061216770" Mar 13 14:32:34 crc kubenswrapper[4898]: I0313 14:32:34.916401 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9wh62" Mar 13 14:32:34 crc kubenswrapper[4898]: I0313 14:32:34.916982 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9wh62" Mar 13 14:32:35 crc kubenswrapper[4898]: I0313 14:32:35.985865 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-9wh62" podUID="3d80d972-f659-45e5-9f39-015848cc4031" containerName="registry-server" probeResult="failure" output=< Mar 13 14:32:35 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 14:32:35 crc kubenswrapper[4898]: > Mar 13 14:32:39 crc kubenswrapper[4898]: I0313 14:32:39.172960 4898 scope.go:117] "RemoveContainer" containerID="8dc6d76d86edf4ca1fbe4589c8cd285e45e3e3a632208ca7817a0099916d678e" Mar 13 14:32:39 crc kubenswrapper[4898]: I0313 14:32:39.203993 4898 scope.go:117] "RemoveContainer" containerID="7b608b8ed7e8f0c428a761cc552755850cceca5a0b694b7beffed50aa397bd7c" Mar 13 14:32:39 crc kubenswrapper[4898]: I0313 14:32:39.287716 4898 scope.go:117] "RemoveContainer" containerID="bf99d1df7658057773b414b4c2a04b114eb5afc6efdb7e3669f974780f737076" Mar 13 14:32:39 crc kubenswrapper[4898]: I0313 14:32:39.339121 4898 scope.go:117] "RemoveContainer" containerID="81721ca65e97448cfc7621215d98c3ea95987d960df48d9fdaef1b644754dcaf" Mar 13 14:32:39 crc kubenswrapper[4898]: I0313 14:32:39.431193 4898 scope.go:117] "RemoveContainer" containerID="388808695f9ab04c5365cf0b1e925d316f5fa781ee2d5e7df5edc4ae9b9090f3" Mar 13 14:32:39 crc kubenswrapper[4898]: I0313 14:32:39.458695 4898 scope.go:117] "RemoveContainer" containerID="990c87abba6cb97ddcc9ff9fc6fc009eeddad1ca5f265916404c317707eccb6f" Mar 13 14:32:39 crc 
kubenswrapper[4898]: I0313 14:32:39.527070 4898 scope.go:117] "RemoveContainer" containerID="c911be7c2d6d8f32598481ced3a29ce9fc65efe653b0696b937918e79b814d51" Mar 13 14:32:39 crc kubenswrapper[4898]: I0313 14:32:39.550252 4898 scope.go:117] "RemoveContainer" containerID="bb0c3bbe6081c4840c08e83355257eccaf898a1f20509cda6e941396919762d8" Mar 13 14:32:40 crc kubenswrapper[4898]: I0313 14:32:40.141960 4898 generic.go:334] "Generic (PLEG): container finished" podID="295e7c32-75f1-4eee-a126-2d4547c56f24" containerID="dab0dfbc55bfc6f1df629f5c9477ac3d48d09670fceb17a5d6b9747c56387330" exitCode=0 Mar 13 14:32:40 crc kubenswrapper[4898]: I0313 14:32:40.142008 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-57hwn" event={"ID":"295e7c32-75f1-4eee-a126-2d4547c56f24","Type":"ContainerDied","Data":"dab0dfbc55bfc6f1df629f5c9477ac3d48d09670fceb17a5d6b9747c56387330"} Mar 13 14:32:41 crc kubenswrapper[4898]: I0313 14:32:41.721869 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-57hwn" Mar 13 14:32:41 crc kubenswrapper[4898]: I0313 14:32:41.839570 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pw4q2\" (UniqueName: \"kubernetes.io/projected/295e7c32-75f1-4eee-a126-2d4547c56f24-kube-api-access-pw4q2\") pod \"295e7c32-75f1-4eee-a126-2d4547c56f24\" (UID: \"295e7c32-75f1-4eee-a126-2d4547c56f24\") " Mar 13 14:32:41 crc kubenswrapper[4898]: I0313 14:32:41.839911 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/295e7c32-75f1-4eee-a126-2d4547c56f24-inventory\") pod \"295e7c32-75f1-4eee-a126-2d4547c56f24\" (UID: \"295e7c32-75f1-4eee-a126-2d4547c56f24\") " Mar 13 14:32:41 crc kubenswrapper[4898]: I0313 14:32:41.840011 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/295e7c32-75f1-4eee-a126-2d4547c56f24-ssh-key-openstack-edpm-ipam\") pod \"295e7c32-75f1-4eee-a126-2d4547c56f24\" (UID: \"295e7c32-75f1-4eee-a126-2d4547c56f24\") " Mar 13 14:32:41 crc kubenswrapper[4898]: I0313 14:32:41.846714 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/295e7c32-75f1-4eee-a126-2d4547c56f24-kube-api-access-pw4q2" (OuterVolumeSpecName: "kube-api-access-pw4q2") pod "295e7c32-75f1-4eee-a126-2d4547c56f24" (UID: "295e7c32-75f1-4eee-a126-2d4547c56f24"). InnerVolumeSpecName "kube-api-access-pw4q2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:32:41 crc kubenswrapper[4898]: I0313 14:32:41.873125 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/295e7c32-75f1-4eee-a126-2d4547c56f24-inventory" (OuterVolumeSpecName: "inventory") pod "295e7c32-75f1-4eee-a126-2d4547c56f24" (UID: "295e7c32-75f1-4eee-a126-2d4547c56f24"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:32:41 crc kubenswrapper[4898]: I0313 14:32:41.881308 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/295e7c32-75f1-4eee-a126-2d4547c56f24-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "295e7c32-75f1-4eee-a126-2d4547c56f24" (UID: "295e7c32-75f1-4eee-a126-2d4547c56f24"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:32:41 crc kubenswrapper[4898]: I0313 14:32:41.943451 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/295e7c32-75f1-4eee-a126-2d4547c56f24-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 14:32:41 crc kubenswrapper[4898]: I0313 14:32:41.943488 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/295e7c32-75f1-4eee-a126-2d4547c56f24-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 14:32:41 crc kubenswrapper[4898]: I0313 14:32:41.943499 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pw4q2\" (UniqueName: \"kubernetes.io/projected/295e7c32-75f1-4eee-a126-2d4547c56f24-kube-api-access-pw4q2\") on node \"crc\" DevicePath \"\"" Mar 13 14:32:42 crc kubenswrapper[4898]: I0313 14:32:42.169733 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-57hwn" event={"ID":"295e7c32-75f1-4eee-a126-2d4547c56f24","Type":"ContainerDied","Data":"8f16b5e15ed9fbaac5ba0b0d72bb41f2c1ee0e47dc35b9e1544ffaf57639e658"} Mar 13 14:32:42 crc kubenswrapper[4898]: I0313 14:32:42.169782 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f16b5e15ed9fbaac5ba0b0d72bb41f2c1ee0e47dc35b9e1544ffaf57639e658" Mar 13 14:32:42 crc kubenswrapper[4898]: I0313 
14:32:42.169807 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-57hwn" Mar 13 14:32:42 crc kubenswrapper[4898]: I0313 14:32:42.329586 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c"] Mar 13 14:32:42 crc kubenswrapper[4898]: E0313 14:32:42.330717 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="295e7c32-75f1-4eee-a126-2d4547c56f24" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 13 14:32:42 crc kubenswrapper[4898]: I0313 14:32:42.330832 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="295e7c32-75f1-4eee-a126-2d4547c56f24" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 13 14:32:42 crc kubenswrapper[4898]: I0313 14:32:42.331216 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="295e7c32-75f1-4eee-a126-2d4547c56f24" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 13 14:32:42 crc kubenswrapper[4898]: I0313 14:32:42.332344 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c" Mar 13 14:32:42 crc kubenswrapper[4898]: I0313 14:32:42.334363 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 14:32:42 crc kubenswrapper[4898]: I0313 14:32:42.336421 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 14:32:42 crc kubenswrapper[4898]: I0313 14:32:42.336470 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zsddr" Mar 13 14:32:42 crc kubenswrapper[4898]: I0313 14:32:42.337270 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 14:32:42 crc kubenswrapper[4898]: I0313 14:32:42.348109 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c"] Mar 13 14:32:42 crc kubenswrapper[4898]: I0313 14:32:42.458102 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/271e9163-4e9c-4c79-a0b4-be373e97956c-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c\" (UID: \"271e9163-4e9c-4c79-a0b4-be373e97956c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c" Mar 13 14:32:42 crc kubenswrapper[4898]: I0313 14:32:42.458148 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wj8f\" (UniqueName: \"kubernetes.io/projected/271e9163-4e9c-4c79-a0b4-be373e97956c-kube-api-access-4wj8f\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c\" (UID: \"271e9163-4e9c-4c79-a0b4-be373e97956c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c" Mar 13 
14:32:42 crc kubenswrapper[4898]: I0313 14:32:42.459037 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/271e9163-4e9c-4c79-a0b4-be373e97956c-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c\" (UID: \"271e9163-4e9c-4c79-a0b4-be373e97956c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c" Mar 13 14:32:42 crc kubenswrapper[4898]: I0313 14:32:42.561655 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/271e9163-4e9c-4c79-a0b4-be373e97956c-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c\" (UID: \"271e9163-4e9c-4c79-a0b4-be373e97956c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c" Mar 13 14:32:42 crc kubenswrapper[4898]: I0313 14:32:42.561730 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/271e9163-4e9c-4c79-a0b4-be373e97956c-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c\" (UID: \"271e9163-4e9c-4c79-a0b4-be373e97956c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c" Mar 13 14:32:42 crc kubenswrapper[4898]: I0313 14:32:42.561759 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wj8f\" (UniqueName: \"kubernetes.io/projected/271e9163-4e9c-4c79-a0b4-be373e97956c-kube-api-access-4wj8f\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c\" (UID: \"271e9163-4e9c-4c79-a0b4-be373e97956c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c" Mar 13 14:32:42 crc kubenswrapper[4898]: I0313 14:32:42.567720 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/271e9163-4e9c-4c79-a0b4-be373e97956c-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c\" (UID: \"271e9163-4e9c-4c79-a0b4-be373e97956c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c" Mar 13 14:32:42 crc kubenswrapper[4898]: I0313 14:32:42.568130 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/271e9163-4e9c-4c79-a0b4-be373e97956c-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c\" (UID: \"271e9163-4e9c-4c79-a0b4-be373e97956c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c" Mar 13 14:32:42 crc kubenswrapper[4898]: I0313 14:32:42.579689 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wj8f\" (UniqueName: \"kubernetes.io/projected/271e9163-4e9c-4c79-a0b4-be373e97956c-kube-api-access-4wj8f\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c\" (UID: \"271e9163-4e9c-4c79-a0b4-be373e97956c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c" Mar 13 14:32:42 crc kubenswrapper[4898]: I0313 14:32:42.660116 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c" Mar 13 14:32:43 crc kubenswrapper[4898]: W0313 14:32:43.359785 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod271e9163_4e9c_4c79_a0b4_be373e97956c.slice/crio-4462e1eeacdaca47ef14f2208ad82e41bbd408e210fbc71a917708ba15842dcc WatchSource:0}: Error finding container 4462e1eeacdaca47ef14f2208ad82e41bbd408e210fbc71a917708ba15842dcc: Status 404 returned error can't find the container with id 4462e1eeacdaca47ef14f2208ad82e41bbd408e210fbc71a917708ba15842dcc Mar 13 14:32:43 crc kubenswrapper[4898]: I0313 14:32:43.363788 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c"] Mar 13 14:32:44 crc kubenswrapper[4898]: I0313 14:32:44.194698 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c" event={"ID":"271e9163-4e9c-4c79-a0b4-be373e97956c","Type":"ContainerStarted","Data":"9a2d91dd6fb576c0d414bdb1108d49b85beeded77fb65ea17ffcfe9b08112c4c"} Mar 13 14:32:44 crc kubenswrapper[4898]: I0313 14:32:44.195315 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c" event={"ID":"271e9163-4e9c-4c79-a0b4-be373e97956c","Type":"ContainerStarted","Data":"4462e1eeacdaca47ef14f2208ad82e41bbd408e210fbc71a917708ba15842dcc"} Mar 13 14:32:44 crc kubenswrapper[4898]: I0313 14:32:44.228070 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c" podStartSLOduration=1.764817643 podStartE2EDuration="2.228040034s" podCreationTimestamp="2026-03-13 14:32:42 +0000 UTC" firstStartedPulling="2026-03-13 14:32:43.36269529 +0000 UTC m=+2198.364283529" lastFinishedPulling="2026-03-13 14:32:43.825917641 +0000 UTC 
m=+2198.827505920" observedRunningTime="2026-03-13 14:32:44.211055513 +0000 UTC m=+2199.212643792" watchObservedRunningTime="2026-03-13 14:32:44.228040034 +0000 UTC m=+2199.229628313" Mar 13 14:32:44 crc kubenswrapper[4898]: I0313 14:32:44.982502 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9wh62" Mar 13 14:32:45 crc kubenswrapper[4898]: I0313 14:32:45.055159 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9wh62" Mar 13 14:32:45 crc kubenswrapper[4898]: I0313 14:32:45.226226 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9wh62"] Mar 13 14:32:46 crc kubenswrapper[4898]: I0313 14:32:46.216218 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9wh62" podUID="3d80d972-f659-45e5-9f39-015848cc4031" containerName="registry-server" containerID="cri-o://30db5e1021782902f185770668492e533f882be7ceee4ca587e31e93aa387af1" gracePeriod=2 Mar 13 14:32:46 crc kubenswrapper[4898]: I0313 14:32:46.828717 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9wh62" Mar 13 14:32:46 crc kubenswrapper[4898]: I0313 14:32:46.976441 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92zng\" (UniqueName: \"kubernetes.io/projected/3d80d972-f659-45e5-9f39-015848cc4031-kube-api-access-92zng\") pod \"3d80d972-f659-45e5-9f39-015848cc4031\" (UID: \"3d80d972-f659-45e5-9f39-015848cc4031\") " Mar 13 14:32:46 crc kubenswrapper[4898]: I0313 14:32:46.976552 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d80d972-f659-45e5-9f39-015848cc4031-catalog-content\") pod \"3d80d972-f659-45e5-9f39-015848cc4031\" (UID: \"3d80d972-f659-45e5-9f39-015848cc4031\") " Mar 13 14:32:46 crc kubenswrapper[4898]: I0313 14:32:46.976595 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d80d972-f659-45e5-9f39-015848cc4031-utilities\") pod \"3d80d972-f659-45e5-9f39-015848cc4031\" (UID: \"3d80d972-f659-45e5-9f39-015848cc4031\") " Mar 13 14:32:46 crc kubenswrapper[4898]: I0313 14:32:46.977521 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d80d972-f659-45e5-9f39-015848cc4031-utilities" (OuterVolumeSpecName: "utilities") pod "3d80d972-f659-45e5-9f39-015848cc4031" (UID: "3d80d972-f659-45e5-9f39-015848cc4031"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:32:46 crc kubenswrapper[4898]: I0313 14:32:46.982605 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d80d972-f659-45e5-9f39-015848cc4031-kube-api-access-92zng" (OuterVolumeSpecName: "kube-api-access-92zng") pod "3d80d972-f659-45e5-9f39-015848cc4031" (UID: "3d80d972-f659-45e5-9f39-015848cc4031"). InnerVolumeSpecName "kube-api-access-92zng". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:32:47 crc kubenswrapper[4898]: I0313 14:32:47.041620 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d80d972-f659-45e5-9f39-015848cc4031-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3d80d972-f659-45e5-9f39-015848cc4031" (UID: "3d80d972-f659-45e5-9f39-015848cc4031"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:32:47 crc kubenswrapper[4898]: I0313 14:32:47.079153 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92zng\" (UniqueName: \"kubernetes.io/projected/3d80d972-f659-45e5-9f39-015848cc4031-kube-api-access-92zng\") on node \"crc\" DevicePath \"\"" Mar 13 14:32:47 crc kubenswrapper[4898]: I0313 14:32:47.079190 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d80d972-f659-45e5-9f39-015848cc4031-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 14:32:47 crc kubenswrapper[4898]: I0313 14:32:47.079199 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d80d972-f659-45e5-9f39-015848cc4031-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 14:32:47 crc kubenswrapper[4898]: I0313 14:32:47.226930 4898 generic.go:334] "Generic (PLEG): container finished" podID="3d80d972-f659-45e5-9f39-015848cc4031" containerID="30db5e1021782902f185770668492e533f882be7ceee4ca587e31e93aa387af1" exitCode=0 Mar 13 14:32:47 crc kubenswrapper[4898]: I0313 14:32:47.226995 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9wh62" Mar 13 14:32:47 crc kubenswrapper[4898]: I0313 14:32:47.227014 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9wh62" event={"ID":"3d80d972-f659-45e5-9f39-015848cc4031","Type":"ContainerDied","Data":"30db5e1021782902f185770668492e533f882be7ceee4ca587e31e93aa387af1"} Mar 13 14:32:47 crc kubenswrapper[4898]: I0313 14:32:47.228203 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9wh62" event={"ID":"3d80d972-f659-45e5-9f39-015848cc4031","Type":"ContainerDied","Data":"5b1d9d4e27dd0c848be0c41de4a99a6de77e56cfa4171f17318b70a2ad74984c"} Mar 13 14:32:47 crc kubenswrapper[4898]: I0313 14:32:47.228236 4898 scope.go:117] "RemoveContainer" containerID="30db5e1021782902f185770668492e533f882be7ceee4ca587e31e93aa387af1" Mar 13 14:32:47 crc kubenswrapper[4898]: I0313 14:32:47.265846 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9wh62"] Mar 13 14:32:47 crc kubenswrapper[4898]: I0313 14:32:47.275652 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9wh62"] Mar 13 14:32:47 crc kubenswrapper[4898]: I0313 14:32:47.278570 4898 scope.go:117] "RemoveContainer" containerID="ddb4d6866d3c013ad2c724465ffc1a52eb5ee01c575b64ad3b6b88024322b110" Mar 13 14:32:47 crc kubenswrapper[4898]: I0313 14:32:47.304117 4898 scope.go:117] "RemoveContainer" containerID="3da8a8d11477b1924c1b6562d9af682ad56c935f3f8bbb3b3c39c2fc17706caa" Mar 13 14:32:47 crc kubenswrapper[4898]: I0313 14:32:47.365302 4898 scope.go:117] "RemoveContainer" containerID="30db5e1021782902f185770668492e533f882be7ceee4ca587e31e93aa387af1" Mar 13 14:32:47 crc kubenswrapper[4898]: E0313 14:32:47.365830 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"30db5e1021782902f185770668492e533f882be7ceee4ca587e31e93aa387af1\": container with ID starting with 30db5e1021782902f185770668492e533f882be7ceee4ca587e31e93aa387af1 not found: ID does not exist" containerID="30db5e1021782902f185770668492e533f882be7ceee4ca587e31e93aa387af1" Mar 13 14:32:47 crc kubenswrapper[4898]: I0313 14:32:47.365862 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30db5e1021782902f185770668492e533f882be7ceee4ca587e31e93aa387af1"} err="failed to get container status \"30db5e1021782902f185770668492e533f882be7ceee4ca587e31e93aa387af1\": rpc error: code = NotFound desc = could not find container \"30db5e1021782902f185770668492e533f882be7ceee4ca587e31e93aa387af1\": container with ID starting with 30db5e1021782902f185770668492e533f882be7ceee4ca587e31e93aa387af1 not found: ID does not exist" Mar 13 14:32:47 crc kubenswrapper[4898]: I0313 14:32:47.365883 4898 scope.go:117] "RemoveContainer" containerID="ddb4d6866d3c013ad2c724465ffc1a52eb5ee01c575b64ad3b6b88024322b110" Mar 13 14:32:47 crc kubenswrapper[4898]: E0313 14:32:47.366510 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddb4d6866d3c013ad2c724465ffc1a52eb5ee01c575b64ad3b6b88024322b110\": container with ID starting with ddb4d6866d3c013ad2c724465ffc1a52eb5ee01c575b64ad3b6b88024322b110 not found: ID does not exist" containerID="ddb4d6866d3c013ad2c724465ffc1a52eb5ee01c575b64ad3b6b88024322b110" Mar 13 14:32:47 crc kubenswrapper[4898]: I0313 14:32:47.366557 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddb4d6866d3c013ad2c724465ffc1a52eb5ee01c575b64ad3b6b88024322b110"} err="failed to get container status \"ddb4d6866d3c013ad2c724465ffc1a52eb5ee01c575b64ad3b6b88024322b110\": rpc error: code = NotFound desc = could not find container \"ddb4d6866d3c013ad2c724465ffc1a52eb5ee01c575b64ad3b6b88024322b110\": container with ID 
starting with ddb4d6866d3c013ad2c724465ffc1a52eb5ee01c575b64ad3b6b88024322b110 not found: ID does not exist" Mar 13 14:32:47 crc kubenswrapper[4898]: I0313 14:32:47.366589 4898 scope.go:117] "RemoveContainer" containerID="3da8a8d11477b1924c1b6562d9af682ad56c935f3f8bbb3b3c39c2fc17706caa" Mar 13 14:32:47 crc kubenswrapper[4898]: E0313 14:32:47.367047 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3da8a8d11477b1924c1b6562d9af682ad56c935f3f8bbb3b3c39c2fc17706caa\": container with ID starting with 3da8a8d11477b1924c1b6562d9af682ad56c935f3f8bbb3b3c39c2fc17706caa not found: ID does not exist" containerID="3da8a8d11477b1924c1b6562d9af682ad56c935f3f8bbb3b3c39c2fc17706caa" Mar 13 14:32:47 crc kubenswrapper[4898]: I0313 14:32:47.367134 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3da8a8d11477b1924c1b6562d9af682ad56c935f3f8bbb3b3c39c2fc17706caa"} err="failed to get container status \"3da8a8d11477b1924c1b6562d9af682ad56c935f3f8bbb3b3c39c2fc17706caa\": rpc error: code = NotFound desc = could not find container \"3da8a8d11477b1924c1b6562d9af682ad56c935f3f8bbb3b3c39c2fc17706caa\": container with ID starting with 3da8a8d11477b1924c1b6562d9af682ad56c935f3f8bbb3b3c39c2fc17706caa not found: ID does not exist" Mar 13 14:32:47 crc kubenswrapper[4898]: I0313 14:32:47.752228 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d80d972-f659-45e5-9f39-015848cc4031" path="/var/lib/kubelet/pods/3d80d972-f659-45e5-9f39-015848cc4031/volumes" Mar 13 14:32:49 crc kubenswrapper[4898]: I0313 14:32:49.134310 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:32:49 crc kubenswrapper[4898]: I0313 
14:32:49.134674 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:32:49 crc kubenswrapper[4898]: I0313 14:32:49.254795 4898 generic.go:334] "Generic (PLEG): container finished" podID="271e9163-4e9c-4c79-a0b4-be373e97956c" containerID="9a2d91dd6fb576c0d414bdb1108d49b85beeded77fb65ea17ffcfe9b08112c4c" exitCode=0 Mar 13 14:32:49 crc kubenswrapper[4898]: I0313 14:32:49.254839 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c" event={"ID":"271e9163-4e9c-4c79-a0b4-be373e97956c","Type":"ContainerDied","Data":"9a2d91dd6fb576c0d414bdb1108d49b85beeded77fb65ea17ffcfe9b08112c4c"} Mar 13 14:32:50 crc kubenswrapper[4898]: I0313 14:32:50.851124 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c" Mar 13 14:32:50 crc kubenswrapper[4898]: I0313 14:32:50.993304 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/271e9163-4e9c-4c79-a0b4-be373e97956c-ssh-key-openstack-edpm-ipam\") pod \"271e9163-4e9c-4c79-a0b4-be373e97956c\" (UID: \"271e9163-4e9c-4c79-a0b4-be373e97956c\") " Mar 13 14:32:50 crc kubenswrapper[4898]: I0313 14:32:50.993357 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/271e9163-4e9c-4c79-a0b4-be373e97956c-inventory\") pod \"271e9163-4e9c-4c79-a0b4-be373e97956c\" (UID: \"271e9163-4e9c-4c79-a0b4-be373e97956c\") " Mar 13 14:32:50 crc kubenswrapper[4898]: I0313 14:32:50.993427 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wj8f\" (UniqueName: \"kubernetes.io/projected/271e9163-4e9c-4c79-a0b4-be373e97956c-kube-api-access-4wj8f\") pod \"271e9163-4e9c-4c79-a0b4-be373e97956c\" (UID: \"271e9163-4e9c-4c79-a0b4-be373e97956c\") " Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.003214 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/271e9163-4e9c-4c79-a0b4-be373e97956c-kube-api-access-4wj8f" (OuterVolumeSpecName: "kube-api-access-4wj8f") pod "271e9163-4e9c-4c79-a0b4-be373e97956c" (UID: "271e9163-4e9c-4c79-a0b4-be373e97956c"). InnerVolumeSpecName "kube-api-access-4wj8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.032357 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/271e9163-4e9c-4c79-a0b4-be373e97956c-inventory" (OuterVolumeSpecName: "inventory") pod "271e9163-4e9c-4c79-a0b4-be373e97956c" (UID: "271e9163-4e9c-4c79-a0b4-be373e97956c"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.038651 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/271e9163-4e9c-4c79-a0b4-be373e97956c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "271e9163-4e9c-4c79-a0b4-be373e97956c" (UID: "271e9163-4e9c-4c79-a0b4-be373e97956c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.096601 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/271e9163-4e9c-4c79-a0b4-be373e97956c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.096635 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/271e9163-4e9c-4c79-a0b4-be373e97956c-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.096644 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wj8f\" (UniqueName: \"kubernetes.io/projected/271e9163-4e9c-4c79-a0b4-be373e97956c-kube-api-access-4wj8f\") on node \"crc\" DevicePath \"\"" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.281267 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c" event={"ID":"271e9163-4e9c-4c79-a0b4-be373e97956c","Type":"ContainerDied","Data":"4462e1eeacdaca47ef14f2208ad82e41bbd408e210fbc71a917708ba15842dcc"} Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.281320 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4462e1eeacdaca47ef14f2208ad82e41bbd408e210fbc71a917708ba15842dcc" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 
14:32:51.281335 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.362479 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-tr644"] Mar 13 14:32:51 crc kubenswrapper[4898]: E0313 14:32:51.363128 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d80d972-f659-45e5-9f39-015848cc4031" containerName="registry-server" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.363149 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d80d972-f659-45e5-9f39-015848cc4031" containerName="registry-server" Mar 13 14:32:51 crc kubenswrapper[4898]: E0313 14:32:51.363180 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d80d972-f659-45e5-9f39-015848cc4031" containerName="extract-content" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.363187 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d80d972-f659-45e5-9f39-015848cc4031" containerName="extract-content" Mar 13 14:32:51 crc kubenswrapper[4898]: E0313 14:32:51.363224 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d80d972-f659-45e5-9f39-015848cc4031" containerName="extract-utilities" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.363232 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d80d972-f659-45e5-9f39-015848cc4031" containerName="extract-utilities" Mar 13 14:32:51 crc kubenswrapper[4898]: E0313 14:32:51.363261 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="271e9163-4e9c-4c79-a0b4-be373e97956c" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.363270 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="271e9163-4e9c-4c79-a0b4-be373e97956c" 
containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.363575 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d80d972-f659-45e5-9f39-015848cc4031" containerName="registry-server" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.363622 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="271e9163-4e9c-4c79-a0b4-be373e97956c" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.364732 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tr644" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.366868 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.367049 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.367259 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zsddr" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.368680 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.374792 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-tr644"] Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.508993 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szptc\" (UniqueName: \"kubernetes.io/projected/8b4abb6a-5797-47be-96a0-69173649e5fa-kube-api-access-szptc\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tr644\" (UID: 
\"8b4abb6a-5797-47be-96a0-69173649e5fa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tr644" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.509438 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b4abb6a-5797-47be-96a0-69173649e5fa-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tr644\" (UID: \"8b4abb6a-5797-47be-96a0-69173649e5fa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tr644" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.509837 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8b4abb6a-5797-47be-96a0-69173649e5fa-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tr644\" (UID: \"8b4abb6a-5797-47be-96a0-69173649e5fa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tr644" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.612661 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8b4abb6a-5797-47be-96a0-69173649e5fa-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tr644\" (UID: \"8b4abb6a-5797-47be-96a0-69173649e5fa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tr644" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.612727 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szptc\" (UniqueName: \"kubernetes.io/projected/8b4abb6a-5797-47be-96a0-69173649e5fa-kube-api-access-szptc\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tr644\" (UID: \"8b4abb6a-5797-47be-96a0-69173649e5fa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tr644" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 
14:32:51.612789 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b4abb6a-5797-47be-96a0-69173649e5fa-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tr644\" (UID: \"8b4abb6a-5797-47be-96a0-69173649e5fa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tr644" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.616608 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b4abb6a-5797-47be-96a0-69173649e5fa-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tr644\" (UID: \"8b4abb6a-5797-47be-96a0-69173649e5fa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tr644" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.616915 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8b4abb6a-5797-47be-96a0-69173649e5fa-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tr644\" (UID: \"8b4abb6a-5797-47be-96a0-69173649e5fa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tr644" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.631036 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szptc\" (UniqueName: \"kubernetes.io/projected/8b4abb6a-5797-47be-96a0-69173649e5fa-kube-api-access-szptc\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tr644\" (UID: \"8b4abb6a-5797-47be-96a0-69173649e5fa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tr644" Mar 13 14:32:51 crc kubenswrapper[4898]: I0313 14:32:51.703797 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tr644" Mar 13 14:32:52 crc kubenswrapper[4898]: I0313 14:32:52.447573 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-tr644"] Mar 13 14:32:53 crc kubenswrapper[4898]: I0313 14:32:53.303602 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tr644" event={"ID":"8b4abb6a-5797-47be-96a0-69173649e5fa","Type":"ContainerStarted","Data":"4c566f53698dc8ae77e0ae8b01f4afc7b8880694d922ac5eedd84e461b0b86f6"} Mar 13 14:32:53 crc kubenswrapper[4898]: I0313 14:32:53.304059 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tr644" event={"ID":"8b4abb6a-5797-47be-96a0-69173649e5fa","Type":"ContainerStarted","Data":"f1afaa6de53cd09356160f6667e20e603147bc41c6c165a1557006dc66536ae1"} Mar 13 14:32:53 crc kubenswrapper[4898]: I0313 14:32:53.332504 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tr644" podStartSLOduration=1.8176499019999999 podStartE2EDuration="2.332477537s" podCreationTimestamp="2026-03-13 14:32:51 +0000 UTC" firstStartedPulling="2026-03-13 14:32:52.468415668 +0000 UTC m=+2207.470003927" lastFinishedPulling="2026-03-13 14:32:52.983243283 +0000 UTC m=+2207.984831562" observedRunningTime="2026-03-13 14:32:53.323033799 +0000 UTC m=+2208.324622028" watchObservedRunningTime="2026-03-13 14:32:53.332477537 +0000 UTC m=+2208.334065786" Mar 13 14:32:58 crc kubenswrapper[4898]: I0313 14:32:58.052144 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qps2v"] Mar 13 14:32:58 crc kubenswrapper[4898]: I0313 14:32:58.064281 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qps2v"] Mar 13 14:32:59 crc kubenswrapper[4898]: 
I0313 14:32:59.752796 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7eba407c-68a5-45e9-ab51-e8cba05d8559" path="/var/lib/kubelet/pods/7eba407c-68a5-45e9-ab51-e8cba05d8559/volumes" Mar 13 14:33:19 crc kubenswrapper[4898]: I0313 14:33:19.135163 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:33:19 crc kubenswrapper[4898]: I0313 14:33:19.135840 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:33:20 crc kubenswrapper[4898]: I0313 14:33:20.060464 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-76lrv"] Mar 13 14:33:20 crc kubenswrapper[4898]: I0313 14:33:20.083867 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-llbn5"] Mar 13 14:33:20 crc kubenswrapper[4898]: I0313 14:33:20.096887 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-76lrv"] Mar 13 14:33:20 crc kubenswrapper[4898]: I0313 14:33:20.110380 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-llbn5"] Mar 13 14:33:21 crc kubenswrapper[4898]: I0313 14:33:21.757868 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04183e35-79b0-4c76-b538-b5b71299cd92" path="/var/lib/kubelet/pods/04183e35-79b0-4c76-b538-b5b71299cd92/volumes" Mar 13 14:33:21 crc kubenswrapper[4898]: I0313 14:33:21.759479 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="bff908e4-09f4-490b-9b9c-ef65c6224eeb" path="/var/lib/kubelet/pods/bff908e4-09f4-490b-9b9c-ef65c6224eeb/volumes" Mar 13 14:33:28 crc kubenswrapper[4898]: I0313 14:33:28.432096 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mt6t6"] Mar 13 14:33:28 crc kubenswrapper[4898]: I0313 14:33:28.436466 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mt6t6" Mar 13 14:33:28 crc kubenswrapper[4898]: I0313 14:33:28.449486 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mt6t6"] Mar 13 14:33:28 crc kubenswrapper[4898]: I0313 14:33:28.517798 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a1116c6-c423-4585-af50-c9ecdca3720e-catalog-content\") pod \"redhat-operators-mt6t6\" (UID: \"5a1116c6-c423-4585-af50-c9ecdca3720e\") " pod="openshift-marketplace/redhat-operators-mt6t6" Mar 13 14:33:28 crc kubenswrapper[4898]: I0313 14:33:28.517871 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68xrp\" (UniqueName: \"kubernetes.io/projected/5a1116c6-c423-4585-af50-c9ecdca3720e-kube-api-access-68xrp\") pod \"redhat-operators-mt6t6\" (UID: \"5a1116c6-c423-4585-af50-c9ecdca3720e\") " pod="openshift-marketplace/redhat-operators-mt6t6" Mar 13 14:33:28 crc kubenswrapper[4898]: I0313 14:33:28.518458 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a1116c6-c423-4585-af50-c9ecdca3720e-utilities\") pod \"redhat-operators-mt6t6\" (UID: \"5a1116c6-c423-4585-af50-c9ecdca3720e\") " pod="openshift-marketplace/redhat-operators-mt6t6" Mar 13 14:33:28 crc kubenswrapper[4898]: I0313 14:33:28.623038 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a1116c6-c423-4585-af50-c9ecdca3720e-utilities\") pod \"redhat-operators-mt6t6\" (UID: \"5a1116c6-c423-4585-af50-c9ecdca3720e\") " pod="openshift-marketplace/redhat-operators-mt6t6" Mar 13 14:33:28 crc kubenswrapper[4898]: I0313 14:33:28.623630 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a1116c6-c423-4585-af50-c9ecdca3720e-catalog-content\") pod \"redhat-operators-mt6t6\" (UID: \"5a1116c6-c423-4585-af50-c9ecdca3720e\") " pod="openshift-marketplace/redhat-operators-mt6t6" Mar 13 14:33:28 crc kubenswrapper[4898]: I0313 14:33:28.623686 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68xrp\" (UniqueName: \"kubernetes.io/projected/5a1116c6-c423-4585-af50-c9ecdca3720e-kube-api-access-68xrp\") pod \"redhat-operators-mt6t6\" (UID: \"5a1116c6-c423-4585-af50-c9ecdca3720e\") " pod="openshift-marketplace/redhat-operators-mt6t6" Mar 13 14:33:28 crc kubenswrapper[4898]: I0313 14:33:28.624891 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a1116c6-c423-4585-af50-c9ecdca3720e-utilities\") pod \"redhat-operators-mt6t6\" (UID: \"5a1116c6-c423-4585-af50-c9ecdca3720e\") " pod="openshift-marketplace/redhat-operators-mt6t6" Mar 13 14:33:28 crc kubenswrapper[4898]: I0313 14:33:28.625232 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a1116c6-c423-4585-af50-c9ecdca3720e-catalog-content\") pod \"redhat-operators-mt6t6\" (UID: \"5a1116c6-c423-4585-af50-c9ecdca3720e\") " pod="openshift-marketplace/redhat-operators-mt6t6" Mar 13 14:33:28 crc kubenswrapper[4898]: I0313 14:33:28.646417 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68xrp\" 
(UniqueName: \"kubernetes.io/projected/5a1116c6-c423-4585-af50-c9ecdca3720e-kube-api-access-68xrp\") pod \"redhat-operators-mt6t6\" (UID: \"5a1116c6-c423-4585-af50-c9ecdca3720e\") " pod="openshift-marketplace/redhat-operators-mt6t6" Mar 13 14:33:28 crc kubenswrapper[4898]: I0313 14:33:28.779887 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mt6t6" Mar 13 14:33:29 crc kubenswrapper[4898]: I0313 14:33:29.327350 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mt6t6"] Mar 13 14:33:30 crc kubenswrapper[4898]: I0313 14:33:30.019206 4898 generic.go:334] "Generic (PLEG): container finished" podID="5a1116c6-c423-4585-af50-c9ecdca3720e" containerID="affc0f5381f8e5277306c781cff3a6263e539776e352fbab467b081645fab210" exitCode=0 Mar 13 14:33:30 crc kubenswrapper[4898]: I0313 14:33:30.019260 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mt6t6" event={"ID":"5a1116c6-c423-4585-af50-c9ecdca3720e","Type":"ContainerDied","Data":"affc0f5381f8e5277306c781cff3a6263e539776e352fbab467b081645fab210"} Mar 13 14:33:30 crc kubenswrapper[4898]: I0313 14:33:30.019445 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mt6t6" event={"ID":"5a1116c6-c423-4585-af50-c9ecdca3720e","Type":"ContainerStarted","Data":"b30638434c0fe439393ecdc839cda22c240c59580f70c0f1734ebb6f4ce66486"} Mar 13 14:33:31 crc kubenswrapper[4898]: I0313 14:33:31.038596 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mt6t6" event={"ID":"5a1116c6-c423-4585-af50-c9ecdca3720e","Type":"ContainerStarted","Data":"7db8dd08296576b12cfca554683282ed44b44eb4b878ac17306195ad12f12916"} Mar 13 14:33:32 crc kubenswrapper[4898]: I0313 14:33:32.052762 4898 generic.go:334] "Generic (PLEG): container finished" podID="8b4abb6a-5797-47be-96a0-69173649e5fa" 
containerID="4c566f53698dc8ae77e0ae8b01f4afc7b8880694d922ac5eedd84e461b0b86f6" exitCode=0 Mar 13 14:33:32 crc kubenswrapper[4898]: I0313 14:33:32.052816 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tr644" event={"ID":"8b4abb6a-5797-47be-96a0-69173649e5fa","Type":"ContainerDied","Data":"4c566f53698dc8ae77e0ae8b01f4afc7b8880694d922ac5eedd84e461b0b86f6"} Mar 13 14:33:33 crc kubenswrapper[4898]: I0313 14:33:33.676677 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tr644" Mar 13 14:33:33 crc kubenswrapper[4898]: I0313 14:33:33.769423 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8b4abb6a-5797-47be-96a0-69173649e5fa-ssh-key-openstack-edpm-ipam\") pod \"8b4abb6a-5797-47be-96a0-69173649e5fa\" (UID: \"8b4abb6a-5797-47be-96a0-69173649e5fa\") " Mar 13 14:33:33 crc kubenswrapper[4898]: I0313 14:33:33.769669 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szptc\" (UniqueName: \"kubernetes.io/projected/8b4abb6a-5797-47be-96a0-69173649e5fa-kube-api-access-szptc\") pod \"8b4abb6a-5797-47be-96a0-69173649e5fa\" (UID: \"8b4abb6a-5797-47be-96a0-69173649e5fa\") " Mar 13 14:33:33 crc kubenswrapper[4898]: I0313 14:33:33.769797 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b4abb6a-5797-47be-96a0-69173649e5fa-inventory\") pod \"8b4abb6a-5797-47be-96a0-69173649e5fa\" (UID: \"8b4abb6a-5797-47be-96a0-69173649e5fa\") " Mar 13 14:33:33 crc kubenswrapper[4898]: I0313 14:33:33.776178 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b4abb6a-5797-47be-96a0-69173649e5fa-kube-api-access-szptc" (OuterVolumeSpecName: "kube-api-access-szptc") 
pod "8b4abb6a-5797-47be-96a0-69173649e5fa" (UID: "8b4abb6a-5797-47be-96a0-69173649e5fa"). InnerVolumeSpecName "kube-api-access-szptc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:33:33 crc kubenswrapper[4898]: I0313 14:33:33.807762 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b4abb6a-5797-47be-96a0-69173649e5fa-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8b4abb6a-5797-47be-96a0-69173649e5fa" (UID: "8b4abb6a-5797-47be-96a0-69173649e5fa"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:33:33 crc kubenswrapper[4898]: I0313 14:33:33.812972 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b4abb6a-5797-47be-96a0-69173649e5fa-inventory" (OuterVolumeSpecName: "inventory") pod "8b4abb6a-5797-47be-96a0-69173649e5fa" (UID: "8b4abb6a-5797-47be-96a0-69173649e5fa"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:33:33 crc kubenswrapper[4898]: I0313 14:33:33.873794 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8b4abb6a-5797-47be-96a0-69173649e5fa-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 14:33:33 crc kubenswrapper[4898]: I0313 14:33:33.874913 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szptc\" (UniqueName: \"kubernetes.io/projected/8b4abb6a-5797-47be-96a0-69173649e5fa-kube-api-access-szptc\") on node \"crc\" DevicePath \"\"" Mar 13 14:33:33 crc kubenswrapper[4898]: I0313 14:33:33.874932 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b4abb6a-5797-47be-96a0-69173649e5fa-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.075709 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tr644" event={"ID":"8b4abb6a-5797-47be-96a0-69173649e5fa","Type":"ContainerDied","Data":"f1afaa6de53cd09356160f6667e20e603147bc41c6c165a1557006dc66536ae1"} Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.075756 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1afaa6de53cd09356160f6667e20e603147bc41c6c165a1557006dc66536ae1" Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.075856 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tr644" Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.210697 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz"] Mar 13 14:33:34 crc kubenswrapper[4898]: E0313 14:33:34.211433 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b4abb6a-5797-47be-96a0-69173649e5fa" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.211470 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b4abb6a-5797-47be-96a0-69173649e5fa" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.211787 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b4abb6a-5797-47be-96a0-69173649e5fa" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.212946 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz" Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.215922 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.215971 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.216002 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zsddr" Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.215926 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.226844 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz"] Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.286114 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac094822-6272-4730-ab0b-16f0116426b5-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz\" (UID: \"ac094822-6272-4730-ab0b-16f0116426b5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz" Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.286215 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac094822-6272-4730-ab0b-16f0116426b5-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz\" (UID: \"ac094822-6272-4730-ab0b-16f0116426b5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz" Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.286249 
4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsqr5\" (UniqueName: \"kubernetes.io/projected/ac094822-6272-4730-ab0b-16f0116426b5-kube-api-access-zsqr5\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz\" (UID: \"ac094822-6272-4730-ab0b-16f0116426b5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz" Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.389786 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac094822-6272-4730-ab0b-16f0116426b5-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz\" (UID: \"ac094822-6272-4730-ab0b-16f0116426b5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz" Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.389933 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac094822-6272-4730-ab0b-16f0116426b5-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz\" (UID: \"ac094822-6272-4730-ab0b-16f0116426b5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz" Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.389978 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsqr5\" (UniqueName: \"kubernetes.io/projected/ac094822-6272-4730-ab0b-16f0116426b5-kube-api-access-zsqr5\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz\" (UID: \"ac094822-6272-4730-ab0b-16f0116426b5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz" Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.396130 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/ac094822-6272-4730-ab0b-16f0116426b5-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz\" (UID: \"ac094822-6272-4730-ab0b-16f0116426b5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz" Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.398334 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac094822-6272-4730-ab0b-16f0116426b5-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz\" (UID: \"ac094822-6272-4730-ab0b-16f0116426b5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz" Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.414243 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsqr5\" (UniqueName: \"kubernetes.io/projected/ac094822-6272-4730-ab0b-16f0116426b5-kube-api-access-zsqr5\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz\" (UID: \"ac094822-6272-4730-ab0b-16f0116426b5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz" Mar 13 14:33:34 crc kubenswrapper[4898]: I0313 14:33:34.538528 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz" Mar 13 14:33:35 crc kubenswrapper[4898]: I0313 14:33:35.068796 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz"] Mar 13 14:33:35 crc kubenswrapper[4898]: W0313 14:33:35.076759 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac094822_6272_4730_ab0b_16f0116426b5.slice/crio-704f436dc38746438fb453814824552f6f864e1b290a23b87caa2e264e34538d WatchSource:0}: Error finding container 704f436dc38746438fb453814824552f6f864e1b290a23b87caa2e264e34538d: Status 404 returned error can't find the container with id 704f436dc38746438fb453814824552f6f864e1b290a23b87caa2e264e34538d Mar 13 14:33:36 crc kubenswrapper[4898]: I0313 14:33:36.033884 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-crnr4"] Mar 13 14:33:36 crc kubenswrapper[4898]: I0313 14:33:36.047366 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-crnr4"] Mar 13 14:33:36 crc kubenswrapper[4898]: I0313 14:33:36.101050 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz" event={"ID":"ac094822-6272-4730-ab0b-16f0116426b5","Type":"ContainerStarted","Data":"75351a338e412d11bdcadf255cb40c23212372a818b23c31d819578fbd7526fe"} Mar 13 14:33:36 crc kubenswrapper[4898]: I0313 14:33:36.101121 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz" event={"ID":"ac094822-6272-4730-ab0b-16f0116426b5","Type":"ContainerStarted","Data":"704f436dc38746438fb453814824552f6f864e1b290a23b87caa2e264e34538d"} Mar 13 14:33:36 crc kubenswrapper[4898]: I0313 14:33:36.125883 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz" podStartSLOduration=1.581065421 podStartE2EDuration="2.125857591s" podCreationTimestamp="2026-03-13 14:33:34 +0000 UTC" firstStartedPulling="2026-03-13 14:33:35.085177975 +0000 UTC m=+2250.086766224" lastFinishedPulling="2026-03-13 14:33:35.629970145 +0000 UTC m=+2250.631558394" observedRunningTime="2026-03-13 14:33:36.121334367 +0000 UTC m=+2251.122922616" watchObservedRunningTime="2026-03-13 14:33:36.125857591 +0000 UTC m=+2251.127445840" Mar 13 14:33:37 crc kubenswrapper[4898]: I0313 14:33:37.037876 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-fc6a-account-create-update-cfl2q"] Mar 13 14:33:37 crc kubenswrapper[4898]: I0313 14:33:37.060014 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-fc6a-account-create-update-cfl2q"] Mar 13 14:33:37 crc kubenswrapper[4898]: I0313 14:33:37.115685 4898 generic.go:334] "Generic (PLEG): container finished" podID="5a1116c6-c423-4585-af50-c9ecdca3720e" containerID="7db8dd08296576b12cfca554683282ed44b44eb4b878ac17306195ad12f12916" exitCode=0 Mar 13 14:33:37 crc kubenswrapper[4898]: I0313 14:33:37.115759 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mt6t6" event={"ID":"5a1116c6-c423-4585-af50-c9ecdca3720e","Type":"ContainerDied","Data":"7db8dd08296576b12cfca554683282ed44b44eb4b878ac17306195ad12f12916"} Mar 13 14:33:37 crc kubenswrapper[4898]: I0313 14:33:37.765019 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d29378e-424d-4831-baf4-b59a75072097" path="/var/lib/kubelet/pods/4d29378e-424d-4831-baf4-b59a75072097/volumes" Mar 13 14:33:37 crc kubenswrapper[4898]: I0313 14:33:37.766242 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e50eec10-99ce-4611-8cf4-8f4999146339" path="/var/lib/kubelet/pods/e50eec10-99ce-4611-8cf4-8f4999146339/volumes" Mar 13 14:33:38 crc kubenswrapper[4898]: I0313 14:33:38.131186 
4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mt6t6" event={"ID":"5a1116c6-c423-4585-af50-c9ecdca3720e","Type":"ContainerStarted","Data":"14c1d4733085fbbb2a211aef675271290bff0540a83ab98d97529e0f6e0ef44f"} Mar 13 14:33:38 crc kubenswrapper[4898]: I0313 14:33:38.170548 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mt6t6" podStartSLOduration=2.519290723 podStartE2EDuration="10.170520487s" podCreationTimestamp="2026-03-13 14:33:28 +0000 UTC" firstStartedPulling="2026-03-13 14:33:30.022206457 +0000 UTC m=+2245.023794696" lastFinishedPulling="2026-03-13 14:33:37.673436181 +0000 UTC m=+2252.675024460" observedRunningTime="2026-03-13 14:33:38.158832872 +0000 UTC m=+2253.160421131" watchObservedRunningTime="2026-03-13 14:33:38.170520487 +0000 UTC m=+2253.172108736" Mar 13 14:33:38 crc kubenswrapper[4898]: I0313 14:33:38.781461 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mt6t6" Mar 13 14:33:38 crc kubenswrapper[4898]: I0313 14:33:38.781860 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mt6t6" Mar 13 14:33:38 crc kubenswrapper[4898]: I0313 14:33:38.816181 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-88s69"] Mar 13 14:33:38 crc kubenswrapper[4898]: I0313 14:33:38.822689 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-88s69" Mar 13 14:33:38 crc kubenswrapper[4898]: I0313 14:33:38.837027 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-88s69"] Mar 13 14:33:38 crc kubenswrapper[4898]: I0313 14:33:38.921617 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0528f01c-62c6-4665-9b64-b20182ed6aad-utilities\") pod \"redhat-marketplace-88s69\" (UID: \"0528f01c-62c6-4665-9b64-b20182ed6aad\") " pod="openshift-marketplace/redhat-marketplace-88s69" Mar 13 14:33:38 crc kubenswrapper[4898]: I0313 14:33:38.921722 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncrlc\" (UniqueName: \"kubernetes.io/projected/0528f01c-62c6-4665-9b64-b20182ed6aad-kube-api-access-ncrlc\") pod \"redhat-marketplace-88s69\" (UID: \"0528f01c-62c6-4665-9b64-b20182ed6aad\") " pod="openshift-marketplace/redhat-marketplace-88s69" Mar 13 14:33:38 crc kubenswrapper[4898]: I0313 14:33:38.921784 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0528f01c-62c6-4665-9b64-b20182ed6aad-catalog-content\") pod \"redhat-marketplace-88s69\" (UID: \"0528f01c-62c6-4665-9b64-b20182ed6aad\") " pod="openshift-marketplace/redhat-marketplace-88s69" Mar 13 14:33:39 crc kubenswrapper[4898]: I0313 14:33:39.024513 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0528f01c-62c6-4665-9b64-b20182ed6aad-catalog-content\") pod \"redhat-marketplace-88s69\" (UID: \"0528f01c-62c6-4665-9b64-b20182ed6aad\") " pod="openshift-marketplace/redhat-marketplace-88s69" Mar 13 14:33:39 crc kubenswrapper[4898]: I0313 14:33:39.024703 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0528f01c-62c6-4665-9b64-b20182ed6aad-utilities\") pod \"redhat-marketplace-88s69\" (UID: \"0528f01c-62c6-4665-9b64-b20182ed6aad\") " pod="openshift-marketplace/redhat-marketplace-88s69" Mar 13 14:33:39 crc kubenswrapper[4898]: I0313 14:33:39.024754 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncrlc\" (UniqueName: \"kubernetes.io/projected/0528f01c-62c6-4665-9b64-b20182ed6aad-kube-api-access-ncrlc\") pod \"redhat-marketplace-88s69\" (UID: \"0528f01c-62c6-4665-9b64-b20182ed6aad\") " pod="openshift-marketplace/redhat-marketplace-88s69" Mar 13 14:33:39 crc kubenswrapper[4898]: I0313 14:33:39.025088 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0528f01c-62c6-4665-9b64-b20182ed6aad-catalog-content\") pod \"redhat-marketplace-88s69\" (UID: \"0528f01c-62c6-4665-9b64-b20182ed6aad\") " pod="openshift-marketplace/redhat-marketplace-88s69" Mar 13 14:33:39 crc kubenswrapper[4898]: I0313 14:33:39.025769 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0528f01c-62c6-4665-9b64-b20182ed6aad-utilities\") pod \"redhat-marketplace-88s69\" (UID: \"0528f01c-62c6-4665-9b64-b20182ed6aad\") " pod="openshift-marketplace/redhat-marketplace-88s69" Mar 13 14:33:39 crc kubenswrapper[4898]: I0313 14:33:39.053642 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncrlc\" (UniqueName: \"kubernetes.io/projected/0528f01c-62c6-4665-9b64-b20182ed6aad-kube-api-access-ncrlc\") pod \"redhat-marketplace-88s69\" (UID: \"0528f01c-62c6-4665-9b64-b20182ed6aad\") " pod="openshift-marketplace/redhat-marketplace-88s69" Mar 13 14:33:39 crc kubenswrapper[4898]: I0313 14:33:39.167159 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-88s69" Mar 13 14:33:39 crc kubenswrapper[4898]: I0313 14:33:39.674418 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-88s69"] Mar 13 14:33:39 crc kubenswrapper[4898]: I0313 14:33:39.745702 4898 scope.go:117] "RemoveContainer" containerID="4c3017aeb6114b905e5e159773ee55011a6eea4d13e889167f127e1524ada66d" Mar 13 14:33:39 crc kubenswrapper[4898]: I0313 14:33:39.814672 4898 scope.go:117] "RemoveContainer" containerID="74e75aa197a91664f89edd48f01ff5c813660e7743cf922315f4b6a18e19c506" Mar 13 14:33:39 crc kubenswrapper[4898]: I0313 14:33:39.839799 4898 scope.go:117] "RemoveContainer" containerID="903eebacfbe4709488e6b56c6ba47deec7c9d806d35c4763b770e46f79ef165a" Mar 13 14:33:39 crc kubenswrapper[4898]: I0313 14:33:39.883491 4898 scope.go:117] "RemoveContainer" containerID="d1ff8d0ca102a074d68ee12cd37ffd04a070a172037a36f9afafa4bd84128371" Mar 13 14:33:39 crc kubenswrapper[4898]: I0313 14:33:39.886579 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mt6t6" podUID="5a1116c6-c423-4585-af50-c9ecdca3720e" containerName="registry-server" probeResult="failure" output=< Mar 13 14:33:39 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 14:33:39 crc kubenswrapper[4898]: > Mar 13 14:33:39 crc kubenswrapper[4898]: I0313 14:33:39.938805 4898 scope.go:117] "RemoveContainer" containerID="7e9f2307a91699c726a3f93d044663fb844450acd6da5dd38c51549451b97bc8" Mar 13 14:33:40 crc kubenswrapper[4898]: I0313 14:33:40.153138 4898 generic.go:334] "Generic (PLEG): container finished" podID="0528f01c-62c6-4665-9b64-b20182ed6aad" containerID="1a058ddfeeccde23c6d339f5541c08affddacfb101ea296c75f47ab9c087ad81" exitCode=0 Mar 13 14:33:40 crc kubenswrapper[4898]: I0313 14:33:40.153257 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88s69" 
event={"ID":"0528f01c-62c6-4665-9b64-b20182ed6aad","Type":"ContainerDied","Data":"1a058ddfeeccde23c6d339f5541c08affddacfb101ea296c75f47ab9c087ad81"} Mar 13 14:33:40 crc kubenswrapper[4898]: I0313 14:33:40.153588 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88s69" event={"ID":"0528f01c-62c6-4665-9b64-b20182ed6aad","Type":"ContainerStarted","Data":"3e76bb401490d4fcb76c9945e01e87306f712102e853a1d3e262b2dbb4c6cc18"} Mar 13 14:33:41 crc kubenswrapper[4898]: I0313 14:33:41.172016 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88s69" event={"ID":"0528f01c-62c6-4665-9b64-b20182ed6aad","Type":"ContainerStarted","Data":"68cc17e658b3ea1ee0994d975dfdcde55ffedb9efd2ab7cfbcb04e5d5f03ee36"} Mar 13 14:33:43 crc kubenswrapper[4898]: I0313 14:33:43.205435 4898 generic.go:334] "Generic (PLEG): container finished" podID="0528f01c-62c6-4665-9b64-b20182ed6aad" containerID="68cc17e658b3ea1ee0994d975dfdcde55ffedb9efd2ab7cfbcb04e5d5f03ee36" exitCode=0 Mar 13 14:33:43 crc kubenswrapper[4898]: I0313 14:33:43.205498 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88s69" event={"ID":"0528f01c-62c6-4665-9b64-b20182ed6aad","Type":"ContainerDied","Data":"68cc17e658b3ea1ee0994d975dfdcde55ffedb9efd2ab7cfbcb04e5d5f03ee36"} Mar 13 14:33:44 crc kubenswrapper[4898]: I0313 14:33:44.220415 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88s69" event={"ID":"0528f01c-62c6-4665-9b64-b20182ed6aad","Type":"ContainerStarted","Data":"77a925a30178dab323a899e7263be15fe8380315d4020f9d292c755934dc7a15"} Mar 13 14:33:44 crc kubenswrapper[4898]: I0313 14:33:44.246786 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-88s69" podStartSLOduration=2.523557377 podStartE2EDuration="6.246767749s" podCreationTimestamp="2026-03-13 14:33:38 +0000 
UTC" firstStartedPulling="2026-03-13 14:33:40.154957014 +0000 UTC m=+2255.156545253" lastFinishedPulling="2026-03-13 14:33:43.878167376 +0000 UTC m=+2258.879755625" observedRunningTime="2026-03-13 14:33:44.240133202 +0000 UTC m=+2259.241721461" watchObservedRunningTime="2026-03-13 14:33:44.246767749 +0000 UTC m=+2259.248355998" Mar 13 14:33:49 crc kubenswrapper[4898]: I0313 14:33:49.134177 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:33:49 crc kubenswrapper[4898]: I0313 14:33:49.134752 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:33:49 crc kubenswrapper[4898]: I0313 14:33:49.134802 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 14:33:49 crc kubenswrapper[4898]: I0313 14:33:49.135745 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db"} pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 14:33:49 crc kubenswrapper[4898]: I0313 14:33:49.135804 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" 
containerName="machine-config-daemon" containerID="cri-o://8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" gracePeriod=600 Mar 13 14:33:49 crc kubenswrapper[4898]: I0313 14:33:49.168316 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-88s69" Mar 13 14:33:49 crc kubenswrapper[4898]: I0313 14:33:49.168491 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-88s69" Mar 13 14:33:49 crc kubenswrapper[4898]: E0313 14:33:49.270534 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:33:49 crc kubenswrapper[4898]: I0313 14:33:49.282347 4898 generic.go:334] "Generic (PLEG): container finished" podID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" exitCode=0 Mar 13 14:33:49 crc kubenswrapper[4898]: I0313 14:33:49.282338 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerDied","Data":"8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db"} Mar 13 14:33:49 crc kubenswrapper[4898]: I0313 14:33:49.282432 4898 scope.go:117] "RemoveContainer" containerID="544b53b5cdd0293005863b343628de53b83869ce5cd2c798b19c01abba2b5bc8" Mar 13 14:33:49 crc kubenswrapper[4898]: I0313 14:33:49.283255 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" Mar 13 14:33:49 crc 
kubenswrapper[4898]: E0313 14:33:49.283684 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:33:49 crc kubenswrapper[4898]: I0313 14:33:49.863700 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mt6t6" podUID="5a1116c6-c423-4585-af50-c9ecdca3720e" containerName="registry-server" probeResult="failure" output=< Mar 13 14:33:49 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 14:33:49 crc kubenswrapper[4898]: > Mar 13 14:33:50 crc kubenswrapper[4898]: I0313 14:33:50.234473 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-88s69" podUID="0528f01c-62c6-4665-9b64-b20182ed6aad" containerName="registry-server" probeResult="failure" output=< Mar 13 14:33:50 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 14:33:50 crc kubenswrapper[4898]: > Mar 13 14:33:59 crc kubenswrapper[4898]: I0313 14:33:59.243654 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-88s69" Mar 13 14:33:59 crc kubenswrapper[4898]: I0313 14:33:59.307784 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-88s69" Mar 13 14:33:59 crc kubenswrapper[4898]: I0313 14:33:59.640803 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-88s69"] Mar 13 14:33:59 crc kubenswrapper[4898]: I0313 14:33:59.828297 4898 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-mt6t6" podUID="5a1116c6-c423-4585-af50-c9ecdca3720e" containerName="registry-server" probeResult="failure" output=< Mar 13 14:33:59 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 14:33:59 crc kubenswrapper[4898]: > Mar 13 14:34:00 crc kubenswrapper[4898]: I0313 14:34:00.139671 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556874-5w2n9"] Mar 13 14:34:00 crc kubenswrapper[4898]: I0313 14:34:00.141853 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556874-5w2n9" Mar 13 14:34:00 crc kubenswrapper[4898]: I0313 14:34:00.144128 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:34:00 crc kubenswrapper[4898]: I0313 14:34:00.145303 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:34:00 crc kubenswrapper[4898]: I0313 14:34:00.154890 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:34:00 crc kubenswrapper[4898]: I0313 14:34:00.164842 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556874-5w2n9"] Mar 13 14:34:00 crc kubenswrapper[4898]: I0313 14:34:00.221724 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9g5h\" (UniqueName: \"kubernetes.io/projected/eb068c44-8492-4ed4-973b-f1233d9db645-kube-api-access-w9g5h\") pod \"auto-csr-approver-29556874-5w2n9\" (UID: \"eb068c44-8492-4ed4-973b-f1233d9db645\") " pod="openshift-infra/auto-csr-approver-29556874-5w2n9" Mar 13 14:34:00 crc kubenswrapper[4898]: I0313 14:34:00.323929 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9g5h\" (UniqueName: 
\"kubernetes.io/projected/eb068c44-8492-4ed4-973b-f1233d9db645-kube-api-access-w9g5h\") pod \"auto-csr-approver-29556874-5w2n9\" (UID: \"eb068c44-8492-4ed4-973b-f1233d9db645\") " pod="openshift-infra/auto-csr-approver-29556874-5w2n9" Mar 13 14:34:00 crc kubenswrapper[4898]: I0313 14:34:00.353789 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9g5h\" (UniqueName: \"kubernetes.io/projected/eb068c44-8492-4ed4-973b-f1233d9db645-kube-api-access-w9g5h\") pod \"auto-csr-approver-29556874-5w2n9\" (UID: \"eb068c44-8492-4ed4-973b-f1233d9db645\") " pod="openshift-infra/auto-csr-approver-29556874-5w2n9" Mar 13 14:34:00 crc kubenswrapper[4898]: I0313 14:34:00.421106 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-88s69" podUID="0528f01c-62c6-4665-9b64-b20182ed6aad" containerName="registry-server" containerID="cri-o://77a925a30178dab323a899e7263be15fe8380315d4020f9d292c755934dc7a15" gracePeriod=2 Mar 13 14:34:00 crc kubenswrapper[4898]: I0313 14:34:00.467824 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556874-5w2n9" Mar 13 14:34:00 crc kubenswrapper[4898]: I0313 14:34:00.739410 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" Mar 13 14:34:00 crc kubenswrapper[4898]: E0313 14:34:00.739999 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.026217 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-88s69" Mar 13 14:34:01 crc kubenswrapper[4898]: W0313 14:34:01.034040 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb068c44_8492_4ed4_973b_f1233d9db645.slice/crio-800944d00480e2fc3b6f3f4e98fb3f11048a2d231376dd91c90d61f1dde663db WatchSource:0}: Error finding container 800944d00480e2fc3b6f3f4e98fb3f11048a2d231376dd91c90d61f1dde663db: Status 404 returned error can't find the container with id 800944d00480e2fc3b6f3f4e98fb3f11048a2d231376dd91c90d61f1dde663db Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.041082 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556874-5w2n9"] Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.146593 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0528f01c-62c6-4665-9b64-b20182ed6aad-utilities\") pod \"0528f01c-62c6-4665-9b64-b20182ed6aad\" (UID: \"0528f01c-62c6-4665-9b64-b20182ed6aad\") " Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.146880 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0528f01c-62c6-4665-9b64-b20182ed6aad-catalog-content\") pod \"0528f01c-62c6-4665-9b64-b20182ed6aad\" (UID: \"0528f01c-62c6-4665-9b64-b20182ed6aad\") " Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.146962 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncrlc\" (UniqueName: \"kubernetes.io/projected/0528f01c-62c6-4665-9b64-b20182ed6aad-kube-api-access-ncrlc\") pod \"0528f01c-62c6-4665-9b64-b20182ed6aad\" (UID: \"0528f01c-62c6-4665-9b64-b20182ed6aad\") " Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.149165 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/0528f01c-62c6-4665-9b64-b20182ed6aad-utilities" (OuterVolumeSpecName: "utilities") pod "0528f01c-62c6-4665-9b64-b20182ed6aad" (UID: "0528f01c-62c6-4665-9b64-b20182ed6aad"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.154865 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0528f01c-62c6-4665-9b64-b20182ed6aad-kube-api-access-ncrlc" (OuterVolumeSpecName: "kube-api-access-ncrlc") pod "0528f01c-62c6-4665-9b64-b20182ed6aad" (UID: "0528f01c-62c6-4665-9b64-b20182ed6aad"). InnerVolumeSpecName "kube-api-access-ncrlc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.175256 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0528f01c-62c6-4665-9b64-b20182ed6aad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0528f01c-62c6-4665-9b64-b20182ed6aad" (UID: "0528f01c-62c6-4665-9b64-b20182ed6aad"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.249971 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0528f01c-62c6-4665-9b64-b20182ed6aad-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.250025 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncrlc\" (UniqueName: \"kubernetes.io/projected/0528f01c-62c6-4665-9b64-b20182ed6aad-kube-api-access-ncrlc\") on node \"crc\" DevicePath \"\"" Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.250049 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0528f01c-62c6-4665-9b64-b20182ed6aad-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.435226 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556874-5w2n9" event={"ID":"eb068c44-8492-4ed4-973b-f1233d9db645","Type":"ContainerStarted","Data":"800944d00480e2fc3b6f3f4e98fb3f11048a2d231376dd91c90d61f1dde663db"} Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.441496 4898 generic.go:334] "Generic (PLEG): container finished" podID="0528f01c-62c6-4665-9b64-b20182ed6aad" containerID="77a925a30178dab323a899e7263be15fe8380315d4020f9d292c755934dc7a15" exitCode=0 Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.441627 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88s69" event={"ID":"0528f01c-62c6-4665-9b64-b20182ed6aad","Type":"ContainerDied","Data":"77a925a30178dab323a899e7263be15fe8380315d4020f9d292c755934dc7a15"} Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.441657 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-88s69" Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.441687 4898 scope.go:117] "RemoveContainer" containerID="77a925a30178dab323a899e7263be15fe8380315d4020f9d292c755934dc7a15" Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.441673 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88s69" event={"ID":"0528f01c-62c6-4665-9b64-b20182ed6aad","Type":"ContainerDied","Data":"3e76bb401490d4fcb76c9945e01e87306f712102e853a1d3e262b2dbb4c6cc18"} Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.471641 4898 scope.go:117] "RemoveContainer" containerID="68cc17e658b3ea1ee0994d975dfdcde55ffedb9efd2ab7cfbcb04e5d5f03ee36" Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.501620 4898 scope.go:117] "RemoveContainer" containerID="1a058ddfeeccde23c6d339f5541c08affddacfb101ea296c75f47ab9c087ad81" Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.501851 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-88s69"] Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.524495 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-88s69"] Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.567545 4898 scope.go:117] "RemoveContainer" containerID="77a925a30178dab323a899e7263be15fe8380315d4020f9d292c755934dc7a15" Mar 13 14:34:01 crc kubenswrapper[4898]: E0313 14:34:01.568228 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77a925a30178dab323a899e7263be15fe8380315d4020f9d292c755934dc7a15\": container with ID starting with 77a925a30178dab323a899e7263be15fe8380315d4020f9d292c755934dc7a15 not found: ID does not exist" containerID="77a925a30178dab323a899e7263be15fe8380315d4020f9d292c755934dc7a15" Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.568364 4898 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77a925a30178dab323a899e7263be15fe8380315d4020f9d292c755934dc7a15"} err="failed to get container status \"77a925a30178dab323a899e7263be15fe8380315d4020f9d292c755934dc7a15\": rpc error: code = NotFound desc = could not find container \"77a925a30178dab323a899e7263be15fe8380315d4020f9d292c755934dc7a15\": container with ID starting with 77a925a30178dab323a899e7263be15fe8380315d4020f9d292c755934dc7a15 not found: ID does not exist" Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.568408 4898 scope.go:117] "RemoveContainer" containerID="68cc17e658b3ea1ee0994d975dfdcde55ffedb9efd2ab7cfbcb04e5d5f03ee36" Mar 13 14:34:01 crc kubenswrapper[4898]: E0313 14:34:01.568755 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68cc17e658b3ea1ee0994d975dfdcde55ffedb9efd2ab7cfbcb04e5d5f03ee36\": container with ID starting with 68cc17e658b3ea1ee0994d975dfdcde55ffedb9efd2ab7cfbcb04e5d5f03ee36 not found: ID does not exist" containerID="68cc17e658b3ea1ee0994d975dfdcde55ffedb9efd2ab7cfbcb04e5d5f03ee36" Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.568803 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68cc17e658b3ea1ee0994d975dfdcde55ffedb9efd2ab7cfbcb04e5d5f03ee36"} err="failed to get container status \"68cc17e658b3ea1ee0994d975dfdcde55ffedb9efd2ab7cfbcb04e5d5f03ee36\": rpc error: code = NotFound desc = could not find container \"68cc17e658b3ea1ee0994d975dfdcde55ffedb9efd2ab7cfbcb04e5d5f03ee36\": container with ID starting with 68cc17e658b3ea1ee0994d975dfdcde55ffedb9efd2ab7cfbcb04e5d5f03ee36 not found: ID does not exist" Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.568830 4898 scope.go:117] "RemoveContainer" containerID="1a058ddfeeccde23c6d339f5541c08affddacfb101ea296c75f47ab9c087ad81" Mar 13 14:34:01 crc kubenswrapper[4898]: E0313 
14:34:01.569160 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a058ddfeeccde23c6d339f5541c08affddacfb101ea296c75f47ab9c087ad81\": container with ID starting with 1a058ddfeeccde23c6d339f5541c08affddacfb101ea296c75f47ab9c087ad81 not found: ID does not exist" containerID="1a058ddfeeccde23c6d339f5541c08affddacfb101ea296c75f47ab9c087ad81" Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.569204 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a058ddfeeccde23c6d339f5541c08affddacfb101ea296c75f47ab9c087ad81"} err="failed to get container status \"1a058ddfeeccde23c6d339f5541c08affddacfb101ea296c75f47ab9c087ad81\": rpc error: code = NotFound desc = could not find container \"1a058ddfeeccde23c6d339f5541c08affddacfb101ea296c75f47ab9c087ad81\": container with ID starting with 1a058ddfeeccde23c6d339f5541c08affddacfb101ea296c75f47ab9c087ad81 not found: ID does not exist" Mar 13 14:34:01 crc kubenswrapper[4898]: I0313 14:34:01.754272 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0528f01c-62c6-4665-9b64-b20182ed6aad" path="/var/lib/kubelet/pods/0528f01c-62c6-4665-9b64-b20182ed6aad/volumes" Mar 13 14:34:02 crc kubenswrapper[4898]: I0313 14:34:02.453928 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556874-5w2n9" event={"ID":"eb068c44-8492-4ed4-973b-f1233d9db645","Type":"ContainerStarted","Data":"408d12ea5e972e9b868ac62eb57c8cf1a207c59938c5862250b310c3d0d4947f"} Mar 13 14:34:02 crc kubenswrapper[4898]: I0313 14:34:02.475716 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556874-5w2n9" podStartSLOduration=1.441785961 podStartE2EDuration="2.475691867s" podCreationTimestamp="2026-03-13 14:34:00 +0000 UTC" firstStartedPulling="2026-03-13 14:34:01.036457092 +0000 UTC m=+2276.038045331" 
lastFinishedPulling="2026-03-13 14:34:02.070362998 +0000 UTC m=+2277.071951237" observedRunningTime="2026-03-13 14:34:02.471810099 +0000 UTC m=+2277.473398378" watchObservedRunningTime="2026-03-13 14:34:02.475691867 +0000 UTC m=+2277.477280136" Mar 13 14:34:03 crc kubenswrapper[4898]: I0313 14:34:03.469576 4898 generic.go:334] "Generic (PLEG): container finished" podID="eb068c44-8492-4ed4-973b-f1233d9db645" containerID="408d12ea5e972e9b868ac62eb57c8cf1a207c59938c5862250b310c3d0d4947f" exitCode=0 Mar 13 14:34:03 crc kubenswrapper[4898]: I0313 14:34:03.470167 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556874-5w2n9" event={"ID":"eb068c44-8492-4ed4-973b-f1233d9db645","Type":"ContainerDied","Data":"408d12ea5e972e9b868ac62eb57c8cf1a207c59938c5862250b310c3d0d4947f"} Mar 13 14:34:04 crc kubenswrapper[4898]: I0313 14:34:04.961083 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556874-5w2n9" Mar 13 14:34:05 crc kubenswrapper[4898]: I0313 14:34:05.051134 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9g5h\" (UniqueName: \"kubernetes.io/projected/eb068c44-8492-4ed4-973b-f1233d9db645-kube-api-access-w9g5h\") pod \"eb068c44-8492-4ed4-973b-f1233d9db645\" (UID: \"eb068c44-8492-4ed4-973b-f1233d9db645\") " Mar 13 14:34:05 crc kubenswrapper[4898]: I0313 14:34:05.077063 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb068c44-8492-4ed4-973b-f1233d9db645-kube-api-access-w9g5h" (OuterVolumeSpecName: "kube-api-access-w9g5h") pod "eb068c44-8492-4ed4-973b-f1233d9db645" (UID: "eb068c44-8492-4ed4-973b-f1233d9db645"). InnerVolumeSpecName "kube-api-access-w9g5h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:34:05 crc kubenswrapper[4898]: I0313 14:34:05.086364 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-tqzkv"] Mar 13 14:34:05 crc kubenswrapper[4898]: I0313 14:34:05.105466 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-tqzkv"] Mar 13 14:34:05 crc kubenswrapper[4898]: I0313 14:34:05.158173 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9g5h\" (UniqueName: \"kubernetes.io/projected/eb068c44-8492-4ed4-973b-f1233d9db645-kube-api-access-w9g5h\") on node \"crc\" DevicePath \"\"" Mar 13 14:34:05 crc kubenswrapper[4898]: I0313 14:34:05.500194 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556874-5w2n9" event={"ID":"eb068c44-8492-4ed4-973b-f1233d9db645","Type":"ContainerDied","Data":"800944d00480e2fc3b6f3f4e98fb3f11048a2d231376dd91c90d61f1dde663db"} Mar 13 14:34:05 crc kubenswrapper[4898]: I0313 14:34:05.500232 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="800944d00480e2fc3b6f3f4e98fb3f11048a2d231376dd91c90d61f1dde663db" Mar 13 14:34:05 crc kubenswrapper[4898]: I0313 14:34:05.500309 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556874-5w2n9"
Mar 13 14:34:05 crc kubenswrapper[4898]: I0313 14:34:05.757545 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53" path="/var/lib/kubelet/pods/bbb3eb7a-2c0d-42d3-9d61-b3ae21863f53/volumes"
Mar 13 14:34:05 crc kubenswrapper[4898]: I0313 14:34:05.836126 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556868-9gt46"]
Mar 13 14:34:05 crc kubenswrapper[4898]: I0313 14:34:05.847735 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556868-9gt46"]
Mar 13 14:34:07 crc kubenswrapper[4898]: I0313 14:34:07.766353 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9565fbbb-2765-4ffb-a934-e5ddf9be1d17" path="/var/lib/kubelet/pods/9565fbbb-2765-4ffb-a934-e5ddf9be1d17/volumes"
Mar 13 14:34:08 crc kubenswrapper[4898]: I0313 14:34:08.831907 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mt6t6"
Mar 13 14:34:08 crc kubenswrapper[4898]: I0313 14:34:08.904068 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mt6t6"
Mar 13 14:34:10 crc kubenswrapper[4898]: I0313 14:34:10.010238 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mt6t6"]
Mar 13 14:34:10 crc kubenswrapper[4898]: I0313 14:34:10.563081 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mt6t6" podUID="5a1116c6-c423-4585-af50-c9ecdca3720e" containerName="registry-server" containerID="cri-o://14c1d4733085fbbb2a211aef675271290bff0540a83ab98d97529e0f6e0ef44f" gracePeriod=2
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.122215 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mt6t6"
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.225064 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a1116c6-c423-4585-af50-c9ecdca3720e-utilities\") pod \"5a1116c6-c423-4585-af50-c9ecdca3720e\" (UID: \"5a1116c6-c423-4585-af50-c9ecdca3720e\") "
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.225114 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68xrp\" (UniqueName: \"kubernetes.io/projected/5a1116c6-c423-4585-af50-c9ecdca3720e-kube-api-access-68xrp\") pod \"5a1116c6-c423-4585-af50-c9ecdca3720e\" (UID: \"5a1116c6-c423-4585-af50-c9ecdca3720e\") "
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.225198 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a1116c6-c423-4585-af50-c9ecdca3720e-catalog-content\") pod \"5a1116c6-c423-4585-af50-c9ecdca3720e\" (UID: \"5a1116c6-c423-4585-af50-c9ecdca3720e\") "
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.226229 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a1116c6-c423-4585-af50-c9ecdca3720e-utilities" (OuterVolumeSpecName: "utilities") pod "5a1116c6-c423-4585-af50-c9ecdca3720e" (UID: "5a1116c6-c423-4585-af50-c9ecdca3720e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.234041 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a1116c6-c423-4585-af50-c9ecdca3720e-kube-api-access-68xrp" (OuterVolumeSpecName: "kube-api-access-68xrp") pod "5a1116c6-c423-4585-af50-c9ecdca3720e" (UID: "5a1116c6-c423-4585-af50-c9ecdca3720e"). InnerVolumeSpecName "kube-api-access-68xrp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.329082 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a1116c6-c423-4585-af50-c9ecdca3720e-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.329139 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68xrp\" (UniqueName: \"kubernetes.io/projected/5a1116c6-c423-4585-af50-c9ecdca3720e-kube-api-access-68xrp\") on node \"crc\" DevicePath \"\""
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.353133 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a1116c6-c423-4585-af50-c9ecdca3720e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a1116c6-c423-4585-af50-c9ecdca3720e" (UID: "5a1116c6-c423-4585-af50-c9ecdca3720e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.431653 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a1116c6-c423-4585-af50-c9ecdca3720e-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.575126 4898 generic.go:334] "Generic (PLEG): container finished" podID="5a1116c6-c423-4585-af50-c9ecdca3720e" containerID="14c1d4733085fbbb2a211aef675271290bff0540a83ab98d97529e0f6e0ef44f" exitCode=0
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.575182 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mt6t6"
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.575186 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mt6t6" event={"ID":"5a1116c6-c423-4585-af50-c9ecdca3720e","Type":"ContainerDied","Data":"14c1d4733085fbbb2a211aef675271290bff0540a83ab98d97529e0f6e0ef44f"}
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.575268 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mt6t6" event={"ID":"5a1116c6-c423-4585-af50-c9ecdca3720e","Type":"ContainerDied","Data":"b30638434c0fe439393ecdc839cda22c240c59580f70c0f1734ebb6f4ce66486"}
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.575300 4898 scope.go:117] "RemoveContainer" containerID="14c1d4733085fbbb2a211aef675271290bff0540a83ab98d97529e0f6e0ef44f"
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.603411 4898 scope.go:117] "RemoveContainer" containerID="7db8dd08296576b12cfca554683282ed44b44eb4b878ac17306195ad12f12916"
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.655177 4898 scope.go:117] "RemoveContainer" containerID="affc0f5381f8e5277306c781cff3a6263e539776e352fbab467b081645fab210"
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.661393 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mt6t6"]
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.671726 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mt6t6"]
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.713718 4898 scope.go:117] "RemoveContainer" containerID="14c1d4733085fbbb2a211aef675271290bff0540a83ab98d97529e0f6e0ef44f"
Mar 13 14:34:11 crc kubenswrapper[4898]: E0313 14:34:11.714226 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14c1d4733085fbbb2a211aef675271290bff0540a83ab98d97529e0f6e0ef44f\": container with ID starting with 14c1d4733085fbbb2a211aef675271290bff0540a83ab98d97529e0f6e0ef44f not found: ID does not exist" containerID="14c1d4733085fbbb2a211aef675271290bff0540a83ab98d97529e0f6e0ef44f"
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.714256 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14c1d4733085fbbb2a211aef675271290bff0540a83ab98d97529e0f6e0ef44f"} err="failed to get container status \"14c1d4733085fbbb2a211aef675271290bff0540a83ab98d97529e0f6e0ef44f\": rpc error: code = NotFound desc = could not find container \"14c1d4733085fbbb2a211aef675271290bff0540a83ab98d97529e0f6e0ef44f\": container with ID starting with 14c1d4733085fbbb2a211aef675271290bff0540a83ab98d97529e0f6e0ef44f not found: ID does not exist"
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.714287 4898 scope.go:117] "RemoveContainer" containerID="7db8dd08296576b12cfca554683282ed44b44eb4b878ac17306195ad12f12916"
Mar 13 14:34:11 crc kubenswrapper[4898]: E0313 14:34:11.714614 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7db8dd08296576b12cfca554683282ed44b44eb4b878ac17306195ad12f12916\": container with ID starting with 7db8dd08296576b12cfca554683282ed44b44eb4b878ac17306195ad12f12916 not found: ID does not exist" containerID="7db8dd08296576b12cfca554683282ed44b44eb4b878ac17306195ad12f12916"
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.714667 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7db8dd08296576b12cfca554683282ed44b44eb4b878ac17306195ad12f12916"} err="failed to get container status \"7db8dd08296576b12cfca554683282ed44b44eb4b878ac17306195ad12f12916\": rpc error: code = NotFound desc = could not find container \"7db8dd08296576b12cfca554683282ed44b44eb4b878ac17306195ad12f12916\": container with ID starting with 7db8dd08296576b12cfca554683282ed44b44eb4b878ac17306195ad12f12916 not found: ID does not exist"
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.714705 4898 scope.go:117] "RemoveContainer" containerID="affc0f5381f8e5277306c781cff3a6263e539776e352fbab467b081645fab210"
Mar 13 14:34:11 crc kubenswrapper[4898]: E0313 14:34:11.715116 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"affc0f5381f8e5277306c781cff3a6263e539776e352fbab467b081645fab210\": container with ID starting with affc0f5381f8e5277306c781cff3a6263e539776e352fbab467b081645fab210 not found: ID does not exist" containerID="affc0f5381f8e5277306c781cff3a6263e539776e352fbab467b081645fab210"
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.715176 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"affc0f5381f8e5277306c781cff3a6263e539776e352fbab467b081645fab210"} err="failed to get container status \"affc0f5381f8e5277306c781cff3a6263e539776e352fbab467b081645fab210\": rpc error: code = NotFound desc = could not find container \"affc0f5381f8e5277306c781cff3a6263e539776e352fbab467b081645fab210\": container with ID starting with affc0f5381f8e5277306c781cff3a6263e539776e352fbab467b081645fab210 not found: ID does not exist"
Mar 13 14:34:11 crc kubenswrapper[4898]: I0313 14:34:11.751270 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a1116c6-c423-4585-af50-c9ecdca3720e" path="/var/lib/kubelet/pods/5a1116c6-c423-4585-af50-c9ecdca3720e/volumes"
Mar 13 14:34:12 crc kubenswrapper[4898]: I0313 14:34:12.740427 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db"
Mar 13 14:34:12 crc kubenswrapper[4898]: E0313 14:34:12.741134 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d"
Mar 13 14:34:24 crc kubenswrapper[4898]: I0313 14:34:24.764262 4898 generic.go:334] "Generic (PLEG): container finished" podID="ac094822-6272-4730-ab0b-16f0116426b5" containerID="75351a338e412d11bdcadf255cb40c23212372a818b23c31d819578fbd7526fe" exitCode=0
Mar 13 14:34:24 crc kubenswrapper[4898]: I0313 14:34:24.765219 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz" event={"ID":"ac094822-6272-4730-ab0b-16f0116426b5","Type":"ContainerDied","Data":"75351a338e412d11bdcadf255cb40c23212372a818b23c31d819578fbd7526fe"}
Mar 13 14:34:26 crc kubenswrapper[4898]: I0313 14:34:26.386982 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz"
Mar 13 14:34:26 crc kubenswrapper[4898]: I0313 14:34:26.479810 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac094822-6272-4730-ab0b-16f0116426b5-inventory\") pod \"ac094822-6272-4730-ab0b-16f0116426b5\" (UID: \"ac094822-6272-4730-ab0b-16f0116426b5\") "
Mar 13 14:34:26 crc kubenswrapper[4898]: I0313 14:34:26.479992 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac094822-6272-4730-ab0b-16f0116426b5-ssh-key-openstack-edpm-ipam\") pod \"ac094822-6272-4730-ab0b-16f0116426b5\" (UID: \"ac094822-6272-4730-ab0b-16f0116426b5\") "
Mar 13 14:34:26 crc kubenswrapper[4898]: I0313 14:34:26.480156 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsqr5\" (UniqueName: \"kubernetes.io/projected/ac094822-6272-4730-ab0b-16f0116426b5-kube-api-access-zsqr5\") pod \"ac094822-6272-4730-ab0b-16f0116426b5\" (UID: \"ac094822-6272-4730-ab0b-16f0116426b5\") "
Mar 13 14:34:26 crc kubenswrapper[4898]: I0313 14:34:26.487163 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac094822-6272-4730-ab0b-16f0116426b5-kube-api-access-zsqr5" (OuterVolumeSpecName: "kube-api-access-zsqr5") pod "ac094822-6272-4730-ab0b-16f0116426b5" (UID: "ac094822-6272-4730-ab0b-16f0116426b5"). InnerVolumeSpecName "kube-api-access-zsqr5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:34:26 crc kubenswrapper[4898]: I0313 14:34:26.519476 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac094822-6272-4730-ab0b-16f0116426b5-inventory" (OuterVolumeSpecName: "inventory") pod "ac094822-6272-4730-ab0b-16f0116426b5" (UID: "ac094822-6272-4730-ab0b-16f0116426b5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:34:26 crc kubenswrapper[4898]: I0313 14:34:26.522761 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac094822-6272-4730-ab0b-16f0116426b5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ac094822-6272-4730-ab0b-16f0116426b5" (UID: "ac094822-6272-4730-ab0b-16f0116426b5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:34:26 crc kubenswrapper[4898]: I0313 14:34:26.583346 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsqr5\" (UniqueName: \"kubernetes.io/projected/ac094822-6272-4730-ab0b-16f0116426b5-kube-api-access-zsqr5\") on node \"crc\" DevicePath \"\""
Mar 13 14:34:26 crc kubenswrapper[4898]: I0313 14:34:26.583397 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac094822-6272-4730-ab0b-16f0116426b5-inventory\") on node \"crc\" DevicePath \"\""
Mar 13 14:34:26 crc kubenswrapper[4898]: I0313 14:34:26.583410 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac094822-6272-4730-ab0b-16f0116426b5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 13 14:34:26 crc kubenswrapper[4898]: I0313 14:34:26.740317 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db"
Mar 13 14:34:26 crc kubenswrapper[4898]: E0313 14:34:26.740651 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d"
Mar 13 14:34:26 crc kubenswrapper[4898]: I0313 14:34:26.794090 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz" event={"ID":"ac094822-6272-4730-ab0b-16f0116426b5","Type":"ContainerDied","Data":"704f436dc38746438fb453814824552f6f864e1b290a23b87caa2e264e34538d"}
Mar 13 14:34:26 crc kubenswrapper[4898]: I0313 14:34:26.794150 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="704f436dc38746438fb453814824552f6f864e1b290a23b87caa2e264e34538d"
Mar 13 14:34:26 crc kubenswrapper[4898]: I0313 14:34:26.794295 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.020955 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-x8wvs"]
Mar 13 14:34:27 crc kubenswrapper[4898]: E0313 14:34:27.021600 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a1116c6-c423-4585-af50-c9ecdca3720e" containerName="extract-content"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.021623 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a1116c6-c423-4585-af50-c9ecdca3720e" containerName="extract-content"
Mar 13 14:34:27 crc kubenswrapper[4898]: E0313 14:34:27.021654 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0528f01c-62c6-4665-9b64-b20182ed6aad" containerName="registry-server"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.021662 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0528f01c-62c6-4665-9b64-b20182ed6aad" containerName="registry-server"
Mar 13 14:34:27 crc kubenswrapper[4898]: E0313 14:34:27.021679 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0528f01c-62c6-4665-9b64-b20182ed6aad" containerName="extract-utilities"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.021687 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0528f01c-62c6-4665-9b64-b20182ed6aad" containerName="extract-utilities"
Mar 13 14:34:27 crc kubenswrapper[4898]: E0313 14:34:27.021712 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0528f01c-62c6-4665-9b64-b20182ed6aad" containerName="extract-content"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.021720 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0528f01c-62c6-4665-9b64-b20182ed6aad" containerName="extract-content"
Mar 13 14:34:27 crc kubenswrapper[4898]: E0313 14:34:27.021737 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb068c44-8492-4ed4-973b-f1233d9db645" containerName="oc"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.021744 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb068c44-8492-4ed4-973b-f1233d9db645" containerName="oc"
Mar 13 14:34:27 crc kubenswrapper[4898]: E0313 14:34:27.021766 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a1116c6-c423-4585-af50-c9ecdca3720e" containerName="registry-server"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.021773 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a1116c6-c423-4585-af50-c9ecdca3720e" containerName="registry-server"
Mar 13 14:34:27 crc kubenswrapper[4898]: E0313 14:34:27.021793 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac094822-6272-4730-ab0b-16f0116426b5" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.021802 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac094822-6272-4730-ab0b-16f0116426b5" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Mar 13 14:34:27 crc kubenswrapper[4898]: E0313 14:34:27.021828 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a1116c6-c423-4585-af50-c9ecdca3720e" containerName="extract-utilities"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.021836 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a1116c6-c423-4585-af50-c9ecdca3720e" containerName="extract-utilities"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.022108 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a1116c6-c423-4585-af50-c9ecdca3720e" containerName="registry-server"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.022127 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb068c44-8492-4ed4-973b-f1233d9db645" containerName="oc"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.022141 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac094822-6272-4730-ab0b-16f0116426b5" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.022163 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0528f01c-62c6-4665-9b64-b20182ed6aad" containerName="registry-server"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.023261 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-x8wvs"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.025467 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.027211 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zsddr"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.028307 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.029447 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.038991 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-x8wvs"]
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.097482 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e7c70549-1fc7-42c2-8c81-075c611671ae-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-x8wvs\" (UID: \"e7c70549-1fc7-42c2-8c81-075c611671ae\") " pod="openstack/ssh-known-hosts-edpm-deployment-x8wvs"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.098230 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkklg\" (UniqueName: \"kubernetes.io/projected/e7c70549-1fc7-42c2-8c81-075c611671ae-kube-api-access-nkklg\") pod \"ssh-known-hosts-edpm-deployment-x8wvs\" (UID: \"e7c70549-1fc7-42c2-8c81-075c611671ae\") " pod="openstack/ssh-known-hosts-edpm-deployment-x8wvs"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.098582 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7c70549-1fc7-42c2-8c81-075c611671ae-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-x8wvs\" (UID: \"e7c70549-1fc7-42c2-8c81-075c611671ae\") " pod="openstack/ssh-known-hosts-edpm-deployment-x8wvs"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.200969 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7c70549-1fc7-42c2-8c81-075c611671ae-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-x8wvs\" (UID: \"e7c70549-1fc7-42c2-8c81-075c611671ae\") " pod="openstack/ssh-known-hosts-edpm-deployment-x8wvs"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.201174 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e7c70549-1fc7-42c2-8c81-075c611671ae-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-x8wvs\" (UID: \"e7c70549-1fc7-42c2-8c81-075c611671ae\") " pod="openstack/ssh-known-hosts-edpm-deployment-x8wvs"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.201359 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkklg\" (UniqueName: \"kubernetes.io/projected/e7c70549-1fc7-42c2-8c81-075c611671ae-kube-api-access-nkklg\") pod \"ssh-known-hosts-edpm-deployment-x8wvs\" (UID: \"e7c70549-1fc7-42c2-8c81-075c611671ae\") " pod="openstack/ssh-known-hosts-edpm-deployment-x8wvs"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.206503 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e7c70549-1fc7-42c2-8c81-075c611671ae-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-x8wvs\" (UID: \"e7c70549-1fc7-42c2-8c81-075c611671ae\") " pod="openstack/ssh-known-hosts-edpm-deployment-x8wvs"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.207399 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7c70549-1fc7-42c2-8c81-075c611671ae-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-x8wvs\" (UID: \"e7c70549-1fc7-42c2-8c81-075c611671ae\") " pod="openstack/ssh-known-hosts-edpm-deployment-x8wvs"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.225630 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkklg\" (UniqueName: \"kubernetes.io/projected/e7c70549-1fc7-42c2-8c81-075c611671ae-kube-api-access-nkklg\") pod \"ssh-known-hosts-edpm-deployment-x8wvs\" (UID: \"e7c70549-1fc7-42c2-8c81-075c611671ae\") " pod="openstack/ssh-known-hosts-edpm-deployment-x8wvs"
Mar 13 14:34:27 crc kubenswrapper[4898]: I0313 14:34:27.362668 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-x8wvs"
Mar 13 14:34:28 crc kubenswrapper[4898]: I0313 14:34:28.014130 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-x8wvs"]
Mar 13 14:34:28 crc kubenswrapper[4898]: I0313 14:34:28.820372 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-x8wvs" event={"ID":"e7c70549-1fc7-42c2-8c81-075c611671ae","Type":"ContainerStarted","Data":"e0e9da157b1c32a35018510698432124395ad27481524dc7ff26e350853d5be5"}
Mar 13 14:34:29 crc kubenswrapper[4898]: I0313 14:34:29.860843 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-x8wvs" event={"ID":"e7c70549-1fc7-42c2-8c81-075c611671ae","Type":"ContainerStarted","Data":"b3c420da5c21ae8a4f33923e8d7d1d38ba37569fd225b705660bb76182968d0f"}
Mar 13 14:34:29 crc kubenswrapper[4898]: I0313 14:34:29.899158 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-x8wvs" podStartSLOduration=3.339618398 podStartE2EDuration="3.89912204s" podCreationTimestamp="2026-03-13 14:34:26 +0000 UTC" firstStartedPulling="2026-03-13 14:34:28.031112841 +0000 UTC m=+2303.032701070" lastFinishedPulling="2026-03-13 14:34:28.590616433 +0000 UTC m=+2303.592204712" observedRunningTime="2026-03-13 14:34:29.883480895 +0000 UTC m=+2304.885069174" watchObservedRunningTime="2026-03-13 14:34:29.89912204 +0000 UTC m=+2304.900710319"
Mar 13 14:34:35 crc kubenswrapper[4898]: I0313 14:34:35.946121 4898 generic.go:334] "Generic (PLEG): container finished" podID="e7c70549-1fc7-42c2-8c81-075c611671ae" containerID="b3c420da5c21ae8a4f33923e8d7d1d38ba37569fd225b705660bb76182968d0f" exitCode=0
Mar 13 14:34:35 crc kubenswrapper[4898]: I0313 14:34:35.946231 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-x8wvs" event={"ID":"e7c70549-1fc7-42c2-8c81-075c611671ae","Type":"ContainerDied","Data":"b3c420da5c21ae8a4f33923e8d7d1d38ba37569fd225b705660bb76182968d0f"}
Mar 13 14:34:37 crc kubenswrapper[4898]: I0313 14:34:37.572679 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-x8wvs"
Mar 13 14:34:37 crc kubenswrapper[4898]: I0313 14:34:37.723371 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e7c70549-1fc7-42c2-8c81-075c611671ae-inventory-0\") pod \"e7c70549-1fc7-42c2-8c81-075c611671ae\" (UID: \"e7c70549-1fc7-42c2-8c81-075c611671ae\") "
Mar 13 14:34:37 crc kubenswrapper[4898]: I0313 14:34:37.723525 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkklg\" (UniqueName: \"kubernetes.io/projected/e7c70549-1fc7-42c2-8c81-075c611671ae-kube-api-access-nkklg\") pod \"e7c70549-1fc7-42c2-8c81-075c611671ae\" (UID: \"e7c70549-1fc7-42c2-8c81-075c611671ae\") "
Mar 13 14:34:37 crc kubenswrapper[4898]: I0313 14:34:37.723702 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7c70549-1fc7-42c2-8c81-075c611671ae-ssh-key-openstack-edpm-ipam\") pod \"e7c70549-1fc7-42c2-8c81-075c611671ae\" (UID: \"e7c70549-1fc7-42c2-8c81-075c611671ae\") "
Mar 13 14:34:37 crc kubenswrapper[4898]: I0313 14:34:37.729659 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7c70549-1fc7-42c2-8c81-075c611671ae-kube-api-access-nkklg" (OuterVolumeSpecName: "kube-api-access-nkklg") pod "e7c70549-1fc7-42c2-8c81-075c611671ae" (UID: "e7c70549-1fc7-42c2-8c81-075c611671ae"). InnerVolumeSpecName "kube-api-access-nkklg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:34:37 crc kubenswrapper[4898]: I0313 14:34:37.743827 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db"
Mar 13 14:34:37 crc kubenswrapper[4898]: E0313 14:34:37.745002 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d"
Mar 13 14:34:37 crc kubenswrapper[4898]: I0313 14:34:37.757238 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7c70549-1fc7-42c2-8c81-075c611671ae-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e7c70549-1fc7-42c2-8c81-075c611671ae" (UID: "e7c70549-1fc7-42c2-8c81-075c611671ae"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:34:37 crc kubenswrapper[4898]: I0313 14:34:37.783215 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7c70549-1fc7-42c2-8c81-075c611671ae-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "e7c70549-1fc7-42c2-8c81-075c611671ae" (UID: "e7c70549-1fc7-42c2-8c81-075c611671ae"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:34:37 crc kubenswrapper[4898]: I0313 14:34:37.826610 4898 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e7c70549-1fc7-42c2-8c81-075c611671ae-inventory-0\") on node \"crc\" DevicePath \"\""
Mar 13 14:34:37 crc kubenswrapper[4898]: I0313 14:34:37.826645 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkklg\" (UniqueName: \"kubernetes.io/projected/e7c70549-1fc7-42c2-8c81-075c611671ae-kube-api-access-nkklg\") on node \"crc\" DevicePath \"\""
Mar 13 14:34:37 crc kubenswrapper[4898]: I0313 14:34:37.826658 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7c70549-1fc7-42c2-8c81-075c611671ae-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 13 14:34:37 crc kubenswrapper[4898]: I0313 14:34:37.972459 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-x8wvs" event={"ID":"e7c70549-1fc7-42c2-8c81-075c611671ae","Type":"ContainerDied","Data":"e0e9da157b1c32a35018510698432124395ad27481524dc7ff26e350853d5be5"}
Mar 13 14:34:37 crc kubenswrapper[4898]: I0313 14:34:37.972497 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0e9da157b1c32a35018510698432124395ad27481524dc7ff26e350853d5be5"
Mar 13 14:34:37 crc kubenswrapper[4898]: I0313 14:34:37.972556 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-x8wvs"
Mar 13 14:34:38 crc kubenswrapper[4898]: I0313 14:34:38.081470 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjx4b"]
Mar 13 14:34:38 crc kubenswrapper[4898]: E0313 14:34:38.082448 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7c70549-1fc7-42c2-8c81-075c611671ae" containerName="ssh-known-hosts-edpm-deployment"
Mar 13 14:34:38 crc kubenswrapper[4898]: I0313 14:34:38.082491 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7c70549-1fc7-42c2-8c81-075c611671ae" containerName="ssh-known-hosts-edpm-deployment"
Mar 13 14:34:38 crc kubenswrapper[4898]: I0313 14:34:38.083092 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7c70549-1fc7-42c2-8c81-075c611671ae" containerName="ssh-known-hosts-edpm-deployment"
Mar 13 14:34:38 crc kubenswrapper[4898]: I0313 14:34:38.085047 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjx4b"
Mar 13 14:34:38 crc kubenswrapper[4898]: I0313 14:34:38.087685 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zsddr"
Mar 13 14:34:38 crc kubenswrapper[4898]: I0313 14:34:38.087684 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 13 14:34:38 crc kubenswrapper[4898]: I0313 14:34:38.087807 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 13 14:34:38 crc kubenswrapper[4898]: I0313 14:34:38.088824 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 13 14:34:38 crc kubenswrapper[4898]: I0313 14:34:38.113846 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjx4b"]
Mar 13 14:34:38 crc kubenswrapper[4898]: I0313 14:34:38.235703 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bh28\" (UniqueName: \"kubernetes.io/projected/05d3f0e4-c029-4e2f-a3c1-471faa671767-kube-api-access-7bh28\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sjx4b\" (UID: \"05d3f0e4-c029-4e2f-a3c1-471faa671767\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjx4b"
Mar 13 14:34:38 crc kubenswrapper[4898]: I0313 14:34:38.235869 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05d3f0e4-c029-4e2f-a3c1-471faa671767-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sjx4b\" (UID: \"05d3f0e4-c029-4e2f-a3c1-471faa671767\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjx4b"
Mar 13 14:34:38 crc kubenswrapper[4898]: I0313 14:34:38.235922 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05d3f0e4-c029-4e2f-a3c1-471faa671767-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sjx4b\" (UID: \"05d3f0e4-c029-4e2f-a3c1-471faa671767\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjx4b"
Mar 13 14:34:38 crc kubenswrapper[4898]: I0313 14:34:38.338795 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05d3f0e4-c029-4e2f-a3c1-471faa671767-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sjx4b\" (UID: \"05d3f0e4-c029-4e2f-a3c1-471faa671767\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjx4b"
Mar 13 14:34:38 crc kubenswrapper[4898]: I0313 14:34:38.338843 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05d3f0e4-c029-4e2f-a3c1-471faa671767-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sjx4b\" (UID: \"05d3f0e4-c029-4e2f-a3c1-471faa671767\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjx4b"
Mar 13 14:34:38 crc kubenswrapper[4898]: I0313 14:34:38.339027 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bh28\" (UniqueName: \"kubernetes.io/projected/05d3f0e4-c029-4e2f-a3c1-471faa671767-kube-api-access-7bh28\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sjx4b\" (UID: \"05d3f0e4-c029-4e2f-a3c1-471faa671767\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjx4b"
Mar 13 14:34:38 crc kubenswrapper[4898]: I0313 14:34:38.346666 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05d3f0e4-c029-4e2f-a3c1-471faa671767-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sjx4b\" (UID: \"05d3f0e4-c029-4e2f-a3c1-471faa671767\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjx4b"
Mar 13 14:34:38 crc kubenswrapper[4898]: I0313 14:34:38.353870 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05d3f0e4-c029-4e2f-a3c1-471faa671767-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sjx4b\" (UID: \"05d3f0e4-c029-4e2f-a3c1-471faa671767\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjx4b"
Mar 13 14:34:38 crc kubenswrapper[4898]: I0313 14:34:38.369638 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bh28\" (UniqueName: \"kubernetes.io/projected/05d3f0e4-c029-4e2f-a3c1-471faa671767-kube-api-access-7bh28\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-sjx4b\" (UID: \"05d3f0e4-c029-4e2f-a3c1-471faa671767\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjx4b"
Mar 13 14:34:38 crc kubenswrapper[4898]: I0313 14:34:38.417478 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjx4b" Mar 13 14:34:39 crc kubenswrapper[4898]: I0313 14:34:39.032016 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjx4b"] Mar 13 14:34:39 crc kubenswrapper[4898]: W0313 14:34:39.037035 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05d3f0e4_c029_4e2f_a3c1_471faa671767.slice/crio-c31aab1886b7f3a89065d0dc4d79b7428ae41e2dc20698018084260c555bce99 WatchSource:0}: Error finding container c31aab1886b7f3a89065d0dc4d79b7428ae41e2dc20698018084260c555bce99: Status 404 returned error can't find the container with id c31aab1886b7f3a89065d0dc4d79b7428ae41e2dc20698018084260c555bce99 Mar 13 14:34:40 crc kubenswrapper[4898]: I0313 14:34:40.000320 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjx4b" event={"ID":"05d3f0e4-c029-4e2f-a3c1-471faa671767","Type":"ContainerStarted","Data":"ed700be5cb65d6fed4e57657125463e8261e0c0244d561f570e062a4cc5f6b21"} Mar 13 14:34:40 crc kubenswrapper[4898]: I0313 14:34:40.000789 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjx4b" event={"ID":"05d3f0e4-c029-4e2f-a3c1-471faa671767","Type":"ContainerStarted","Data":"c31aab1886b7f3a89065d0dc4d79b7428ae41e2dc20698018084260c555bce99"} Mar 13 14:34:40 crc kubenswrapper[4898]: I0313 14:34:40.029643 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjx4b" podStartSLOduration=1.492955453 podStartE2EDuration="2.029625198s" podCreationTimestamp="2026-03-13 14:34:38 +0000 UTC" firstStartedPulling="2026-03-13 14:34:39.041093748 +0000 UTC m=+2314.042681997" lastFinishedPulling="2026-03-13 14:34:39.577763473 +0000 UTC m=+2314.579351742" observedRunningTime="2026-03-13 
14:34:40.019330218 +0000 UTC m=+2315.020918497" watchObservedRunningTime="2026-03-13 14:34:40.029625198 +0000 UTC m=+2315.031213447" Mar 13 14:34:40 crc kubenswrapper[4898]: I0313 14:34:40.115151 4898 scope.go:117] "RemoveContainer" containerID="390b3ed23857cd84f527a8c8b365a228b6c0b2caebb2767f64baba810ca56690" Mar 13 14:34:40 crc kubenswrapper[4898]: I0313 14:34:40.163120 4898 scope.go:117] "RemoveContainer" containerID="23e456f4a6227ca0f6e6f99f4c35a21b09d57519ec2a733d94a113420fb1a340" Mar 13 14:34:48 crc kubenswrapper[4898]: E0313 14:34:48.293282 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05d3f0e4_c029_4e2f_a3c1_471faa671767.slice/crio-conmon-ed700be5cb65d6fed4e57657125463e8261e0c0244d561f570e062a4cc5f6b21.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05d3f0e4_c029_4e2f_a3c1_471faa671767.slice/crio-ed700be5cb65d6fed4e57657125463e8261e0c0244d561f570e062a4cc5f6b21.scope\": RecentStats: unable to find data in memory cache]" Mar 13 14:34:48 crc kubenswrapper[4898]: E0313 14:34:48.293743 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05d3f0e4_c029_4e2f_a3c1_471faa671767.slice/crio-ed700be5cb65d6fed4e57657125463e8261e0c0244d561f570e062a4cc5f6b21.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05d3f0e4_c029_4e2f_a3c1_471faa671767.slice/crio-conmon-ed700be5cb65d6fed4e57657125463e8261e0c0244d561f570e062a4cc5f6b21.scope\": RecentStats: unable to find data in memory cache]" Mar 13 14:34:49 crc kubenswrapper[4898]: I0313 14:34:49.181602 4898 generic.go:334] "Generic (PLEG): container finished" podID="05d3f0e4-c029-4e2f-a3c1-471faa671767" 
containerID="ed700be5cb65d6fed4e57657125463e8261e0c0244d561f570e062a4cc5f6b21" exitCode=0 Mar 13 14:34:49 crc kubenswrapper[4898]: I0313 14:34:49.181658 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjx4b" event={"ID":"05d3f0e4-c029-4e2f-a3c1-471faa671767","Type":"ContainerDied","Data":"ed700be5cb65d6fed4e57657125463e8261e0c0244d561f570e062a4cc5f6b21"} Mar 13 14:34:50 crc kubenswrapper[4898]: I0313 14:34:50.820005 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjx4b" Mar 13 14:34:50 crc kubenswrapper[4898]: I0313 14:34:50.934708 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05d3f0e4-c029-4e2f-a3c1-471faa671767-ssh-key-openstack-edpm-ipam\") pod \"05d3f0e4-c029-4e2f-a3c1-471faa671767\" (UID: \"05d3f0e4-c029-4e2f-a3c1-471faa671767\") " Mar 13 14:34:50 crc kubenswrapper[4898]: I0313 14:34:50.934934 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bh28\" (UniqueName: \"kubernetes.io/projected/05d3f0e4-c029-4e2f-a3c1-471faa671767-kube-api-access-7bh28\") pod \"05d3f0e4-c029-4e2f-a3c1-471faa671767\" (UID: \"05d3f0e4-c029-4e2f-a3c1-471faa671767\") " Mar 13 14:34:50 crc kubenswrapper[4898]: I0313 14:34:50.935235 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05d3f0e4-c029-4e2f-a3c1-471faa671767-inventory\") pod \"05d3f0e4-c029-4e2f-a3c1-471faa671767\" (UID: \"05d3f0e4-c029-4e2f-a3c1-471faa671767\") " Mar 13 14:34:50 crc kubenswrapper[4898]: I0313 14:34:50.942554 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05d3f0e4-c029-4e2f-a3c1-471faa671767-kube-api-access-7bh28" (OuterVolumeSpecName: "kube-api-access-7bh28") pod 
"05d3f0e4-c029-4e2f-a3c1-471faa671767" (UID: "05d3f0e4-c029-4e2f-a3c1-471faa671767"). InnerVolumeSpecName "kube-api-access-7bh28". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:34:50 crc kubenswrapper[4898]: I0313 14:34:50.983123 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05d3f0e4-c029-4e2f-a3c1-471faa671767-inventory" (OuterVolumeSpecName: "inventory") pod "05d3f0e4-c029-4e2f-a3c1-471faa671767" (UID: "05d3f0e4-c029-4e2f-a3c1-471faa671767"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:34:50 crc kubenswrapper[4898]: I0313 14:34:50.996370 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05d3f0e4-c029-4e2f-a3c1-471faa671767-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "05d3f0e4-c029-4e2f-a3c1-471faa671767" (UID: "05d3f0e4-c029-4e2f-a3c1-471faa671767"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.041020 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/05d3f0e4-c029-4e2f-a3c1-471faa671767-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.041453 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/05d3f0e4-c029-4e2f-a3c1-471faa671767-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.041481 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bh28\" (UniqueName: \"kubernetes.io/projected/05d3f0e4-c029-4e2f-a3c1-471faa671767-kube-api-access-7bh28\") on node \"crc\" DevicePath \"\"" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.214342 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjx4b" event={"ID":"05d3f0e4-c029-4e2f-a3c1-471faa671767","Type":"ContainerDied","Data":"c31aab1886b7f3a89065d0dc4d79b7428ae41e2dc20698018084260c555bce99"} Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.214406 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c31aab1886b7f3a89065d0dc4d79b7428ae41e2dc20698018084260c555bce99" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.214437 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-sjx4b" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.306774 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2"] Mar 13 14:34:51 crc kubenswrapper[4898]: E0313 14:34:51.307758 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05d3f0e4-c029-4e2f-a3c1-471faa671767" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.307790 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="05d3f0e4-c029-4e2f-a3c1-471faa671767" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.308233 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="05d3f0e4-c029-4e2f-a3c1-471faa671767" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.310101 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.312587 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.312742 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.312871 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.313435 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zsddr" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.335404 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2"] Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.455992 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a674c4a-b209-4ea0-83b0-c46f820a81ef-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2\" (UID: \"8a674c4a-b209-4ea0-83b0-c46f820a81ef\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.456120 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbvxh\" (UniqueName: \"kubernetes.io/projected/8a674c4a-b209-4ea0-83b0-c46f820a81ef-kube-api-access-lbvxh\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2\" (UID: \"8a674c4a-b209-4ea0-83b0-c46f820a81ef\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.456260 4898 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8a674c4a-b209-4ea0-83b0-c46f820a81ef-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2\" (UID: \"8a674c4a-b209-4ea0-83b0-c46f820a81ef\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.558594 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8a674c4a-b209-4ea0-83b0-c46f820a81ef-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2\" (UID: \"8a674c4a-b209-4ea0-83b0-c46f820a81ef\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.558755 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a674c4a-b209-4ea0-83b0-c46f820a81ef-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2\" (UID: \"8a674c4a-b209-4ea0-83b0-c46f820a81ef\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.558821 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbvxh\" (UniqueName: \"kubernetes.io/projected/8a674c4a-b209-4ea0-83b0-c46f820a81ef-kube-api-access-lbvxh\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2\" (UID: \"8a674c4a-b209-4ea0-83b0-c46f820a81ef\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.565321 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/8a674c4a-b209-4ea0-83b0-c46f820a81ef-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2\" (UID: \"8a674c4a-b209-4ea0-83b0-c46f820a81ef\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.566325 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a674c4a-b209-4ea0-83b0-c46f820a81ef-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2\" (UID: \"8a674c4a-b209-4ea0-83b0-c46f820a81ef\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.587644 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbvxh\" (UniqueName: \"kubernetes.io/projected/8a674c4a-b209-4ea0-83b0-c46f820a81ef-kube-api-access-lbvxh\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2\" (UID: \"8a674c4a-b209-4ea0-83b0-c46f820a81ef\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.634590 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2" Mar 13 14:34:51 crc kubenswrapper[4898]: I0313 14:34:51.740074 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" Mar 13 14:34:51 crc kubenswrapper[4898]: E0313 14:34:51.740644 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:34:52 crc kubenswrapper[4898]: I0313 14:34:52.340353 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2"] Mar 13 14:34:52 crc kubenswrapper[4898]: W0313 14:34:52.346031 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a674c4a_b209_4ea0_83b0_c46f820a81ef.slice/crio-f675076d96f0d80dec8b437957588d6a2cd269730bfb83e736cbc1f9dc1093c1 WatchSource:0}: Error finding container f675076d96f0d80dec8b437957588d6a2cd269730bfb83e736cbc1f9dc1093c1: Status 404 returned error can't find the container with id f675076d96f0d80dec8b437957588d6a2cd269730bfb83e736cbc1f9dc1093c1 Mar 13 14:34:53 crc kubenswrapper[4898]: I0313 14:34:53.240008 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2" event={"ID":"8a674c4a-b209-4ea0-83b0-c46f820a81ef","Type":"ContainerStarted","Data":"a6752dd3dce237a9a2a6569325a7b2fac6103bbdc3acf7b6791bf41cab42bec9"} Mar 13 14:34:53 crc kubenswrapper[4898]: I0313 14:34:53.240733 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2" event={"ID":"8a674c4a-b209-4ea0-83b0-c46f820a81ef","Type":"ContainerStarted","Data":"f675076d96f0d80dec8b437957588d6a2cd269730bfb83e736cbc1f9dc1093c1"} Mar 13 14:34:53 crc kubenswrapper[4898]: I0313 14:34:53.267939 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2" podStartSLOduration=1.75544904 podStartE2EDuration="2.267917875s" podCreationTimestamp="2026-03-13 14:34:51 +0000 UTC" firstStartedPulling="2026-03-13 14:34:52.351233258 +0000 UTC m=+2327.352821507" lastFinishedPulling="2026-03-13 14:34:52.863702073 +0000 UTC m=+2327.865290342" observedRunningTime="2026-03-13 14:34:53.258273872 +0000 UTC m=+2328.259862121" watchObservedRunningTime="2026-03-13 14:34:53.267917875 +0000 UTC m=+2328.269506114" Mar 13 14:35:02 crc kubenswrapper[4898]: I0313 14:35:02.771153 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pn2m4"] Mar 13 14:35:02 crc kubenswrapper[4898]: I0313 14:35:02.786370 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pn2m4" Mar 13 14:35:02 crc kubenswrapper[4898]: I0313 14:35:02.800366 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pn2m4"] Mar 13 14:35:02 crc kubenswrapper[4898]: I0313 14:35:02.906800 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx6lb\" (UniqueName: \"kubernetes.io/projected/f6bb9c39-7999-48d1-9223-d7408aa31f47-kube-api-access-kx6lb\") pod \"certified-operators-pn2m4\" (UID: \"f6bb9c39-7999-48d1-9223-d7408aa31f47\") " pod="openshift-marketplace/certified-operators-pn2m4" Mar 13 14:35:02 crc kubenswrapper[4898]: I0313 14:35:02.910721 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6bb9c39-7999-48d1-9223-d7408aa31f47-utilities\") pod \"certified-operators-pn2m4\" (UID: \"f6bb9c39-7999-48d1-9223-d7408aa31f47\") " pod="openshift-marketplace/certified-operators-pn2m4" Mar 13 14:35:02 crc kubenswrapper[4898]: I0313 14:35:02.911550 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6bb9c39-7999-48d1-9223-d7408aa31f47-catalog-content\") pod \"certified-operators-pn2m4\" (UID: \"f6bb9c39-7999-48d1-9223-d7408aa31f47\") " pod="openshift-marketplace/certified-operators-pn2m4" Mar 13 14:35:03 crc kubenswrapper[4898]: I0313 14:35:03.015973 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6bb9c39-7999-48d1-9223-d7408aa31f47-catalog-content\") pod \"certified-operators-pn2m4\" (UID: \"f6bb9c39-7999-48d1-9223-d7408aa31f47\") " pod="openshift-marketplace/certified-operators-pn2m4" Mar 13 14:35:03 crc kubenswrapper[4898]: I0313 14:35:03.016327 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-kx6lb\" (UniqueName: \"kubernetes.io/projected/f6bb9c39-7999-48d1-9223-d7408aa31f47-kube-api-access-kx6lb\") pod \"certified-operators-pn2m4\" (UID: \"f6bb9c39-7999-48d1-9223-d7408aa31f47\") " pod="openshift-marketplace/certified-operators-pn2m4" Mar 13 14:35:03 crc kubenswrapper[4898]: I0313 14:35:03.016642 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6bb9c39-7999-48d1-9223-d7408aa31f47-utilities\") pod \"certified-operators-pn2m4\" (UID: \"f6bb9c39-7999-48d1-9223-d7408aa31f47\") " pod="openshift-marketplace/certified-operators-pn2m4" Mar 13 14:35:03 crc kubenswrapper[4898]: I0313 14:35:03.016508 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6bb9c39-7999-48d1-9223-d7408aa31f47-catalog-content\") pod \"certified-operators-pn2m4\" (UID: \"f6bb9c39-7999-48d1-9223-d7408aa31f47\") " pod="openshift-marketplace/certified-operators-pn2m4" Mar 13 14:35:03 crc kubenswrapper[4898]: I0313 14:35:03.017213 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6bb9c39-7999-48d1-9223-d7408aa31f47-utilities\") pod \"certified-operators-pn2m4\" (UID: \"f6bb9c39-7999-48d1-9223-d7408aa31f47\") " pod="openshift-marketplace/certified-operators-pn2m4" Mar 13 14:35:03 crc kubenswrapper[4898]: I0313 14:35:03.047725 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx6lb\" (UniqueName: \"kubernetes.io/projected/f6bb9c39-7999-48d1-9223-d7408aa31f47-kube-api-access-kx6lb\") pod \"certified-operators-pn2m4\" (UID: \"f6bb9c39-7999-48d1-9223-d7408aa31f47\") " pod="openshift-marketplace/certified-operators-pn2m4" Mar 13 14:35:03 crc kubenswrapper[4898]: I0313 14:35:03.125249 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pn2m4" Mar 13 14:35:03 crc kubenswrapper[4898]: I0313 14:35:03.406062 4898 generic.go:334] "Generic (PLEG): container finished" podID="8a674c4a-b209-4ea0-83b0-c46f820a81ef" containerID="a6752dd3dce237a9a2a6569325a7b2fac6103bbdc3acf7b6791bf41cab42bec9" exitCode=0 Mar 13 14:35:03 crc kubenswrapper[4898]: I0313 14:35:03.406246 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2" event={"ID":"8a674c4a-b209-4ea0-83b0-c46f820a81ef","Type":"ContainerDied","Data":"a6752dd3dce237a9a2a6569325a7b2fac6103bbdc3acf7b6791bf41cab42bec9"} Mar 13 14:35:03 crc kubenswrapper[4898]: I0313 14:35:03.692950 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pn2m4"] Mar 13 14:35:04 crc kubenswrapper[4898]: I0313 14:35:04.418955 4898 generic.go:334] "Generic (PLEG): container finished" podID="f6bb9c39-7999-48d1-9223-d7408aa31f47" containerID="6747046a2f3b217dc89e876d5e6f55535fc4c9de9dd5242237e2cb0cede37a73" exitCode=0 Mar 13 14:35:04 crc kubenswrapper[4898]: I0313 14:35:04.419412 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pn2m4" event={"ID":"f6bb9c39-7999-48d1-9223-d7408aa31f47","Type":"ContainerDied","Data":"6747046a2f3b217dc89e876d5e6f55535fc4c9de9dd5242237e2cb0cede37a73"} Mar 13 14:35:04 crc kubenswrapper[4898]: I0313 14:35:04.419444 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pn2m4" event={"ID":"f6bb9c39-7999-48d1-9223-d7408aa31f47","Type":"ContainerStarted","Data":"bfcd0705440e9b0ea6531f51eb75c71426d7ce4348588ec527180e05d3f093f4"} Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.161749 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.197854 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbvxh\" (UniqueName: \"kubernetes.io/projected/8a674c4a-b209-4ea0-83b0-c46f820a81ef-kube-api-access-lbvxh\") pod \"8a674c4a-b209-4ea0-83b0-c46f820a81ef\" (UID: \"8a674c4a-b209-4ea0-83b0-c46f820a81ef\") " Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.198098 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a674c4a-b209-4ea0-83b0-c46f820a81ef-inventory\") pod \"8a674c4a-b209-4ea0-83b0-c46f820a81ef\" (UID: \"8a674c4a-b209-4ea0-83b0-c46f820a81ef\") " Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.198318 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8a674c4a-b209-4ea0-83b0-c46f820a81ef-ssh-key-openstack-edpm-ipam\") pod \"8a674c4a-b209-4ea0-83b0-c46f820a81ef\" (UID: \"8a674c4a-b209-4ea0-83b0-c46f820a81ef\") " Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.229413 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a674c4a-b209-4ea0-83b0-c46f820a81ef-kube-api-access-lbvxh" (OuterVolumeSpecName: "kube-api-access-lbvxh") pod "8a674c4a-b209-4ea0-83b0-c46f820a81ef" (UID: "8a674c4a-b209-4ea0-83b0-c46f820a81ef"). InnerVolumeSpecName "kube-api-access-lbvxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.260408 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a674c4a-b209-4ea0-83b0-c46f820a81ef-inventory" (OuterVolumeSpecName: "inventory") pod "8a674c4a-b209-4ea0-83b0-c46f820a81ef" (UID: "8a674c4a-b209-4ea0-83b0-c46f820a81ef"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.304325 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbvxh\" (UniqueName: \"kubernetes.io/projected/8a674c4a-b209-4ea0-83b0-c46f820a81ef-kube-api-access-lbvxh\") on node \"crc\" DevicePath \"\"" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.304359 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a674c4a-b209-4ea0-83b0-c46f820a81ef-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.348008 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a674c4a-b209-4ea0-83b0-c46f820a81ef-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8a674c4a-b209-4ea0-83b0-c46f820a81ef" (UID: "8a674c4a-b209-4ea0-83b0-c46f820a81ef"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.407070 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8a674c4a-b209-4ea0-83b0-c46f820a81ef-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.441174 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2" event={"ID":"8a674c4a-b209-4ea0-83b0-c46f820a81ef","Type":"ContainerDied","Data":"f675076d96f0d80dec8b437957588d6a2cd269730bfb83e736cbc1f9dc1093c1"} Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.441230 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f675076d96f0d80dec8b437957588d6a2cd269730bfb83e736cbc1f9dc1093c1" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.441290 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.542616 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"] Mar 13 14:35:05 crc kubenswrapper[4898]: E0313 14:35:05.543194 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a674c4a-b209-4ea0-83b0-c46f820a81ef" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.543211 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a674c4a-b209-4ea0-83b0-c46f820a81ef" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.543445 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a674c4a-b209-4ea0-83b0-c46f820a81ef" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.544356 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.548381 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.548526 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.548757 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.548979 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.549125 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.549249 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.549355 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.549468 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.549573 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zsddr" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.554628 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"] Mar 13 14:35:05 crc 
kubenswrapper[4898]: I0313 14:35:05.715346 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.715403 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.715441 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.715489 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.716137 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.716250 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.716313 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.716400 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: 
\"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.716473 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.716651 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.716697 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.716793 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-ovn-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.716869 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.716981 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.717139 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.717346 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8wdb\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-kube-api-access-j8wdb\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: 
\"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.749379 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" Mar 13 14:35:05 crc kubenswrapper[4898]: E0313 14:35:05.749796 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.819377 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8wdb\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-kube-api-access-j8wdb\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.819445 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.819466 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-bootstrap-combined-ca-bundle\") 
pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.819485 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.819519 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.819567 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.819608 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.819628 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.819653 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.819679 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.819737 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-neutron-metadata-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.819757 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.819784 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.819816 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.819849 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: 
\"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.819884 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.825159 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.825190 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.825198 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.825380 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.825496 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.825606 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.825702 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-nova-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.825777 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.826048 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.827151 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.828358 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.828380 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.828434 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.830859 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.831715 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.833354 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-libvirt-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.835054 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.835841 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.836494 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.837308 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-ovn-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.838947 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.840025 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.847748 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8wdb\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-kube-api-access-j8wdb\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-v8np4\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.862931 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zsddr" Mar 13 14:35:05 crc kubenswrapper[4898]: I0313 14:35:05.870391 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:06 crc kubenswrapper[4898]: I0313 14:35:06.465392 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pn2m4" event={"ID":"f6bb9c39-7999-48d1-9223-d7408aa31f47","Type":"ContainerStarted","Data":"f831d93e0380c3eac2c80a54bbf98f48dd5744a6bf12651a02a2962fc6ee89ff"} Mar 13 14:35:06 crc kubenswrapper[4898]: I0313 14:35:06.480206 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4"] Mar 13 14:35:06 crc kubenswrapper[4898]: I0313 14:35:06.983041 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 14:35:07 crc kubenswrapper[4898]: I0313 14:35:07.490674 4898 generic.go:334] "Generic (PLEG): container finished" podID="f6bb9c39-7999-48d1-9223-d7408aa31f47" containerID="f831d93e0380c3eac2c80a54bbf98f48dd5744a6bf12651a02a2962fc6ee89ff" exitCode=0 Mar 13 14:35:07 crc kubenswrapper[4898]: I0313 14:35:07.490815 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pn2m4" event={"ID":"f6bb9c39-7999-48d1-9223-d7408aa31f47","Type":"ContainerDied","Data":"f831d93e0380c3eac2c80a54bbf98f48dd5744a6bf12651a02a2962fc6ee89ff"} Mar 13 14:35:07 crc kubenswrapper[4898]: I0313 14:35:07.494439 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" event={"ID":"8c6bec5a-faac-4793-8c18-9f5b2faf2c95","Type":"ContainerStarted","Data":"4d4d0c6bb15a7ffb8076e9638afa814524f427eb62b0c041389d30a596cbe573"} Mar 13 14:35:07 crc kubenswrapper[4898]: I0313 14:35:07.494495 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" 
event={"ID":"8c6bec5a-faac-4793-8c18-9f5b2faf2c95","Type":"ContainerStarted","Data":"0f88e54431807638c0cb8ce49f8e1de1c35418597acc9f02e4aaae31eeb717ac"} Mar 13 14:35:07 crc kubenswrapper[4898]: I0313 14:35:07.571506 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" podStartSLOduration=2.070900767 podStartE2EDuration="2.57147057s" podCreationTimestamp="2026-03-13 14:35:05 +0000 UTC" firstStartedPulling="2026-03-13 14:35:06.47951496 +0000 UTC m=+2341.481103209" lastFinishedPulling="2026-03-13 14:35:06.980084763 +0000 UTC m=+2341.981673012" observedRunningTime="2026-03-13 14:35:07.55643724 +0000 UTC m=+2342.558025499" watchObservedRunningTime="2026-03-13 14:35:07.57147057 +0000 UTC m=+2342.573058849" Mar 13 14:35:08 crc kubenswrapper[4898]: I0313 14:35:08.512824 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pn2m4" event={"ID":"f6bb9c39-7999-48d1-9223-d7408aa31f47","Type":"ContainerStarted","Data":"3a9399a0e65bba8a088956e07dc644ccc3d505a7a24e307c444048c5dcc01493"} Mar 13 14:35:08 crc kubenswrapper[4898]: I0313 14:35:08.541018 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pn2m4" podStartSLOduration=3.019966021 podStartE2EDuration="6.54099266s" podCreationTimestamp="2026-03-13 14:35:02 +0000 UTC" firstStartedPulling="2026-03-13 14:35:04.422202034 +0000 UTC m=+2339.423790273" lastFinishedPulling="2026-03-13 14:35:07.943228653 +0000 UTC m=+2342.944816912" observedRunningTime="2026-03-13 14:35:08.538332073 +0000 UTC m=+2343.539920352" watchObservedRunningTime="2026-03-13 14:35:08.54099266 +0000 UTC m=+2343.542580929" Mar 13 14:35:13 crc kubenswrapper[4898]: I0313 14:35:13.126446 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pn2m4" Mar 13 14:35:13 crc kubenswrapper[4898]: I0313 
14:35:13.128445 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pn2m4" Mar 13 14:35:13 crc kubenswrapper[4898]: I0313 14:35:13.215281 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pn2m4" Mar 13 14:35:13 crc kubenswrapper[4898]: I0313 14:35:13.642072 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pn2m4" Mar 13 14:35:13 crc kubenswrapper[4898]: I0313 14:35:13.698968 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pn2m4"] Mar 13 14:35:15 crc kubenswrapper[4898]: I0313 14:35:15.600600 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pn2m4" podUID="f6bb9c39-7999-48d1-9223-d7408aa31f47" containerName="registry-server" containerID="cri-o://3a9399a0e65bba8a088956e07dc644ccc3d505a7a24e307c444048c5dcc01493" gracePeriod=2 Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.189832 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pn2m4" Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.350197 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kx6lb\" (UniqueName: \"kubernetes.io/projected/f6bb9c39-7999-48d1-9223-d7408aa31f47-kube-api-access-kx6lb\") pod \"f6bb9c39-7999-48d1-9223-d7408aa31f47\" (UID: \"f6bb9c39-7999-48d1-9223-d7408aa31f47\") " Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.351428 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6bb9c39-7999-48d1-9223-d7408aa31f47-utilities\") pod \"f6bb9c39-7999-48d1-9223-d7408aa31f47\" (UID: \"f6bb9c39-7999-48d1-9223-d7408aa31f47\") " Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.351458 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6bb9c39-7999-48d1-9223-d7408aa31f47-catalog-content\") pod \"f6bb9c39-7999-48d1-9223-d7408aa31f47\" (UID: \"f6bb9c39-7999-48d1-9223-d7408aa31f47\") " Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.352586 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6bb9c39-7999-48d1-9223-d7408aa31f47-utilities" (OuterVolumeSpecName: "utilities") pod "f6bb9c39-7999-48d1-9223-d7408aa31f47" (UID: "f6bb9c39-7999-48d1-9223-d7408aa31f47"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.359853 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6bb9c39-7999-48d1-9223-d7408aa31f47-kube-api-access-kx6lb" (OuterVolumeSpecName: "kube-api-access-kx6lb") pod "f6bb9c39-7999-48d1-9223-d7408aa31f47" (UID: "f6bb9c39-7999-48d1-9223-d7408aa31f47"). InnerVolumeSpecName "kube-api-access-kx6lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.423578 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6bb9c39-7999-48d1-9223-d7408aa31f47-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f6bb9c39-7999-48d1-9223-d7408aa31f47" (UID: "f6bb9c39-7999-48d1-9223-d7408aa31f47"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.454477 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kx6lb\" (UniqueName: \"kubernetes.io/projected/f6bb9c39-7999-48d1-9223-d7408aa31f47-kube-api-access-kx6lb\") on node \"crc\" DevicePath \"\"" Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.454506 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6bb9c39-7999-48d1-9223-d7408aa31f47-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.454516 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6bb9c39-7999-48d1-9223-d7408aa31f47-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.613306 4898 generic.go:334] "Generic (PLEG): container finished" podID="f6bb9c39-7999-48d1-9223-d7408aa31f47" containerID="3a9399a0e65bba8a088956e07dc644ccc3d505a7a24e307c444048c5dcc01493" exitCode=0 Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.613346 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pn2m4" event={"ID":"f6bb9c39-7999-48d1-9223-d7408aa31f47","Type":"ContainerDied","Data":"3a9399a0e65bba8a088956e07dc644ccc3d505a7a24e307c444048c5dcc01493"} Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.613372 4898 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-pn2m4" event={"ID":"f6bb9c39-7999-48d1-9223-d7408aa31f47","Type":"ContainerDied","Data":"bfcd0705440e9b0ea6531f51eb75c71426d7ce4348588ec527180e05d3f093f4"} Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.613388 4898 scope.go:117] "RemoveContainer" containerID="3a9399a0e65bba8a088956e07dc644ccc3d505a7a24e307c444048c5dcc01493" Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.613510 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pn2m4" Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.658269 4898 scope.go:117] "RemoveContainer" containerID="f831d93e0380c3eac2c80a54bbf98f48dd5744a6bf12651a02a2962fc6ee89ff" Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.660104 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pn2m4"] Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.671106 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pn2m4"] Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.692524 4898 scope.go:117] "RemoveContainer" containerID="6747046a2f3b217dc89e876d5e6f55535fc4c9de9dd5242237e2cb0cede37a73" Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.763480 4898 scope.go:117] "RemoveContainer" containerID="3a9399a0e65bba8a088956e07dc644ccc3d505a7a24e307c444048c5dcc01493" Mar 13 14:35:16 crc kubenswrapper[4898]: E0313 14:35:16.765102 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a9399a0e65bba8a088956e07dc644ccc3d505a7a24e307c444048c5dcc01493\": container with ID starting with 3a9399a0e65bba8a088956e07dc644ccc3d505a7a24e307c444048c5dcc01493 not found: ID does not exist" containerID="3a9399a0e65bba8a088956e07dc644ccc3d505a7a24e307c444048c5dcc01493" Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 
14:35:16.765141 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a9399a0e65bba8a088956e07dc644ccc3d505a7a24e307c444048c5dcc01493"} err="failed to get container status \"3a9399a0e65bba8a088956e07dc644ccc3d505a7a24e307c444048c5dcc01493\": rpc error: code = NotFound desc = could not find container \"3a9399a0e65bba8a088956e07dc644ccc3d505a7a24e307c444048c5dcc01493\": container with ID starting with 3a9399a0e65bba8a088956e07dc644ccc3d505a7a24e307c444048c5dcc01493 not found: ID does not exist" Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.765165 4898 scope.go:117] "RemoveContainer" containerID="f831d93e0380c3eac2c80a54bbf98f48dd5744a6bf12651a02a2962fc6ee89ff" Mar 13 14:35:16 crc kubenswrapper[4898]: E0313 14:35:16.765551 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f831d93e0380c3eac2c80a54bbf98f48dd5744a6bf12651a02a2962fc6ee89ff\": container with ID starting with f831d93e0380c3eac2c80a54bbf98f48dd5744a6bf12651a02a2962fc6ee89ff not found: ID does not exist" containerID="f831d93e0380c3eac2c80a54bbf98f48dd5744a6bf12651a02a2962fc6ee89ff" Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.765573 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f831d93e0380c3eac2c80a54bbf98f48dd5744a6bf12651a02a2962fc6ee89ff"} err="failed to get container status \"f831d93e0380c3eac2c80a54bbf98f48dd5744a6bf12651a02a2962fc6ee89ff\": rpc error: code = NotFound desc = could not find container \"f831d93e0380c3eac2c80a54bbf98f48dd5744a6bf12651a02a2962fc6ee89ff\": container with ID starting with f831d93e0380c3eac2c80a54bbf98f48dd5744a6bf12651a02a2962fc6ee89ff not found: ID does not exist" Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.765585 4898 scope.go:117] "RemoveContainer" containerID="6747046a2f3b217dc89e876d5e6f55535fc4c9de9dd5242237e2cb0cede37a73" Mar 13 14:35:16 crc 
kubenswrapper[4898]: E0313 14:35:16.766159 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6747046a2f3b217dc89e876d5e6f55535fc4c9de9dd5242237e2cb0cede37a73\": container with ID starting with 6747046a2f3b217dc89e876d5e6f55535fc4c9de9dd5242237e2cb0cede37a73 not found: ID does not exist" containerID="6747046a2f3b217dc89e876d5e6f55535fc4c9de9dd5242237e2cb0cede37a73" Mar 13 14:35:16 crc kubenswrapper[4898]: I0313 14:35:16.766181 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6747046a2f3b217dc89e876d5e6f55535fc4c9de9dd5242237e2cb0cede37a73"} err="failed to get container status \"6747046a2f3b217dc89e876d5e6f55535fc4c9de9dd5242237e2cb0cede37a73\": rpc error: code = NotFound desc = could not find container \"6747046a2f3b217dc89e876d5e6f55535fc4c9de9dd5242237e2cb0cede37a73\": container with ID starting with 6747046a2f3b217dc89e876d5e6f55535fc4c9de9dd5242237e2cb0cede37a73 not found: ID does not exist" Mar 13 14:35:17 crc kubenswrapper[4898]: I0313 14:35:17.762021 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6bb9c39-7999-48d1-9223-d7408aa31f47" path="/var/lib/kubelet/pods/f6bb9c39-7999-48d1-9223-d7408aa31f47/volumes" Mar 13 14:35:20 crc kubenswrapper[4898]: I0313 14:35:20.742065 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" Mar 13 14:35:20 crc kubenswrapper[4898]: E0313 14:35:20.742593 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:35:34 crc 
kubenswrapper[4898]: I0313 14:35:34.741254 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" Mar 13 14:35:34 crc kubenswrapper[4898]: E0313 14:35:34.742263 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:35:41 crc kubenswrapper[4898]: I0313 14:35:41.101450 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-kxtcf"] Mar 13 14:35:41 crc kubenswrapper[4898]: I0313 14:35:41.117967 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-kxtcf"] Mar 13 14:35:41 crc kubenswrapper[4898]: I0313 14:35:41.759201 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cd78a2a-1bb4-461a-92cd-d705080b087a" path="/var/lib/kubelet/pods/2cd78a2a-1bb4-461a-92cd-d705080b087a/volumes" Mar 13 14:35:49 crc kubenswrapper[4898]: I0313 14:35:49.739998 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" Mar 13 14:35:49 crc kubenswrapper[4898]: E0313 14:35:49.740776 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:35:52 crc kubenswrapper[4898]: I0313 14:35:52.072388 4898 generic.go:334] "Generic (PLEG): container 
finished" podID="8c6bec5a-faac-4793-8c18-9f5b2faf2c95" containerID="4d4d0c6bb15a7ffb8076e9638afa814524f427eb62b0c041389d30a596cbe573" exitCode=0 Mar 13 14:35:52 crc kubenswrapper[4898]: I0313 14:35:52.072447 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" event={"ID":"8c6bec5a-faac-4793-8c18-9f5b2faf2c95","Type":"ContainerDied","Data":"4d4d0c6bb15a7ffb8076e9638afa814524f427eb62b0c041389d30a596cbe573"} Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.618384 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.760684 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-telemetry-combined-ca-bundle\") pod \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.760766 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8wdb\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-kube-api-access-j8wdb\") pod \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.760819 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.762805 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-nova-combined-ca-bundle\") pod \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.762882 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-inventory\") pod \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.762946 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-ovn-combined-ca-bundle\") pod \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.762993 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-telemetry-power-monitoring-combined-ca-bundle\") pod \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.763041 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-repo-setup-combined-ca-bundle\") pod \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.763070 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.763156 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-neutron-metadata-combined-ca-bundle\") pod \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.763192 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-ovn-default-certs-0\") pod \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.763254 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.763302 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-bootstrap-combined-ca-bundle\") pod \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.763353 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.763440 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-libvirt-combined-ca-bundle\") pod \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.763466 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-ssh-key-openstack-edpm-ipam\") pod \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\" (UID: \"8c6bec5a-faac-4793-8c18-9f5b2faf2c95\") " Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.770040 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-kube-api-access-j8wdb" (OuterVolumeSpecName: "kube-api-access-j8wdb") pod "8c6bec5a-faac-4793-8c18-9f5b2faf2c95" (UID: "8c6bec5a-faac-4793-8c18-9f5b2faf2c95"). InnerVolumeSpecName "kube-api-access-j8wdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.770519 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0") pod "8c6bec5a-faac-4793-8c18-9f5b2faf2c95" (UID: "8c6bec5a-faac-4793-8c18-9f5b2faf2c95"). 
InnerVolumeSpecName "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.771714 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "8c6bec5a-faac-4793-8c18-9f5b2faf2c95" (UID: "8c6bec5a-faac-4793-8c18-9f5b2faf2c95"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.773385 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "8c6bec5a-faac-4793-8c18-9f5b2faf2c95" (UID: "8c6bec5a-faac-4793-8c18-9f5b2faf2c95"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.773926 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "8c6bec5a-faac-4793-8c18-9f5b2faf2c95" (UID: "8c6bec5a-faac-4793-8c18-9f5b2faf2c95"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.774016 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "8c6bec5a-faac-4793-8c18-9f5b2faf2c95" (UID: "8c6bec5a-faac-4793-8c18-9f5b2faf2c95"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.774077 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "8c6bec5a-faac-4793-8c18-9f5b2faf2c95" (UID: "8c6bec5a-faac-4793-8c18-9f5b2faf2c95"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.775564 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "8c6bec5a-faac-4793-8c18-9f5b2faf2c95" (UID: "8c6bec5a-faac-4793-8c18-9f5b2faf2c95"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.775716 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "8c6bec5a-faac-4793-8c18-9f5b2faf2c95" (UID: "8c6bec5a-faac-4793-8c18-9f5b2faf2c95"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.776755 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "8c6bec5a-faac-4793-8c18-9f5b2faf2c95" (UID: "8c6bec5a-faac-4793-8c18-9f5b2faf2c95"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.777181 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "8c6bec5a-faac-4793-8c18-9f5b2faf2c95" (UID: "8c6bec5a-faac-4793-8c18-9f5b2faf2c95"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.777436 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "8c6bec5a-faac-4793-8c18-9f5b2faf2c95" (UID: "8c6bec5a-faac-4793-8c18-9f5b2faf2c95"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.778143 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "8c6bec5a-faac-4793-8c18-9f5b2faf2c95" (UID: "8c6bec5a-faac-4793-8c18-9f5b2faf2c95"). 
InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.784919 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "8c6bec5a-faac-4793-8c18-9f5b2faf2c95" (UID: "8c6bec5a-faac-4793-8c18-9f5b2faf2c95"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.805185 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-inventory" (OuterVolumeSpecName: "inventory") pod "8c6bec5a-faac-4793-8c18-9f5b2faf2c95" (UID: "8c6bec5a-faac-4793-8c18-9f5b2faf2c95"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.806367 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8c6bec5a-faac-4793-8c18-9f5b2faf2c95" (UID: "8c6bec5a-faac-4793-8c18-9f5b2faf2c95"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.868508 4898 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.868564 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8wdb\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-kube-api-access-j8wdb\") on node \"crc\" DevicePath \"\"" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.868585 4898 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.868604 4898 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.868621 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.868634 4898 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.868649 4898 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.868667 4898 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.868682 4898 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.868699 4898 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.868712 4898 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.868728 4898 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.868742 4898 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.868755 4898 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.868766 4898 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:35:53 crc kubenswrapper[4898]: I0313 14:35:53.868780 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8c6bec5a-faac-4793-8c18-9f5b2faf2c95-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.102862 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" event={"ID":"8c6bec5a-faac-4793-8c18-9f5b2faf2c95","Type":"ContainerDied","Data":"0f88e54431807638c0cb8ce49f8e1de1c35418597acc9f02e4aaae31eeb717ac"} Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.102942 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f88e54431807638c0cb8ce49f8e1de1c35418597acc9f02e4aaae31eeb717ac" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.103080 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-v8np4" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.242981 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx"] Mar 13 14:35:54 crc kubenswrapper[4898]: E0313 14:35:54.244091 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6bb9c39-7999-48d1-9223-d7408aa31f47" containerName="extract-utilities" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.244116 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6bb9c39-7999-48d1-9223-d7408aa31f47" containerName="extract-utilities" Mar 13 14:35:54 crc kubenswrapper[4898]: E0313 14:35:54.244128 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6bb9c39-7999-48d1-9223-d7408aa31f47" containerName="extract-content" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.244135 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6bb9c39-7999-48d1-9223-d7408aa31f47" containerName="extract-content" Mar 13 14:35:54 crc kubenswrapper[4898]: E0313 14:35:54.244153 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6bb9c39-7999-48d1-9223-d7408aa31f47" containerName="registry-server" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.244161 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6bb9c39-7999-48d1-9223-d7408aa31f47" containerName="registry-server" Mar 13 14:35:54 crc kubenswrapper[4898]: E0313 14:35:54.244183 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c6bec5a-faac-4793-8c18-9f5b2faf2c95" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.244195 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c6bec5a-faac-4793-8c18-9f5b2faf2c95" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.244512 
4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6bb9c39-7999-48d1-9223-d7408aa31f47" containerName="registry-server" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.244531 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c6bec5a-faac-4793-8c18-9f5b2faf2c95" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.245820 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.248978 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.249323 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zsddr" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.249585 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.249866 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.251302 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.253572 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx"] Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.387370 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f7be15-746c-45be-92a1-2fa2a961f636-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6xdgx\" (UID: 
\"a9f7be15-746c-45be-92a1-2fa2a961f636\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.387451 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9f7be15-746c-45be-92a1-2fa2a961f636-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6xdgx\" (UID: \"a9f7be15-746c-45be-92a1-2fa2a961f636\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.387540 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26rhl\" (UniqueName: \"kubernetes.io/projected/a9f7be15-746c-45be-92a1-2fa2a961f636-kube-api-access-26rhl\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6xdgx\" (UID: \"a9f7be15-746c-45be-92a1-2fa2a961f636\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.387681 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a9f7be15-746c-45be-92a1-2fa2a961f636-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6xdgx\" (UID: \"a9f7be15-746c-45be-92a1-2fa2a961f636\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.387796 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a9f7be15-746c-45be-92a1-2fa2a961f636-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6xdgx\" (UID: \"a9f7be15-746c-45be-92a1-2fa2a961f636\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.491216 4898 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f7be15-746c-45be-92a1-2fa2a961f636-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6xdgx\" (UID: \"a9f7be15-746c-45be-92a1-2fa2a961f636\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.491418 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9f7be15-746c-45be-92a1-2fa2a961f636-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6xdgx\" (UID: \"a9f7be15-746c-45be-92a1-2fa2a961f636\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.491548 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26rhl\" (UniqueName: \"kubernetes.io/projected/a9f7be15-746c-45be-92a1-2fa2a961f636-kube-api-access-26rhl\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6xdgx\" (UID: \"a9f7be15-746c-45be-92a1-2fa2a961f636\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.491760 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a9f7be15-746c-45be-92a1-2fa2a961f636-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6xdgx\" (UID: \"a9f7be15-746c-45be-92a1-2fa2a961f636\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.491871 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a9f7be15-746c-45be-92a1-2fa2a961f636-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6xdgx\" (UID: 
\"a9f7be15-746c-45be-92a1-2fa2a961f636\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.493550 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a9f7be15-746c-45be-92a1-2fa2a961f636-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6xdgx\" (UID: \"a9f7be15-746c-45be-92a1-2fa2a961f636\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.497731 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f7be15-746c-45be-92a1-2fa2a961f636-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6xdgx\" (UID: \"a9f7be15-746c-45be-92a1-2fa2a961f636\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.498208 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9f7be15-746c-45be-92a1-2fa2a961f636-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6xdgx\" (UID: \"a9f7be15-746c-45be-92a1-2fa2a961f636\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.503330 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a9f7be15-746c-45be-92a1-2fa2a961f636-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6xdgx\" (UID: \"a9f7be15-746c-45be-92a1-2fa2a961f636\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.518674 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26rhl\" (UniqueName: 
\"kubernetes.io/projected/a9f7be15-746c-45be-92a1-2fa2a961f636-kube-api-access-26rhl\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-6xdgx\" (UID: \"a9f7be15-746c-45be-92a1-2fa2a961f636\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" Mar 13 14:35:54 crc kubenswrapper[4898]: I0313 14:35:54.568823 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" Mar 13 14:35:55 crc kubenswrapper[4898]: I0313 14:35:55.213938 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx"] Mar 13 14:35:55 crc kubenswrapper[4898]: W0313 14:35:55.226818 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9f7be15_746c_45be_92a1_2fa2a961f636.slice/crio-9b1eedb3314bba2694120e0c082f648d290dda603ff79d45db17c947105d1cd4 WatchSource:0}: Error finding container 9b1eedb3314bba2694120e0c082f648d290dda603ff79d45db17c947105d1cd4: Status 404 returned error can't find the container with id 9b1eedb3314bba2694120e0c082f648d290dda603ff79d45db17c947105d1cd4 Mar 13 14:35:56 crc kubenswrapper[4898]: I0313 14:35:56.137018 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" event={"ID":"a9f7be15-746c-45be-92a1-2fa2a961f636","Type":"ContainerStarted","Data":"0b3a8f0331d4cc53c2a293a3d7861b92f3ef327313fba53309af365c126af301"} Mar 13 14:35:56 crc kubenswrapper[4898]: I0313 14:35:56.137318 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" event={"ID":"a9f7be15-746c-45be-92a1-2fa2a961f636","Type":"ContainerStarted","Data":"9b1eedb3314bba2694120e0c082f648d290dda603ff79d45db17c947105d1cd4"} Mar 13 14:35:56 crc kubenswrapper[4898]: I0313 14:35:56.167199 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" podStartSLOduration=1.66787519 podStartE2EDuration="2.167174862s" podCreationTimestamp="2026-03-13 14:35:54 +0000 UTC" firstStartedPulling="2026-03-13 14:35:55.230632574 +0000 UTC m=+2390.232220823" lastFinishedPulling="2026-03-13 14:35:55.729932246 +0000 UTC m=+2390.731520495" observedRunningTime="2026-03-13 14:35:56.153937268 +0000 UTC m=+2391.155525517" watchObservedRunningTime="2026-03-13 14:35:56.167174862 +0000 UTC m=+2391.168763111" Mar 13 14:36:00 crc kubenswrapper[4898]: I0313 14:36:00.140460 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556876-j2sld"] Mar 13 14:36:00 crc kubenswrapper[4898]: I0313 14:36:00.142888 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556876-j2sld" Mar 13 14:36:00 crc kubenswrapper[4898]: I0313 14:36:00.147200 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:36:00 crc kubenswrapper[4898]: I0313 14:36:00.147371 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:36:00 crc kubenswrapper[4898]: I0313 14:36:00.147526 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:36:00 crc kubenswrapper[4898]: I0313 14:36:00.155137 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556876-j2sld"] Mar 13 14:36:00 crc kubenswrapper[4898]: I0313 14:36:00.184229 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxm4p\" (UniqueName: \"kubernetes.io/projected/e83b21e9-13bc-4f80-a228-126fbc98c8f6-kube-api-access-nxm4p\") pod \"auto-csr-approver-29556876-j2sld\" (UID: \"e83b21e9-13bc-4f80-a228-126fbc98c8f6\") " 
pod="openshift-infra/auto-csr-approver-29556876-j2sld" Mar 13 14:36:00 crc kubenswrapper[4898]: I0313 14:36:00.286654 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxm4p\" (UniqueName: \"kubernetes.io/projected/e83b21e9-13bc-4f80-a228-126fbc98c8f6-kube-api-access-nxm4p\") pod \"auto-csr-approver-29556876-j2sld\" (UID: \"e83b21e9-13bc-4f80-a228-126fbc98c8f6\") " pod="openshift-infra/auto-csr-approver-29556876-j2sld" Mar 13 14:36:00 crc kubenswrapper[4898]: I0313 14:36:00.311855 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxm4p\" (UniqueName: \"kubernetes.io/projected/e83b21e9-13bc-4f80-a228-126fbc98c8f6-kube-api-access-nxm4p\") pod \"auto-csr-approver-29556876-j2sld\" (UID: \"e83b21e9-13bc-4f80-a228-126fbc98c8f6\") " pod="openshift-infra/auto-csr-approver-29556876-j2sld" Mar 13 14:36:00 crc kubenswrapper[4898]: I0313 14:36:00.485063 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556876-j2sld" Mar 13 14:36:01 crc kubenswrapper[4898]: I0313 14:36:01.010296 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556876-j2sld"] Mar 13 14:36:01 crc kubenswrapper[4898]: W0313 14:36:01.018085 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode83b21e9_13bc_4f80_a228_126fbc98c8f6.slice/crio-f6d583d9ca13af4b6dadd70e08a52a1d4fd61e75828f367de6d9ea530df49b9c WatchSource:0}: Error finding container f6d583d9ca13af4b6dadd70e08a52a1d4fd61e75828f367de6d9ea530df49b9c: Status 404 returned error can't find the container with id f6d583d9ca13af4b6dadd70e08a52a1d4fd61e75828f367de6d9ea530df49b9c Mar 13 14:36:01 crc kubenswrapper[4898]: I0313 14:36:01.221515 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556876-j2sld" 
event={"ID":"e83b21e9-13bc-4f80-a228-126fbc98c8f6","Type":"ContainerStarted","Data":"f6d583d9ca13af4b6dadd70e08a52a1d4fd61e75828f367de6d9ea530df49b9c"} Mar 13 14:36:01 crc kubenswrapper[4898]: I0313 14:36:01.752117 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" Mar 13 14:36:01 crc kubenswrapper[4898]: E0313 14:36:01.755431 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:36:03 crc kubenswrapper[4898]: I0313 14:36:03.252227 4898 generic.go:334] "Generic (PLEG): container finished" podID="e83b21e9-13bc-4f80-a228-126fbc98c8f6" containerID="880ec0d7753626dc3ced87b5a1086a85612fac87d9f534c6f11457452f7a1041" exitCode=0 Mar 13 14:36:03 crc kubenswrapper[4898]: I0313 14:36:03.252597 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556876-j2sld" event={"ID":"e83b21e9-13bc-4f80-a228-126fbc98c8f6","Type":"ContainerDied","Data":"880ec0d7753626dc3ced87b5a1086a85612fac87d9f534c6f11457452f7a1041"} Mar 13 14:36:04 crc kubenswrapper[4898]: I0313 14:36:04.831197 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556876-j2sld" Mar 13 14:36:05 crc kubenswrapper[4898]: I0313 14:36:05.014269 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxm4p\" (UniqueName: \"kubernetes.io/projected/e83b21e9-13bc-4f80-a228-126fbc98c8f6-kube-api-access-nxm4p\") pod \"e83b21e9-13bc-4f80-a228-126fbc98c8f6\" (UID: \"e83b21e9-13bc-4f80-a228-126fbc98c8f6\") " Mar 13 14:36:05 crc kubenswrapper[4898]: I0313 14:36:05.021793 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e83b21e9-13bc-4f80-a228-126fbc98c8f6-kube-api-access-nxm4p" (OuterVolumeSpecName: "kube-api-access-nxm4p") pod "e83b21e9-13bc-4f80-a228-126fbc98c8f6" (UID: "e83b21e9-13bc-4f80-a228-126fbc98c8f6"). InnerVolumeSpecName "kube-api-access-nxm4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:36:05 crc kubenswrapper[4898]: I0313 14:36:05.117881 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxm4p\" (UniqueName: \"kubernetes.io/projected/e83b21e9-13bc-4f80-a228-126fbc98c8f6-kube-api-access-nxm4p\") on node \"crc\" DevicePath \"\"" Mar 13 14:36:05 crc kubenswrapper[4898]: I0313 14:36:05.279275 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556876-j2sld" event={"ID":"e83b21e9-13bc-4f80-a228-126fbc98c8f6","Type":"ContainerDied","Data":"f6d583d9ca13af4b6dadd70e08a52a1d4fd61e75828f367de6d9ea530df49b9c"} Mar 13 14:36:05 crc kubenswrapper[4898]: I0313 14:36:05.279595 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6d583d9ca13af4b6dadd70e08a52a1d4fd61e75828f367de6d9ea530df49b9c" Mar 13 14:36:05 crc kubenswrapper[4898]: I0313 14:36:05.279346 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556876-j2sld" Mar 13 14:36:05 crc kubenswrapper[4898]: I0313 14:36:05.922197 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556870-ndlzh"] Mar 13 14:36:05 crc kubenswrapper[4898]: I0313 14:36:05.932199 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556870-ndlzh"] Mar 13 14:36:07 crc kubenswrapper[4898]: I0313 14:36:07.766701 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4f21c0b-a6a1-4b44-ae38-4a382569154e" path="/var/lib/kubelet/pods/c4f21c0b-a6a1-4b44-ae38-4a382569154e/volumes" Mar 13 14:36:13 crc kubenswrapper[4898]: I0313 14:36:13.740495 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" Mar 13 14:36:13 crc kubenswrapper[4898]: E0313 14:36:13.741399 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:36:21 crc kubenswrapper[4898]: I0313 14:36:21.067275 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-rzjjp"] Mar 13 14:36:21 crc kubenswrapper[4898]: I0313 14:36:21.079653 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-rzjjp"] Mar 13 14:36:21 crc kubenswrapper[4898]: I0313 14:36:21.750283 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe" path="/var/lib/kubelet/pods/c0161ae1-f4ee-4688-bdb5-12eacf0dbcbe/volumes" Mar 13 14:36:24 crc kubenswrapper[4898]: I0313 14:36:24.740568 4898 scope.go:117] 
"RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" Mar 13 14:36:24 crc kubenswrapper[4898]: E0313 14:36:24.742516 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:36:37 crc kubenswrapper[4898]: I0313 14:36:37.740834 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" Mar 13 14:36:37 crc kubenswrapper[4898]: E0313 14:36:37.742457 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:36:40 crc kubenswrapper[4898]: I0313 14:36:40.479451 4898 scope.go:117] "RemoveContainer" containerID="797b7877d34540edd204a0c5f49e93f47ceac114fc9a4ba968964ef3cae04ffa" Mar 13 14:36:40 crc kubenswrapper[4898]: I0313 14:36:40.525957 4898 scope.go:117] "RemoveContainer" containerID="b1aa895f0022b3e7758a22e19e58e18bbca3560c415ba389688c7c2191911abd" Mar 13 14:36:40 crc kubenswrapper[4898]: I0313 14:36:40.594313 4898 scope.go:117] "RemoveContainer" containerID="86e66a360586b19f53ca12cefc2c560bd5016283db35bb1e56ee1d68892fd634" Mar 13 14:36:50 crc kubenswrapper[4898]: I0313 14:36:50.739746 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" Mar 13 14:36:50 crc 
kubenswrapper[4898]: E0313 14:36:50.742309 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:37:01 crc kubenswrapper[4898]: I0313 14:37:01.990429 4898 generic.go:334] "Generic (PLEG): container finished" podID="a9f7be15-746c-45be-92a1-2fa2a961f636" containerID="0b3a8f0331d4cc53c2a293a3d7861b92f3ef327313fba53309af365c126af301" exitCode=0 Mar 13 14:37:01 crc kubenswrapper[4898]: I0313 14:37:01.990544 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" event={"ID":"a9f7be15-746c-45be-92a1-2fa2a961f636","Type":"ContainerDied","Data":"0b3a8f0331d4cc53c2a293a3d7861b92f3ef327313fba53309af365c126af301"} Mar 13 14:37:03 crc kubenswrapper[4898]: I0313 14:37:03.519348 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" Mar 13 14:37:03 crc kubenswrapper[4898]: I0313 14:37:03.693867 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9f7be15-746c-45be-92a1-2fa2a961f636-inventory\") pod \"a9f7be15-746c-45be-92a1-2fa2a961f636\" (UID: \"a9f7be15-746c-45be-92a1-2fa2a961f636\") " Mar 13 14:37:03 crc kubenswrapper[4898]: I0313 14:37:03.694043 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f7be15-746c-45be-92a1-2fa2a961f636-ovn-combined-ca-bundle\") pod \"a9f7be15-746c-45be-92a1-2fa2a961f636\" (UID: \"a9f7be15-746c-45be-92a1-2fa2a961f636\") " Mar 13 14:37:03 crc kubenswrapper[4898]: I0313 14:37:03.694096 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a9f7be15-746c-45be-92a1-2fa2a961f636-ovncontroller-config-0\") pod \"a9f7be15-746c-45be-92a1-2fa2a961f636\" (UID: \"a9f7be15-746c-45be-92a1-2fa2a961f636\") " Mar 13 14:37:03 crc kubenswrapper[4898]: I0313 14:37:03.694144 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26rhl\" (UniqueName: \"kubernetes.io/projected/a9f7be15-746c-45be-92a1-2fa2a961f636-kube-api-access-26rhl\") pod \"a9f7be15-746c-45be-92a1-2fa2a961f636\" (UID: \"a9f7be15-746c-45be-92a1-2fa2a961f636\") " Mar 13 14:37:03 crc kubenswrapper[4898]: I0313 14:37:03.694262 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a9f7be15-746c-45be-92a1-2fa2a961f636-ssh-key-openstack-edpm-ipam\") pod \"a9f7be15-746c-45be-92a1-2fa2a961f636\" (UID: \"a9f7be15-746c-45be-92a1-2fa2a961f636\") " Mar 13 14:37:03 crc kubenswrapper[4898]: I0313 14:37:03.701024 4898 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9f7be15-746c-45be-92a1-2fa2a961f636-kube-api-access-26rhl" (OuterVolumeSpecName: "kube-api-access-26rhl") pod "a9f7be15-746c-45be-92a1-2fa2a961f636" (UID: "a9f7be15-746c-45be-92a1-2fa2a961f636"). InnerVolumeSpecName "kube-api-access-26rhl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:37:03 crc kubenswrapper[4898]: I0313 14:37:03.704283 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9f7be15-746c-45be-92a1-2fa2a961f636-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "a9f7be15-746c-45be-92a1-2fa2a961f636" (UID: "a9f7be15-746c-45be-92a1-2fa2a961f636"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:37:03 crc kubenswrapper[4898]: I0313 14:37:03.726746 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9f7be15-746c-45be-92a1-2fa2a961f636-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "a9f7be15-746c-45be-92a1-2fa2a961f636" (UID: "a9f7be15-746c-45be-92a1-2fa2a961f636"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:37:03 crc kubenswrapper[4898]: I0313 14:37:03.733504 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9f7be15-746c-45be-92a1-2fa2a961f636-inventory" (OuterVolumeSpecName: "inventory") pod "a9f7be15-746c-45be-92a1-2fa2a961f636" (UID: "a9f7be15-746c-45be-92a1-2fa2a961f636"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:37:03 crc kubenswrapper[4898]: I0313 14:37:03.753025 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9f7be15-746c-45be-92a1-2fa2a961f636-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a9f7be15-746c-45be-92a1-2fa2a961f636" (UID: "a9f7be15-746c-45be-92a1-2fa2a961f636"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:37:03 crc kubenswrapper[4898]: I0313 14:37:03.798507 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26rhl\" (UniqueName: \"kubernetes.io/projected/a9f7be15-746c-45be-92a1-2fa2a961f636-kube-api-access-26rhl\") on node \"crc\" DevicePath \"\"" Mar 13 14:37:03 crc kubenswrapper[4898]: I0313 14:37:03.798594 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a9f7be15-746c-45be-92a1-2fa2a961f636-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 14:37:03 crc kubenswrapper[4898]: I0313 14:37:03.798625 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a9f7be15-746c-45be-92a1-2fa2a961f636-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 14:37:03 crc kubenswrapper[4898]: I0313 14:37:03.798713 4898 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9f7be15-746c-45be-92a1-2fa2a961f636-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:37:03 crc kubenswrapper[4898]: I0313 14:37:03.798749 4898 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a9f7be15-746c-45be-92a1-2fa2a961f636-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.015621 4898 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" event={"ID":"a9f7be15-746c-45be-92a1-2fa2a961f636","Type":"ContainerDied","Data":"9b1eedb3314bba2694120e0c082f648d290dda603ff79d45db17c947105d1cd4"} Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.015662 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b1eedb3314bba2694120e0c082f648d290dda603ff79d45db17c947105d1cd4" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.016306 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-6xdgx" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.252741 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2"] Mar 13 14:37:04 crc kubenswrapper[4898]: E0313 14:37:04.254057 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e83b21e9-13bc-4f80-a228-126fbc98c8f6" containerName="oc" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.254084 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e83b21e9-13bc-4f80-a228-126fbc98c8f6" containerName="oc" Mar 13 14:37:04 crc kubenswrapper[4898]: E0313 14:37:04.254127 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9f7be15-746c-45be-92a1-2fa2a961f636" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.254135 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9f7be15-746c-45be-92a1-2fa2a961f636" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.254382 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e83b21e9-13bc-4f80-a228-126fbc98c8f6" containerName="oc" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.254407 4898 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="a9f7be15-746c-45be-92a1-2fa2a961f636" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.255248 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.258531 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zsddr" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.258633 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.259033 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.259200 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.259205 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.262470 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.298720 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2"] Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.413964 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.414415 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkwxx\" (UniqueName: \"kubernetes.io/projected/abb37cb2-ec06-4c96-882f-7781fbe053e0-kube-api-access-zkwxx\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.414464 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.414548 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.414579 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.414606 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.516707 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.516842 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkwxx\" (UniqueName: \"kubernetes.io/projected/abb37cb2-ec06-4c96-882f-7781fbe053e0-kube-api-access-zkwxx\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.516887 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 
14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.516959 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.516987 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.517015 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.521164 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.522525 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.522727 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.523224 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.527706 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.536691 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkwxx\" (UniqueName: 
\"kubernetes.io/projected/abb37cb2-ec06-4c96-882f-7781fbe053e0-kube-api-access-zkwxx\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.575233 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 14:37:04 crc kubenswrapper[4898]: I0313 14:37:04.740375 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" Mar 13 14:37:04 crc kubenswrapper[4898]: E0313 14:37:04.741165 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:37:05 crc kubenswrapper[4898]: I0313 14:37:05.124381 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2"] Mar 13 14:37:05 crc kubenswrapper[4898]: I0313 14:37:05.136247 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 14:37:06 crc kubenswrapper[4898]: I0313 14:37:06.040613 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" event={"ID":"abb37cb2-ec06-4c96-882f-7781fbe053e0","Type":"ContainerStarted","Data":"f6584ee1397c9a4b280cbe5d9477967c5edf6ee4ae6ac58ff2fdd8dbdfaa85a0"} Mar 13 14:37:06 crc kubenswrapper[4898]: I0313 14:37:06.041115 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" event={"ID":"abb37cb2-ec06-4c96-882f-7781fbe053e0","Type":"ContainerStarted","Data":"cb1aca5bdad3fc291c97dcbe66d830c038286152338aad152dcc9c7f0a0c4841"} Mar 13 14:37:06 crc kubenswrapper[4898]: I0313 14:37:06.072452 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" podStartSLOduration=1.6204208150000001 podStartE2EDuration="2.072431174s" podCreationTimestamp="2026-03-13 14:37:04 +0000 UTC" firstStartedPulling="2026-03-13 14:37:05.136011709 +0000 UTC m=+2460.137599948" lastFinishedPulling="2026-03-13 14:37:05.588022028 +0000 UTC m=+2460.589610307" observedRunningTime="2026-03-13 14:37:06.057506587 +0000 UTC m=+2461.059094826" watchObservedRunningTime="2026-03-13 14:37:06.072431174 +0000 UTC m=+2461.074019423" Mar 13 14:37:17 crc kubenswrapper[4898]: I0313 14:37:17.739789 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" Mar 13 14:37:17 crc kubenswrapper[4898]: E0313 14:37:17.740983 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:37:31 crc kubenswrapper[4898]: I0313 14:37:31.741172 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" Mar 13 14:37:31 crc kubenswrapper[4898]: E0313 14:37:31.743296 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:37:46 crc kubenswrapper[4898]: I0313 14:37:46.740224 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" Mar 13 14:37:46 crc kubenswrapper[4898]: E0313 14:37:46.741275 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:37:56 crc kubenswrapper[4898]: I0313 14:37:56.714845 4898 generic.go:334] "Generic (PLEG): container finished" podID="abb37cb2-ec06-4c96-882f-7781fbe053e0" containerID="f6584ee1397c9a4b280cbe5d9477967c5edf6ee4ae6ac58ff2fdd8dbdfaa85a0" exitCode=0 Mar 13 14:37:56 crc kubenswrapper[4898]: I0313 14:37:56.714977 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" event={"ID":"abb37cb2-ec06-4c96-882f-7781fbe053e0","Type":"ContainerDied","Data":"f6584ee1397c9a4b280cbe5d9477967c5edf6ee4ae6ac58ff2fdd8dbdfaa85a0"} Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.305304 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.407539 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkwxx\" (UniqueName: \"kubernetes.io/projected/abb37cb2-ec06-4c96-882f-7781fbe053e0-kube-api-access-zkwxx\") pod \"abb37cb2-ec06-4c96-882f-7781fbe053e0\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.408216 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"abb37cb2-ec06-4c96-882f-7781fbe053e0\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.408709 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-nova-metadata-neutron-config-0\") pod \"abb37cb2-ec06-4c96-882f-7781fbe053e0\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.408826 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-inventory\") pod \"abb37cb2-ec06-4c96-882f-7781fbe053e0\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.409002 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-neutron-metadata-combined-ca-bundle\") pod \"abb37cb2-ec06-4c96-882f-7781fbe053e0\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " Mar 
13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.409059 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-ssh-key-openstack-edpm-ipam\") pod \"abb37cb2-ec06-4c96-882f-7781fbe053e0\" (UID: \"abb37cb2-ec06-4c96-882f-7781fbe053e0\") " Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.413548 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abb37cb2-ec06-4c96-882f-7781fbe053e0-kube-api-access-zkwxx" (OuterVolumeSpecName: "kube-api-access-zkwxx") pod "abb37cb2-ec06-4c96-882f-7781fbe053e0" (UID: "abb37cb2-ec06-4c96-882f-7781fbe053e0"). InnerVolumeSpecName "kube-api-access-zkwxx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.417111 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "abb37cb2-ec06-4c96-882f-7781fbe053e0" (UID: "abb37cb2-ec06-4c96-882f-7781fbe053e0"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.447103 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "abb37cb2-ec06-4c96-882f-7781fbe053e0" (UID: "abb37cb2-ec06-4c96-882f-7781fbe053e0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.455713 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "abb37cb2-ec06-4c96-882f-7781fbe053e0" (UID: "abb37cb2-ec06-4c96-882f-7781fbe053e0"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.464722 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "abb37cb2-ec06-4c96-882f-7781fbe053e0" (UID: "abb37cb2-ec06-4c96-882f-7781fbe053e0"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.482939 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-inventory" (OuterVolumeSpecName: "inventory") pod "abb37cb2-ec06-4c96-882f-7781fbe053e0" (UID: "abb37cb2-ec06-4c96-882f-7781fbe053e0"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.513217 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.513258 4898 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.513278 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.513293 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkwxx\" (UniqueName: \"kubernetes.io/projected/abb37cb2-ec06-4c96-882f-7781fbe053e0-kube-api-access-zkwxx\") on node \"crc\" DevicePath \"\"" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.513306 4898 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.513319 4898 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/abb37cb2-ec06-4c96-882f-7781fbe053e0-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.741075 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" Mar 13 
14:37:58 crc kubenswrapper[4898]: E0313 14:37:58.741935 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.743793 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" event={"ID":"abb37cb2-ec06-4c96-882f-7781fbe053e0","Type":"ContainerDied","Data":"cb1aca5bdad3fc291c97dcbe66d830c038286152338aad152dcc9c7f0a0c4841"} Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.743845 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.743857 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb1aca5bdad3fc291c97dcbe66d830c038286152338aad152dcc9c7f0a0c4841" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.873927 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z"] Mar 13 14:37:58 crc kubenswrapper[4898]: E0313 14:37:58.874660 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abb37cb2-ec06-4c96-882f-7781fbe053e0" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.874694 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="abb37cb2-ec06-4c96-882f-7781fbe053e0" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.874960 4898 
memory_manager.go:354] "RemoveStaleState removing state" podUID="abb37cb2-ec06-4c96-882f-7781fbe053e0" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.875813 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.882400 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.882677 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.882729 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zsddr" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.882796 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.886497 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 14:37:58 crc kubenswrapper[4898]: I0313 14:37:58.887518 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z"] Mar 13 14:37:59 crc kubenswrapper[4898]: I0313 14:37:59.032180 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z\" (UID: \"226c01c4-d0f3-4784-8e93-36d1de6d593f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z" Mar 13 14:37:59 crc kubenswrapper[4898]: I0313 14:37:59.032319 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plnfv\" (UniqueName: \"kubernetes.io/projected/226c01c4-d0f3-4784-8e93-36d1de6d593f-kube-api-access-plnfv\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z\" (UID: \"226c01c4-d0f3-4784-8e93-36d1de6d593f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z" Mar 13 14:37:59 crc kubenswrapper[4898]: I0313 14:37:59.032596 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z\" (UID: \"226c01c4-d0f3-4784-8e93-36d1de6d593f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z" Mar 13 14:37:59 crc kubenswrapper[4898]: I0313 14:37:59.032657 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z\" (UID: \"226c01c4-d0f3-4784-8e93-36d1de6d593f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z" Mar 13 14:37:59 crc kubenswrapper[4898]: I0313 14:37:59.032720 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z\" (UID: \"226c01c4-d0f3-4784-8e93-36d1de6d593f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z" Mar 13 14:37:59 crc kubenswrapper[4898]: I0313 14:37:59.135828 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z\" (UID: \"226c01c4-d0f3-4784-8e93-36d1de6d593f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z" Mar 13 14:37:59 crc kubenswrapper[4898]: I0313 14:37:59.135891 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z\" (UID: \"226c01c4-d0f3-4784-8e93-36d1de6d593f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z" Mar 13 14:37:59 crc kubenswrapper[4898]: I0313 14:37:59.135951 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z\" (UID: \"226c01c4-d0f3-4784-8e93-36d1de6d593f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z" Mar 13 14:37:59 crc kubenswrapper[4898]: I0313 14:37:59.136088 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z\" (UID: \"226c01c4-d0f3-4784-8e93-36d1de6d593f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z" Mar 13 14:37:59 crc kubenswrapper[4898]: I0313 14:37:59.136135 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plnfv\" (UniqueName: \"kubernetes.io/projected/226c01c4-d0f3-4784-8e93-36d1de6d593f-kube-api-access-plnfv\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z\" (UID: \"226c01c4-d0f3-4784-8e93-36d1de6d593f\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z"
Mar 13 14:37:59 crc kubenswrapper[4898]: I0313 14:37:59.140812 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z\" (UID: \"226c01c4-d0f3-4784-8e93-36d1de6d593f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z"
Mar 13 14:37:59 crc kubenswrapper[4898]: I0313 14:37:59.140869 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z\" (UID: \"226c01c4-d0f3-4784-8e93-36d1de6d593f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z"
Mar 13 14:37:59 crc kubenswrapper[4898]: I0313 14:37:59.142972 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z\" (UID: \"226c01c4-d0f3-4784-8e93-36d1de6d593f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z"
Mar 13 14:37:59 crc kubenswrapper[4898]: I0313 14:37:59.153019 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z\" (UID: \"226c01c4-d0f3-4784-8e93-36d1de6d593f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z"
Mar 13 14:37:59 crc kubenswrapper[4898]: I0313 14:37:59.155167 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plnfv\" (UniqueName: \"kubernetes.io/projected/226c01c4-d0f3-4784-8e93-36d1de6d593f-kube-api-access-plnfv\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z\" (UID: \"226c01c4-d0f3-4784-8e93-36d1de6d593f\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z"
Mar 13 14:37:59 crc kubenswrapper[4898]: I0313 14:37:59.201670 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z"
Mar 13 14:37:59 crc kubenswrapper[4898]: W0313 14:37:59.858568 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod226c01c4_d0f3_4784_8e93_36d1de6d593f.slice/crio-c5ed48dbe017708bc1a827b7644836fd3073060755f704b693ca4fab0b30fb9b WatchSource:0}: Error finding container c5ed48dbe017708bc1a827b7644836fd3073060755f704b693ca4fab0b30fb9b: Status 404 returned error can't find the container with id c5ed48dbe017708bc1a827b7644836fd3073060755f704b693ca4fab0b30fb9b
Mar 13 14:37:59 crc kubenswrapper[4898]: I0313 14:37:59.869368 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z"]
Mar 13 14:38:00 crc kubenswrapper[4898]: I0313 14:38:00.153885 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556878-btqw2"]
Mar 13 14:38:00 crc kubenswrapper[4898]: I0313 14:38:00.157167 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556878-btqw2"
Mar 13 14:38:00 crc kubenswrapper[4898]: I0313 14:38:00.160852 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps"
Mar 13 14:38:00 crc kubenswrapper[4898]: I0313 14:38:00.161116 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 14:38:00 crc kubenswrapper[4898]: I0313 14:38:00.162942 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 14:38:00 crc kubenswrapper[4898]: I0313 14:38:00.188928 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556878-btqw2"]
Mar 13 14:38:00 crc kubenswrapper[4898]: I0313 14:38:00.271373 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9dd8\" (UniqueName: \"kubernetes.io/projected/39249464-ab82-4938-978e-2ffcbc637f4f-kube-api-access-b9dd8\") pod \"auto-csr-approver-29556878-btqw2\" (UID: \"39249464-ab82-4938-978e-2ffcbc637f4f\") " pod="openshift-infra/auto-csr-approver-29556878-btqw2"
Mar 13 14:38:00 crc kubenswrapper[4898]: I0313 14:38:00.374290 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9dd8\" (UniqueName: \"kubernetes.io/projected/39249464-ab82-4938-978e-2ffcbc637f4f-kube-api-access-b9dd8\") pod \"auto-csr-approver-29556878-btqw2\" (UID: \"39249464-ab82-4938-978e-2ffcbc637f4f\") " pod="openshift-infra/auto-csr-approver-29556878-btqw2"
Mar 13 14:38:00 crc kubenswrapper[4898]: I0313 14:38:00.402882 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9dd8\" (UniqueName: \"kubernetes.io/projected/39249464-ab82-4938-978e-2ffcbc637f4f-kube-api-access-b9dd8\") pod \"auto-csr-approver-29556878-btqw2\" (UID: \"39249464-ab82-4938-978e-2ffcbc637f4f\") " pod="openshift-infra/auto-csr-approver-29556878-btqw2"
Mar 13 14:38:00 crc kubenswrapper[4898]: I0313 14:38:00.508141 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556878-btqw2"
Mar 13 14:38:00 crc kubenswrapper[4898]: I0313 14:38:00.772841 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z" event={"ID":"226c01c4-d0f3-4784-8e93-36d1de6d593f","Type":"ContainerStarted","Data":"8d123c34cb14e74bcadc841aa33ca89cf1efe34ff10b94c6d0b5690f3c9b0353"}
Mar 13 14:38:00 crc kubenswrapper[4898]: I0313 14:38:00.773226 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z" event={"ID":"226c01c4-d0f3-4784-8e93-36d1de6d593f","Type":"ContainerStarted","Data":"c5ed48dbe017708bc1a827b7644836fd3073060755f704b693ca4fab0b30fb9b"}
Mar 13 14:38:00 crc kubenswrapper[4898]: I0313 14:38:00.793667 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z" podStartSLOduration=2.325259152 podStartE2EDuration="2.793648183s" podCreationTimestamp="2026-03-13 14:37:58 +0000 UTC" firstStartedPulling="2026-03-13 14:37:59.861634554 +0000 UTC m=+2514.863222823" lastFinishedPulling="2026-03-13 14:38:00.330023605 +0000 UTC m=+2515.331611854" observedRunningTime="2026-03-13 14:38:00.793224202 +0000 UTC m=+2515.794812461" watchObservedRunningTime="2026-03-13 14:38:00.793648183 +0000 UTC m=+2515.795236442"
Mar 13 14:38:01 crc kubenswrapper[4898]: I0313 14:38:01.049882 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556878-btqw2"]
Mar 13 14:38:01 crc kubenswrapper[4898]: W0313 14:38:01.054912 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39249464_ab82_4938_978e_2ffcbc637f4f.slice/crio-db760510b023c16dd3b8a56b9a79711f2e8f056c4dac065b95f3942e59ac8ca9 WatchSource:0}: Error finding container db760510b023c16dd3b8a56b9a79711f2e8f056c4dac065b95f3942e59ac8ca9: Status 404 returned error can't find the container with id db760510b023c16dd3b8a56b9a79711f2e8f056c4dac065b95f3942e59ac8ca9
Mar 13 14:38:01 crc kubenswrapper[4898]: I0313 14:38:01.790894 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556878-btqw2" event={"ID":"39249464-ab82-4938-978e-2ffcbc637f4f","Type":"ContainerStarted","Data":"db760510b023c16dd3b8a56b9a79711f2e8f056c4dac065b95f3942e59ac8ca9"}
Mar 13 14:38:02 crc kubenswrapper[4898]: I0313 14:38:02.811040 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556878-btqw2" event={"ID":"39249464-ab82-4938-978e-2ffcbc637f4f","Type":"ContainerStarted","Data":"9b860152e27ff03f1039fc3f0a1f9cf3aa08903cd38847b8eba8da6dc52d2b6e"}
Mar 13 14:38:02 crc kubenswrapper[4898]: I0313 14:38:02.841602 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556878-btqw2" podStartSLOduration=1.720811839 podStartE2EDuration="2.841578899s" podCreationTimestamp="2026-03-13 14:38:00 +0000 UTC" firstStartedPulling="2026-03-13 14:38:01.057891348 +0000 UTC m=+2516.059479597" lastFinishedPulling="2026-03-13 14:38:02.178658378 +0000 UTC m=+2517.180246657" observedRunningTime="2026-03-13 14:38:02.834796715 +0000 UTC m=+2517.836384984" watchObservedRunningTime="2026-03-13 14:38:02.841578899 +0000 UTC m=+2517.843167148"
Mar 13 14:38:03 crc kubenswrapper[4898]: I0313 14:38:03.824701 4898 generic.go:334] "Generic (PLEG): container finished" podID="39249464-ab82-4938-978e-2ffcbc637f4f" containerID="9b860152e27ff03f1039fc3f0a1f9cf3aa08903cd38847b8eba8da6dc52d2b6e" exitCode=0
Mar 13 14:38:03 crc kubenswrapper[4898]: I0313 14:38:03.824815 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556878-btqw2" event={"ID":"39249464-ab82-4938-978e-2ffcbc637f4f","Type":"ContainerDied","Data":"9b860152e27ff03f1039fc3f0a1f9cf3aa08903cd38847b8eba8da6dc52d2b6e"}
Mar 13 14:38:05 crc kubenswrapper[4898]: I0313 14:38:05.313472 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556878-btqw2"
Mar 13 14:38:05 crc kubenswrapper[4898]: I0313 14:38:05.414119 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9dd8\" (UniqueName: \"kubernetes.io/projected/39249464-ab82-4938-978e-2ffcbc637f4f-kube-api-access-b9dd8\") pod \"39249464-ab82-4938-978e-2ffcbc637f4f\" (UID: \"39249464-ab82-4938-978e-2ffcbc637f4f\") "
Mar 13 14:38:05 crc kubenswrapper[4898]: I0313 14:38:05.421212 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39249464-ab82-4938-978e-2ffcbc637f4f-kube-api-access-b9dd8" (OuterVolumeSpecName: "kube-api-access-b9dd8") pod "39249464-ab82-4938-978e-2ffcbc637f4f" (UID: "39249464-ab82-4938-978e-2ffcbc637f4f"). InnerVolumeSpecName "kube-api-access-b9dd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:38:05 crc kubenswrapper[4898]: I0313 14:38:05.518564 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9dd8\" (UniqueName: \"kubernetes.io/projected/39249464-ab82-4938-978e-2ffcbc637f4f-kube-api-access-b9dd8\") on node \"crc\" DevicePath \"\""
Mar 13 14:38:05 crc kubenswrapper[4898]: I0313 14:38:05.887472 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556878-btqw2" event={"ID":"39249464-ab82-4938-978e-2ffcbc637f4f","Type":"ContainerDied","Data":"db760510b023c16dd3b8a56b9a79711f2e8f056c4dac065b95f3942e59ac8ca9"}
Mar 13 14:38:05 crc kubenswrapper[4898]: I0313 14:38:05.887724 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db760510b023c16dd3b8a56b9a79711f2e8f056c4dac065b95f3942e59ac8ca9"
Mar 13 14:38:05 crc kubenswrapper[4898]: I0313 14:38:05.887557 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556878-btqw2"
Mar 13 14:38:05 crc kubenswrapper[4898]: I0313 14:38:05.916053 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556872-2tqn4"]
Mar 13 14:38:05 crc kubenswrapper[4898]: I0313 14:38:05.929214 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556872-2tqn4"]
Mar 13 14:38:07 crc kubenswrapper[4898]: I0313 14:38:07.761850 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8627002c-751e-4168-b294-4a324890a996" path="/var/lib/kubelet/pods/8627002c-751e-4168-b294-4a324890a996/volumes"
Mar 13 14:38:13 crc kubenswrapper[4898]: I0313 14:38:13.739616 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db"
Mar 13 14:38:13 crc kubenswrapper[4898]: E0313 14:38:13.740579 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d"
Mar 13 14:38:27 crc kubenswrapper[4898]: I0313 14:38:27.741517 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db"
Mar 13 14:38:27 crc kubenswrapper[4898]: E0313 14:38:27.743549 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d"
Mar 13 14:38:39 crc kubenswrapper[4898]: I0313 14:38:39.740729 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db"
Mar 13 14:38:39 crc kubenswrapper[4898]: E0313 14:38:39.742169 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d"
Mar 13 14:38:40 crc kubenswrapper[4898]: I0313 14:38:40.815140 4898 scope.go:117] "RemoveContainer" containerID="9918c054f17d0c467592f1c4b30fc11e333ea544aa38b84c1aab31d2beff7c97"
Mar 13 14:38:50 crc kubenswrapper[4898]: I0313 14:38:50.740582 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db"
Mar 13 14:38:51 crc kubenswrapper[4898]: I0313 14:38:51.525802 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"610faaa089f666c18409ed2be816ac54810449ed44b96f98a2303a40ac9c9836"}
Mar 13 14:40:00 crc kubenswrapper[4898]: I0313 14:40:00.160599 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556880-2rlcl"]
Mar 13 14:40:00 crc kubenswrapper[4898]: E0313 14:40:00.164812 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39249464-ab82-4938-978e-2ffcbc637f4f" containerName="oc"
Mar 13 14:40:00 crc kubenswrapper[4898]: I0313 14:40:00.165335 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="39249464-ab82-4938-978e-2ffcbc637f4f" containerName="oc"
Mar 13 14:40:00 crc kubenswrapper[4898]: I0313 14:40:00.166341 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="39249464-ab82-4938-978e-2ffcbc637f4f" containerName="oc"
Mar 13 14:40:00 crc kubenswrapper[4898]: I0313 14:40:00.168011 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556880-2rlcl"
Mar 13 14:40:00 crc kubenswrapper[4898]: I0313 14:40:00.171153 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 14:40:00 crc kubenswrapper[4898]: I0313 14:40:00.171541 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps"
Mar 13 14:40:00 crc kubenswrapper[4898]: I0313 14:40:00.174118 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556880-2rlcl"]
Mar 13 14:40:00 crc kubenswrapper[4898]: I0313 14:40:00.174362 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 14:40:00 crc kubenswrapper[4898]: I0313 14:40:00.189459 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7f2h\" (UniqueName: \"kubernetes.io/projected/23d4e2ed-7457-458a-9c76-dcf8f3aadd99-kube-api-access-v7f2h\") pod \"auto-csr-approver-29556880-2rlcl\" (UID: \"23d4e2ed-7457-458a-9c76-dcf8f3aadd99\") " pod="openshift-infra/auto-csr-approver-29556880-2rlcl"
Mar 13 14:40:00 crc kubenswrapper[4898]: I0313 14:40:00.291618 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7f2h\" (UniqueName: \"kubernetes.io/projected/23d4e2ed-7457-458a-9c76-dcf8f3aadd99-kube-api-access-v7f2h\") pod \"auto-csr-approver-29556880-2rlcl\" (UID: \"23d4e2ed-7457-458a-9c76-dcf8f3aadd99\") " pod="openshift-infra/auto-csr-approver-29556880-2rlcl"
Mar 13 14:40:00 crc kubenswrapper[4898]: I0313 14:40:00.333221 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7f2h\" (UniqueName: \"kubernetes.io/projected/23d4e2ed-7457-458a-9c76-dcf8f3aadd99-kube-api-access-v7f2h\") pod \"auto-csr-approver-29556880-2rlcl\" (UID: \"23d4e2ed-7457-458a-9c76-dcf8f3aadd99\") " pod="openshift-infra/auto-csr-approver-29556880-2rlcl"
Mar 13 14:40:00 crc kubenswrapper[4898]: I0313 14:40:00.512974 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556880-2rlcl"
Mar 13 14:40:01 crc kubenswrapper[4898]: I0313 14:40:01.034989 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556880-2rlcl"]
Mar 13 14:40:01 crc kubenswrapper[4898]: W0313 14:40:01.036849 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23d4e2ed_7457_458a_9c76_dcf8f3aadd99.slice/crio-330d3b30cd6ab021edffafbc6168be6383771cabc816664fe7fe8a93e3cf47c4 WatchSource:0}: Error finding container 330d3b30cd6ab021edffafbc6168be6383771cabc816664fe7fe8a93e3cf47c4: Status 404 returned error can't find the container with id 330d3b30cd6ab021edffafbc6168be6383771cabc816664fe7fe8a93e3cf47c4
Mar 13 14:40:01 crc kubenswrapper[4898]: I0313 14:40:01.459297 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556880-2rlcl" event={"ID":"23d4e2ed-7457-458a-9c76-dcf8f3aadd99","Type":"ContainerStarted","Data":"330d3b30cd6ab021edffafbc6168be6383771cabc816664fe7fe8a93e3cf47c4"}
Mar 13 14:40:03 crc kubenswrapper[4898]: I0313 14:40:03.486481 4898 generic.go:334] "Generic (PLEG): container finished" podID="23d4e2ed-7457-458a-9c76-dcf8f3aadd99" containerID="a4b672dd5f62f7db5f72a2ba461417e4b17e1ad5affad388a08bfa992e5aa45e" exitCode=0
Mar 13 14:40:03 crc kubenswrapper[4898]: I0313 14:40:03.486556 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556880-2rlcl" event={"ID":"23d4e2ed-7457-458a-9c76-dcf8f3aadd99","Type":"ContainerDied","Data":"a4b672dd5f62f7db5f72a2ba461417e4b17e1ad5affad388a08bfa992e5aa45e"}
Mar 13 14:40:04 crc kubenswrapper[4898]: I0313 14:40:04.965209 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556880-2rlcl"
Mar 13 14:40:05 crc kubenswrapper[4898]: I0313 14:40:05.133959 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7f2h\" (UniqueName: \"kubernetes.io/projected/23d4e2ed-7457-458a-9c76-dcf8f3aadd99-kube-api-access-v7f2h\") pod \"23d4e2ed-7457-458a-9c76-dcf8f3aadd99\" (UID: \"23d4e2ed-7457-458a-9c76-dcf8f3aadd99\") "
Mar 13 14:40:05 crc kubenswrapper[4898]: I0313 14:40:05.155295 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23d4e2ed-7457-458a-9c76-dcf8f3aadd99-kube-api-access-v7f2h" (OuterVolumeSpecName: "kube-api-access-v7f2h") pod "23d4e2ed-7457-458a-9c76-dcf8f3aadd99" (UID: "23d4e2ed-7457-458a-9c76-dcf8f3aadd99"). InnerVolumeSpecName "kube-api-access-v7f2h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:40:05 crc kubenswrapper[4898]: I0313 14:40:05.238210 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7f2h\" (UniqueName: \"kubernetes.io/projected/23d4e2ed-7457-458a-9c76-dcf8f3aadd99-kube-api-access-v7f2h\") on node \"crc\" DevicePath \"\""
Mar 13 14:40:05 crc kubenswrapper[4898]: I0313 14:40:05.520501 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556880-2rlcl" event={"ID":"23d4e2ed-7457-458a-9c76-dcf8f3aadd99","Type":"ContainerDied","Data":"330d3b30cd6ab021edffafbc6168be6383771cabc816664fe7fe8a93e3cf47c4"}
Mar 13 14:40:05 crc kubenswrapper[4898]: I0313 14:40:05.520541 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="330d3b30cd6ab021edffafbc6168be6383771cabc816664fe7fe8a93e3cf47c4"
Mar 13 14:40:05 crc kubenswrapper[4898]: I0313 14:40:05.520596 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556880-2rlcl"
Mar 13 14:40:06 crc kubenswrapper[4898]: I0313 14:40:06.064364 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556874-5w2n9"]
Mar 13 14:40:06 crc kubenswrapper[4898]: I0313 14:40:06.072938 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556874-5w2n9"]
Mar 13 14:40:07 crc kubenswrapper[4898]: I0313 14:40:07.790257 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb068c44-8492-4ed4-973b-f1233d9db645" path="/var/lib/kubelet/pods/eb068c44-8492-4ed4-973b-f1233d9db645/volumes"
Mar 13 14:40:40 crc kubenswrapper[4898]: I0313 14:40:40.953459 4898 scope.go:117] "RemoveContainer" containerID="408d12ea5e972e9b868ac62eb57c8cf1a207c59938c5862250b310c3d0d4947f"
Mar 13 14:41:19 crc kubenswrapper[4898]: I0313 14:41:19.134754 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 14:41:19 crc kubenswrapper[4898]: I0313 14:41:19.135559 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 14:41:49 crc kubenswrapper[4898]: I0313 14:41:49.134713 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 14:41:49 crc kubenswrapper[4898]: I0313 14:41:49.135426 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 14:41:51 crc kubenswrapper[4898]: I0313 14:41:51.998717 4898 generic.go:334] "Generic (PLEG): container finished" podID="226c01c4-d0f3-4784-8e93-36d1de6d593f" containerID="8d123c34cb14e74bcadc841aa33ca89cf1efe34ff10b94c6d0b5690f3c9b0353" exitCode=0
Mar 13 14:41:51 crc kubenswrapper[4898]: I0313 14:41:51.998773 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z" event={"ID":"226c01c4-d0f3-4784-8e93-36d1de6d593f","Type":"ContainerDied","Data":"8d123c34cb14e74bcadc841aa33ca89cf1efe34ff10b94c6d0b5690f3c9b0353"}
Mar 13 14:41:53 crc kubenswrapper[4898]: I0313 14:41:53.623123 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z"
Mar 13 14:41:53 crc kubenswrapper[4898]: I0313 14:41:53.652841 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-libvirt-secret-0\") pod \"226c01c4-d0f3-4784-8e93-36d1de6d593f\" (UID: \"226c01c4-d0f3-4784-8e93-36d1de6d593f\") "
Mar 13 14:41:53 crc kubenswrapper[4898]: I0313 14:41:53.652974 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-inventory\") pod \"226c01c4-d0f3-4784-8e93-36d1de6d593f\" (UID: \"226c01c4-d0f3-4784-8e93-36d1de6d593f\") "
Mar 13 14:41:53 crc kubenswrapper[4898]: I0313 14:41:53.653146 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-libvirt-combined-ca-bundle\") pod \"226c01c4-d0f3-4784-8e93-36d1de6d593f\" (UID: \"226c01c4-d0f3-4784-8e93-36d1de6d593f\") "
Mar 13 14:41:53 crc kubenswrapper[4898]: I0313 14:41:53.653236 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plnfv\" (UniqueName: \"kubernetes.io/projected/226c01c4-d0f3-4784-8e93-36d1de6d593f-kube-api-access-plnfv\") pod \"226c01c4-d0f3-4784-8e93-36d1de6d593f\" (UID: \"226c01c4-d0f3-4784-8e93-36d1de6d593f\") "
Mar 13 14:41:53 crc kubenswrapper[4898]: I0313 14:41:53.653322 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-ssh-key-openstack-edpm-ipam\") pod \"226c01c4-d0f3-4784-8e93-36d1de6d593f\" (UID: \"226c01c4-d0f3-4784-8e93-36d1de6d593f\") "
Mar 13 14:41:53 crc kubenswrapper[4898]: I0313 14:41:53.676059 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "226c01c4-d0f3-4784-8e93-36d1de6d593f" (UID: "226c01c4-d0f3-4784-8e93-36d1de6d593f"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:41:53 crc kubenswrapper[4898]: I0313 14:41:53.687177 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/226c01c4-d0f3-4784-8e93-36d1de6d593f-kube-api-access-plnfv" (OuterVolumeSpecName: "kube-api-access-plnfv") pod "226c01c4-d0f3-4784-8e93-36d1de6d593f" (UID: "226c01c4-d0f3-4784-8e93-36d1de6d593f"). InnerVolumeSpecName "kube-api-access-plnfv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 14:41:53 crc kubenswrapper[4898]: I0313 14:41:53.731763 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "226c01c4-d0f3-4784-8e93-36d1de6d593f" (UID: "226c01c4-d0f3-4784-8e93-36d1de6d593f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:41:53 crc kubenswrapper[4898]: E0313 14:41:53.737361 4898 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-inventory podName:226c01c4-d0f3-4784-8e93-36d1de6d593f nodeName:}" failed. No retries permitted until 2026-03-13 14:41:54.237327174 +0000 UTC m=+2749.238915433 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "inventory" (UniqueName: "kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-inventory") pod "226c01c4-d0f3-4784-8e93-36d1de6d593f" (UID: "226c01c4-d0f3-4784-8e93-36d1de6d593f") : error deleting /var/lib/kubelet/pods/226c01c4-d0f3-4784-8e93-36d1de6d593f/volume-subpaths: remove /var/lib/kubelet/pods/226c01c4-d0f3-4784-8e93-36d1de6d593f/volume-subpaths: no such file or directory
Mar 13 14:41:53 crc kubenswrapper[4898]: I0313 14:41:53.741648 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "226c01c4-d0f3-4784-8e93-36d1de6d593f" (UID: "226c01c4-d0f3-4784-8e93-36d1de6d593f"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 14:41:53 crc kubenswrapper[4898]: I0313 14:41:53.756391 4898 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-libvirt-secret-0\") on node \"crc\" DevicePath \"\""
Mar 13 14:41:53 crc kubenswrapper[4898]: I0313 14:41:53.756423 4898 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 13 14:41:53 crc kubenswrapper[4898]: I0313 14:41:53.756441 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plnfv\" (UniqueName: \"kubernetes.io/projected/226c01c4-d0f3-4784-8e93-36d1de6d593f-kube-api-access-plnfv\") on node \"crc\" DevicePath \"\""
Mar 13 14:41:53 crc kubenswrapper[4898]: I0313 14:41:53.756452 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.026494 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z" event={"ID":"226c01c4-d0f3-4784-8e93-36d1de6d593f","Type":"ContainerDied","Data":"c5ed48dbe017708bc1a827b7644836fd3073060755f704b693ca4fab0b30fb9b"}
Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.026550 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5ed48dbe017708bc1a827b7644836fd3073060755f704b693ca4fab0b30fb9b"
Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.026630 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z"
Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.133141 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg"]
Mar 13 14:41:54 crc kubenswrapper[4898]: E0313 14:41:54.133585 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="226c01c4-d0f3-4784-8e93-36d1de6d593f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.133602 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="226c01c4-d0f3-4784-8e93-36d1de6d593f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Mar 13 14:41:54 crc kubenswrapper[4898]: E0313 14:41:54.133637 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23d4e2ed-7457-458a-9c76-dcf8f3aadd99" containerName="oc"
Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.133645 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="23d4e2ed-7457-458a-9c76-dcf8f3aadd99" containerName="oc"
Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.133851 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="23d4e2ed-7457-458a-9c76-dcf8f3aadd99" containerName="oc"
Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.133866 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="226c01c4-d0f3-4784-8e93-36d1de6d593f" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.134628 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg"
Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.137463 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config"
Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.137471 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key"
Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.137468 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config"
Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.166996 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg"
Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.167155 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg"
Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.167215 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg"
Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.167284 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwz9x\" (UniqueName: \"kubernetes.io/projected/acaa3912-3e27-4272-8e4a-3ab67fd34b92-kube-api-access-kwz9x\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg"
Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.167323 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg"
Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.167367 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg"
Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.167463 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg"
Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.167567 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg"
Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.167607 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg"
Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.167647 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg"
Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.167715 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg"
Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.182145 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg"]
Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.269547 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-inventory\") pod \"226c01c4-d0f3-4784-8e93-36d1de6d593f\" (UID: \"226c01c4-d0f3-4784-8e93-36d1de6d593f\") "
Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.270003 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg"
Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.270058 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg"
Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.270080 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg"
Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.270099 4898
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwz9x\" (UniqueName: \"kubernetes.io/projected/acaa3912-3e27-4272-8e4a-3ab67fd34b92-kube-api-access-kwz9x\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.270119 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.270136 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.270176 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.270217 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-2\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.270233 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.270251 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.270284 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.271538 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.273314 4898 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.273547 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.273712 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.275051 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.275089 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: 
\"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.275105 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.276264 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.276334 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.278381 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.282950 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-inventory" (OuterVolumeSpecName: "inventory") pod "226c01c4-d0f3-4784-8e93-36d1de6d593f" (UID: "226c01c4-d0f3-4784-8e93-36d1de6d593f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.286756 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwz9x\" (UniqueName: \"kubernetes.io/projected/acaa3912-3e27-4272-8e4a-3ab67fd34b92-kube-api-access-kwz9x\") pod \"nova-edpm-deployment-openstack-edpm-ipam-28xpg\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.372501 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/226c01c4-d0f3-4784-8e93-36d1de6d593f-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 14:41:54 crc kubenswrapper[4898]: I0313 14:41:54.492660 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:41:55 crc kubenswrapper[4898]: I0313 14:41:55.059036 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg"] Mar 13 14:41:56 crc kubenswrapper[4898]: I0313 14:41:56.049731 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" event={"ID":"acaa3912-3e27-4272-8e4a-3ab67fd34b92","Type":"ContainerStarted","Data":"70859fbfeb0fb1276f0e4311e36fe620c68c8fac7970aae8138f23a5b9f896be"} Mar 13 14:41:56 crc kubenswrapper[4898]: I0313 14:41:56.050124 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" event={"ID":"acaa3912-3e27-4272-8e4a-3ab67fd34b92","Type":"ContainerStarted","Data":"2be41a54a106745535a45aba88fa9b87d13a15b26a46e0adacc9d9b51e76bede"} Mar 13 14:41:56 crc kubenswrapper[4898]: I0313 14:41:56.086140 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" podStartSLOduration=1.495427 podStartE2EDuration="2.086120322s" podCreationTimestamp="2026-03-13 14:41:54 +0000 UTC" firstStartedPulling="2026-03-13 14:41:55.066346506 +0000 UTC m=+2750.067934755" lastFinishedPulling="2026-03-13 14:41:55.657039838 +0000 UTC m=+2750.658628077" observedRunningTime="2026-03-13 14:41:56.08447156 +0000 UTC m=+2751.086059839" watchObservedRunningTime="2026-03-13 14:41:56.086120322 +0000 UTC m=+2751.087708571" Mar 13 14:42:00 crc kubenswrapper[4898]: I0313 14:42:00.141628 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556882-6vdgp"] Mar 13 14:42:00 crc kubenswrapper[4898]: I0313 14:42:00.143774 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556882-6vdgp" Mar 13 14:42:00 crc kubenswrapper[4898]: I0313 14:42:00.147877 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:42:00 crc kubenswrapper[4898]: I0313 14:42:00.147959 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:42:00 crc kubenswrapper[4898]: I0313 14:42:00.147909 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:42:00 crc kubenswrapper[4898]: I0313 14:42:00.152021 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556882-6vdgp"] Mar 13 14:42:00 crc kubenswrapper[4898]: I0313 14:42:00.315146 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6ljs\" (UniqueName: \"kubernetes.io/projected/46dff21f-c9aa-443a-b1c7-988721788744-kube-api-access-g6ljs\") pod \"auto-csr-approver-29556882-6vdgp\" (UID: \"46dff21f-c9aa-443a-b1c7-988721788744\") " pod="openshift-infra/auto-csr-approver-29556882-6vdgp" Mar 13 14:42:00 crc kubenswrapper[4898]: I0313 14:42:00.417385 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6ljs\" (UniqueName: \"kubernetes.io/projected/46dff21f-c9aa-443a-b1c7-988721788744-kube-api-access-g6ljs\") pod \"auto-csr-approver-29556882-6vdgp\" (UID: \"46dff21f-c9aa-443a-b1c7-988721788744\") " pod="openshift-infra/auto-csr-approver-29556882-6vdgp" Mar 13 14:42:00 crc kubenswrapper[4898]: I0313 14:42:00.435522 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6ljs\" (UniqueName: \"kubernetes.io/projected/46dff21f-c9aa-443a-b1c7-988721788744-kube-api-access-g6ljs\") pod \"auto-csr-approver-29556882-6vdgp\" (UID: \"46dff21f-c9aa-443a-b1c7-988721788744\") " 
pod="openshift-infra/auto-csr-approver-29556882-6vdgp" Mar 13 14:42:00 crc kubenswrapper[4898]: I0313 14:42:00.467041 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556882-6vdgp" Mar 13 14:42:00 crc kubenswrapper[4898]: I0313 14:42:00.939225 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556882-6vdgp"] Mar 13 14:42:01 crc kubenswrapper[4898]: I0313 14:42:01.099824 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556882-6vdgp" event={"ID":"46dff21f-c9aa-443a-b1c7-988721788744","Type":"ContainerStarted","Data":"226ecfa378c63a7ac9d79f6bc968ebd3e31e06dd97440f7c704521eaec1c25eb"} Mar 13 14:42:03 crc kubenswrapper[4898]: I0313 14:42:03.135924 4898 generic.go:334] "Generic (PLEG): container finished" podID="46dff21f-c9aa-443a-b1c7-988721788744" containerID="5dfdf7dc37e2c03d23dcf11c2bda6721f5e0189a55bee1c413509ed3a8808306" exitCode=0 Mar 13 14:42:03 crc kubenswrapper[4898]: I0313 14:42:03.136310 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556882-6vdgp" event={"ID":"46dff21f-c9aa-443a-b1c7-988721788744","Type":"ContainerDied","Data":"5dfdf7dc37e2c03d23dcf11c2bda6721f5e0189a55bee1c413509ed3a8808306"} Mar 13 14:42:05 crc kubenswrapper[4898]: I0313 14:42:05.610294 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556882-6vdgp" Mar 13 14:42:05 crc kubenswrapper[4898]: I0313 14:42:05.678585 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6ljs\" (UniqueName: \"kubernetes.io/projected/46dff21f-c9aa-443a-b1c7-988721788744-kube-api-access-g6ljs\") pod \"46dff21f-c9aa-443a-b1c7-988721788744\" (UID: \"46dff21f-c9aa-443a-b1c7-988721788744\") " Mar 13 14:42:05 crc kubenswrapper[4898]: I0313 14:42:05.684545 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46dff21f-c9aa-443a-b1c7-988721788744-kube-api-access-g6ljs" (OuterVolumeSpecName: "kube-api-access-g6ljs") pod "46dff21f-c9aa-443a-b1c7-988721788744" (UID: "46dff21f-c9aa-443a-b1c7-988721788744"). InnerVolumeSpecName "kube-api-access-g6ljs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:42:05 crc kubenswrapper[4898]: I0313 14:42:05.781974 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6ljs\" (UniqueName: \"kubernetes.io/projected/46dff21f-c9aa-443a-b1c7-988721788744-kube-api-access-g6ljs\") on node \"crc\" DevicePath \"\"" Mar 13 14:42:06 crc kubenswrapper[4898]: I0313 14:42:06.225085 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556882-6vdgp" event={"ID":"46dff21f-c9aa-443a-b1c7-988721788744","Type":"ContainerDied","Data":"226ecfa378c63a7ac9d79f6bc968ebd3e31e06dd97440f7c704521eaec1c25eb"} Mar 13 14:42:06 crc kubenswrapper[4898]: I0313 14:42:06.225146 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="226ecfa378c63a7ac9d79f6bc968ebd3e31e06dd97440f7c704521eaec1c25eb" Mar 13 14:42:06 crc kubenswrapper[4898]: I0313 14:42:06.225221 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556882-6vdgp" Mar 13 14:42:06 crc kubenswrapper[4898]: I0313 14:42:06.691225 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556876-j2sld"] Mar 13 14:42:06 crc kubenswrapper[4898]: I0313 14:42:06.701637 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556876-j2sld"] Mar 13 14:42:07 crc kubenswrapper[4898]: I0313 14:42:07.760985 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e83b21e9-13bc-4f80-a228-126fbc98c8f6" path="/var/lib/kubelet/pods/e83b21e9-13bc-4f80-a228-126fbc98c8f6/volumes" Mar 13 14:42:08 crc kubenswrapper[4898]: E0313 14:42:08.906334 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46dff21f_c9aa_443a_b1c7_988721788744.slice\": RecentStats: unable to find data in memory cache]" Mar 13 14:42:13 crc kubenswrapper[4898]: E0313 14:42:13.604833 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46dff21f_c9aa_443a_b1c7_988721788744.slice\": RecentStats: unable to find data in memory cache]" Mar 13 14:42:19 crc kubenswrapper[4898]: I0313 14:42:19.134623 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:42:19 crc kubenswrapper[4898]: I0313 14:42:19.134963 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:42:19 crc kubenswrapper[4898]: I0313 14:42:19.135019 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 14:42:19 crc kubenswrapper[4898]: I0313 14:42:19.136043 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"610faaa089f666c18409ed2be816ac54810449ed44b96f98a2303a40ac9c9836"} pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 14:42:19 crc kubenswrapper[4898]: I0313 14:42:19.136203 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" containerID="cri-o://610faaa089f666c18409ed2be816ac54810449ed44b96f98a2303a40ac9c9836" gracePeriod=600 Mar 13 14:42:19 crc kubenswrapper[4898]: E0313 14:42:19.167025 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46dff21f_c9aa_443a_b1c7_988721788744.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod767eecef_3bc9_4db4_a0cb_5d9c8554c62d.slice/crio-610faaa089f666c18409ed2be816ac54810449ed44b96f98a2303a40ac9c9836.scope\": RecentStats: unable to find data in memory cache]" Mar 13 14:42:19 crc kubenswrapper[4898]: I0313 14:42:19.396674 4898 generic.go:334] "Generic (PLEG): container finished" podID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerID="610faaa089f666c18409ed2be816ac54810449ed44b96f98a2303a40ac9c9836" exitCode=0 Mar 13 
14:42:19 crc kubenswrapper[4898]: I0313 14:42:19.396923 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerDied","Data":"610faaa089f666c18409ed2be816ac54810449ed44b96f98a2303a40ac9c9836"} Mar 13 14:42:19 crc kubenswrapper[4898]: I0313 14:42:19.397249 4898 scope.go:117] "RemoveContainer" containerID="8cc019eb997b8fb12b1a6d732269f56fd31f05139eb4098b4fb342e4364fe0db" Mar 13 14:42:20 crc kubenswrapper[4898]: I0313 14:42:20.408984 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5"} Mar 13 14:42:28 crc kubenswrapper[4898]: E0313 14:42:28.847246 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46dff21f_c9aa_443a_b1c7_988721788744.slice\": RecentStats: unable to find data in memory cache]" Mar 13 14:42:29 crc kubenswrapper[4898]: E0313 14:42:29.217259 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46dff21f_c9aa_443a_b1c7_988721788744.slice\": RecentStats: unable to find data in memory cache]" Mar 13 14:42:39 crc kubenswrapper[4898]: E0313 14:42:39.581959 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46dff21f_c9aa_443a_b1c7_988721788744.slice\": RecentStats: unable to find data in memory cache]" Mar 13 14:42:41 crc kubenswrapper[4898]: I0313 14:42:41.119850 4898 scope.go:117] "RemoveContainer" 
containerID="880ec0d7753626dc3ced87b5a1086a85612fac87d9f534c6f11457452f7a1041" Mar 13 14:42:43 crc kubenswrapper[4898]: E0313 14:42:43.602577 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46dff21f_c9aa_443a_b1c7_988721788744.slice\": RecentStats: unable to find data in memory cache]" Mar 13 14:42:48 crc kubenswrapper[4898]: E0313 14:42:48.254730 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46dff21f_c9aa_443a_b1c7_988721788744.slice\": RecentStats: unable to find data in memory cache]" Mar 13 14:42:48 crc kubenswrapper[4898]: E0313 14:42:48.254734 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46dff21f_c9aa_443a_b1c7_988721788744.slice\": RecentStats: unable to find data in memory cache]" Mar 13 14:42:49 crc kubenswrapper[4898]: E0313 14:42:49.628484 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46dff21f_c9aa_443a_b1c7_988721788744.slice\": RecentStats: unable to find data in memory cache]" Mar 13 14:42:58 crc kubenswrapper[4898]: E0313 14:42:58.946117 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46dff21f_c9aa_443a_b1c7_988721788744.slice\": RecentStats: unable to find data in memory cache]" Mar 13 14:42:59 crc kubenswrapper[4898]: E0313 14:42:59.687456 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46dff21f_c9aa_443a_b1c7_988721788744.slice\": RecentStats: unable to find data in memory cache]" Mar 13 14:43:50 crc kubenswrapper[4898]: I0313 14:43:50.193595 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fcfmz"] Mar 13 14:43:50 crc kubenswrapper[4898]: E0313 14:43:50.194603 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46dff21f-c9aa-443a-b1c7-988721788744" containerName="oc" Mar 13 14:43:50 crc kubenswrapper[4898]: I0313 14:43:50.194617 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="46dff21f-c9aa-443a-b1c7-988721788744" containerName="oc" Mar 13 14:43:50 crc kubenswrapper[4898]: I0313 14:43:50.194958 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="46dff21f-c9aa-443a-b1c7-988721788744" containerName="oc" Mar 13 14:43:50 crc kubenswrapper[4898]: I0313 14:43:50.197039 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fcfmz" Mar 13 14:43:50 crc kubenswrapper[4898]: I0313 14:43:50.211013 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fcfmz"] Mar 13 14:43:50 crc kubenswrapper[4898]: I0313 14:43:50.280750 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e-utilities\") pod \"community-operators-fcfmz\" (UID: \"2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e\") " pod="openshift-marketplace/community-operators-fcfmz" Mar 13 14:43:50 crc kubenswrapper[4898]: I0313 14:43:50.281238 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4szbp\" (UniqueName: \"kubernetes.io/projected/2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e-kube-api-access-4szbp\") pod \"community-operators-fcfmz\" (UID: \"2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e\") " pod="openshift-marketplace/community-operators-fcfmz" Mar 13 14:43:50 crc kubenswrapper[4898]: I0313 14:43:50.281533 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e-catalog-content\") pod \"community-operators-fcfmz\" (UID: \"2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e\") " pod="openshift-marketplace/community-operators-fcfmz" Mar 13 14:43:50 crc kubenswrapper[4898]: I0313 14:43:50.383297 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e-catalog-content\") pod \"community-operators-fcfmz\" (UID: \"2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e\") " pod="openshift-marketplace/community-operators-fcfmz" Mar 13 14:43:50 crc kubenswrapper[4898]: I0313 14:43:50.383390 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e-utilities\") pod \"community-operators-fcfmz\" (UID: \"2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e\") " pod="openshift-marketplace/community-operators-fcfmz" Mar 13 14:43:50 crc kubenswrapper[4898]: I0313 14:43:50.383890 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e-utilities\") pod \"community-operators-fcfmz\" (UID: \"2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e\") " pod="openshift-marketplace/community-operators-fcfmz" Mar 13 14:43:50 crc kubenswrapper[4898]: I0313 14:43:50.383927 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e-catalog-content\") pod \"community-operators-fcfmz\" (UID: \"2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e\") " pod="openshift-marketplace/community-operators-fcfmz" Mar 13 14:43:50 crc kubenswrapper[4898]: I0313 14:43:50.384801 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4szbp\" (UniqueName: \"kubernetes.io/projected/2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e-kube-api-access-4szbp\") pod \"community-operators-fcfmz\" (UID: \"2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e\") " pod="openshift-marketplace/community-operators-fcfmz" Mar 13 14:43:50 crc kubenswrapper[4898]: I0313 14:43:50.416966 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4szbp\" (UniqueName: \"kubernetes.io/projected/2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e-kube-api-access-4szbp\") pod \"community-operators-fcfmz\" (UID: \"2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e\") " pod="openshift-marketplace/community-operators-fcfmz" Mar 13 14:43:50 crc kubenswrapper[4898]: I0313 14:43:50.531020 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fcfmz" Mar 13 14:43:51 crc kubenswrapper[4898]: I0313 14:43:51.121603 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fcfmz"] Mar 13 14:43:51 crc kubenswrapper[4898]: I0313 14:43:51.952301 4898 generic.go:334] "Generic (PLEG): container finished" podID="2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e" containerID="de57d30bfb39ff3d400ef8caa8ccece22bc11c6e01ef51fff685669ac60fb3cb" exitCode=0 Mar 13 14:43:51 crc kubenswrapper[4898]: I0313 14:43:51.952441 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcfmz" event={"ID":"2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e","Type":"ContainerDied","Data":"de57d30bfb39ff3d400ef8caa8ccece22bc11c6e01ef51fff685669ac60fb3cb"} Mar 13 14:43:51 crc kubenswrapper[4898]: I0313 14:43:51.953431 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcfmz" event={"ID":"2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e","Type":"ContainerStarted","Data":"bba0555b7b58e4030810787bca615b1597cf609758e6d666bfe9aeec2f91ba59"} Mar 13 14:43:51 crc kubenswrapper[4898]: I0313 14:43:51.954719 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 14:43:57 crc kubenswrapper[4898]: I0313 14:43:57.018260 4898 generic.go:334] "Generic (PLEG): container finished" podID="2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e" containerID="ff8823ec8283c190f689d06300aff1538109e7ec55797335db2fe37e571d0171" exitCode=0 Mar 13 14:43:57 crc kubenswrapper[4898]: I0313 14:43:57.018336 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fcfmz" event={"ID":"2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e","Type":"ContainerDied","Data":"ff8823ec8283c190f689d06300aff1538109e7ec55797335db2fe37e571d0171"} Mar 13 14:43:59 crc kubenswrapper[4898]: I0313 14:43:59.048407 4898 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-fcfmz" event={"ID":"2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e","Type":"ContainerStarted","Data":"1d82c05cde25f5d47134bce4e0b97b775c14d95d3b7690a9f75b30eaf4f13545"} Mar 13 14:43:59 crc kubenswrapper[4898]: I0313 14:43:59.099091 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fcfmz" podStartSLOduration=3.503307437 podStartE2EDuration="9.099050494s" podCreationTimestamp="2026-03-13 14:43:50 +0000 UTC" firstStartedPulling="2026-03-13 14:43:51.954528748 +0000 UTC m=+2866.956116987" lastFinishedPulling="2026-03-13 14:43:57.550271795 +0000 UTC m=+2872.551860044" observedRunningTime="2026-03-13 14:43:59.075149712 +0000 UTC m=+2874.076738001" watchObservedRunningTime="2026-03-13 14:43:59.099050494 +0000 UTC m=+2874.100638813" Mar 13 14:44:00 crc kubenswrapper[4898]: I0313 14:44:00.158698 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556884-wvc75"] Mar 13 14:44:00 crc kubenswrapper[4898]: I0313 14:44:00.160521 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556884-wvc75" Mar 13 14:44:00 crc kubenswrapper[4898]: I0313 14:44:00.164210 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:44:00 crc kubenswrapper[4898]: I0313 14:44:00.164550 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:44:00 crc kubenswrapper[4898]: I0313 14:44:00.165439 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:44:00 crc kubenswrapper[4898]: I0313 14:44:00.206787 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556884-wvc75"] Mar 13 14:44:00 crc kubenswrapper[4898]: I0313 14:44:00.256869 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szgkr\" (UniqueName: \"kubernetes.io/projected/22d70d9e-a058-43a7-b692-19cd302d65ca-kube-api-access-szgkr\") pod \"auto-csr-approver-29556884-wvc75\" (UID: \"22d70d9e-a058-43a7-b692-19cd302d65ca\") " pod="openshift-infra/auto-csr-approver-29556884-wvc75" Mar 13 14:44:00 crc kubenswrapper[4898]: I0313 14:44:00.359009 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szgkr\" (UniqueName: \"kubernetes.io/projected/22d70d9e-a058-43a7-b692-19cd302d65ca-kube-api-access-szgkr\") pod \"auto-csr-approver-29556884-wvc75\" (UID: \"22d70d9e-a058-43a7-b692-19cd302d65ca\") " pod="openshift-infra/auto-csr-approver-29556884-wvc75" Mar 13 14:44:00 crc kubenswrapper[4898]: I0313 14:44:00.381866 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szgkr\" (UniqueName: \"kubernetes.io/projected/22d70d9e-a058-43a7-b692-19cd302d65ca-kube-api-access-szgkr\") pod \"auto-csr-approver-29556884-wvc75\" (UID: \"22d70d9e-a058-43a7-b692-19cd302d65ca\") " 
pod="openshift-infra/auto-csr-approver-29556884-wvc75" Mar 13 14:44:00 crc kubenswrapper[4898]: I0313 14:44:00.497374 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556884-wvc75" Mar 13 14:44:00 crc kubenswrapper[4898]: I0313 14:44:00.531573 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fcfmz" Mar 13 14:44:00 crc kubenswrapper[4898]: I0313 14:44:00.538129 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fcfmz" Mar 13 14:44:01 crc kubenswrapper[4898]: I0313 14:44:01.029014 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556884-wvc75"] Mar 13 14:44:01 crc kubenswrapper[4898]: I0313 14:44:01.070387 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556884-wvc75" event={"ID":"22d70d9e-a058-43a7-b692-19cd302d65ca","Type":"ContainerStarted","Data":"ba1e66524699b6445f436669f57528c41766bd573cf4969fd64f6902f9aa0242"} Mar 13 14:44:01 crc kubenswrapper[4898]: I0313 14:44:01.600294 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-fcfmz" podUID="2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e" containerName="registry-server" probeResult="failure" output=< Mar 13 14:44:01 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 14:44:01 crc kubenswrapper[4898]: > Mar 13 14:44:04 crc kubenswrapper[4898]: I0313 14:44:04.112047 4898 generic.go:334] "Generic (PLEG): container finished" podID="22d70d9e-a058-43a7-b692-19cd302d65ca" containerID="668ec92e00de6e908be7a4b238e021ba041b8ee50f571d510eea90e125398f41" exitCode=0 Mar 13 14:44:04 crc kubenswrapper[4898]: I0313 14:44:04.112276 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556884-wvc75" 
event={"ID":"22d70d9e-a058-43a7-b692-19cd302d65ca","Type":"ContainerDied","Data":"668ec92e00de6e908be7a4b238e021ba041b8ee50f571d510eea90e125398f41"} Mar 13 14:44:05 crc kubenswrapper[4898]: I0313 14:44:05.857128 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556884-wvc75" Mar 13 14:44:06 crc kubenswrapper[4898]: I0313 14:44:06.013574 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szgkr\" (UniqueName: \"kubernetes.io/projected/22d70d9e-a058-43a7-b692-19cd302d65ca-kube-api-access-szgkr\") pod \"22d70d9e-a058-43a7-b692-19cd302d65ca\" (UID: \"22d70d9e-a058-43a7-b692-19cd302d65ca\") " Mar 13 14:44:06 crc kubenswrapper[4898]: I0313 14:44:06.020429 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22d70d9e-a058-43a7-b692-19cd302d65ca-kube-api-access-szgkr" (OuterVolumeSpecName: "kube-api-access-szgkr") pod "22d70d9e-a058-43a7-b692-19cd302d65ca" (UID: "22d70d9e-a058-43a7-b692-19cd302d65ca"). InnerVolumeSpecName "kube-api-access-szgkr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:44:06 crc kubenswrapper[4898]: I0313 14:44:06.116467 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szgkr\" (UniqueName: \"kubernetes.io/projected/22d70d9e-a058-43a7-b692-19cd302d65ca-kube-api-access-szgkr\") on node \"crc\" DevicePath \"\"" Mar 13 14:44:06 crc kubenswrapper[4898]: I0313 14:44:06.140505 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556884-wvc75" event={"ID":"22d70d9e-a058-43a7-b692-19cd302d65ca","Type":"ContainerDied","Data":"ba1e66524699b6445f436669f57528c41766bd573cf4969fd64f6902f9aa0242"} Mar 13 14:44:06 crc kubenswrapper[4898]: I0313 14:44:06.140796 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba1e66524699b6445f436669f57528c41766bd573cf4969fd64f6902f9aa0242" Mar 13 14:44:06 crc kubenswrapper[4898]: I0313 14:44:06.140566 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556884-wvc75" Mar 13 14:44:06 crc kubenswrapper[4898]: I0313 14:44:06.955615 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556878-btqw2"] Mar 13 14:44:06 crc kubenswrapper[4898]: I0313 14:44:06.979419 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556878-btqw2"] Mar 13 14:44:07 crc kubenswrapper[4898]: I0313 14:44:07.756238 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39249464-ab82-4938-978e-2ffcbc637f4f" path="/var/lib/kubelet/pods/39249464-ab82-4938-978e-2ffcbc637f4f/volumes" Mar 13 14:44:10 crc kubenswrapper[4898]: I0313 14:44:10.598030 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fcfmz" Mar 13 14:44:10 crc kubenswrapper[4898]: I0313 14:44:10.684556 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/community-operators-fcfmz" Mar 13 14:44:10 crc kubenswrapper[4898]: I0313 14:44:10.798646 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fcfmz"] Mar 13 14:44:10 crc kubenswrapper[4898]: I0313 14:44:10.847299 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nf9mj"] Mar 13 14:44:10 crc kubenswrapper[4898]: I0313 14:44:10.847522 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nf9mj" podUID="112ac477-caf1-4778-9161-737e393633b6" containerName="registry-server" containerID="cri-o://fc60b3fabeb85294acc6203495a9c722283813bdd45f914886621b87378fb993" gracePeriod=2 Mar 13 14:44:11 crc kubenswrapper[4898]: E0313 14:44:11.187417 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod112ac477_caf1_4778_9161_737e393633b6.slice/crio-conmon-fc60b3fabeb85294acc6203495a9c722283813bdd45f914886621b87378fb993.scope\": RecentStats: unable to find data in memory cache]" Mar 13 14:44:11 crc kubenswrapper[4898]: I0313 14:44:11.227606 4898 generic.go:334] "Generic (PLEG): container finished" podID="112ac477-caf1-4778-9161-737e393633b6" containerID="fc60b3fabeb85294acc6203495a9c722283813bdd45f914886621b87378fb993" exitCode=0 Mar 13 14:44:11 crc kubenswrapper[4898]: I0313 14:44:11.227698 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nf9mj" event={"ID":"112ac477-caf1-4778-9161-737e393633b6","Type":"ContainerDied","Data":"fc60b3fabeb85294acc6203495a9c722283813bdd45f914886621b87378fb993"} Mar 13 14:44:11 crc kubenswrapper[4898]: I0313 14:44:11.390764 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nf9mj" Mar 13 14:44:11 crc kubenswrapper[4898]: I0313 14:44:11.496393 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/112ac477-caf1-4778-9161-737e393633b6-utilities\") pod \"112ac477-caf1-4778-9161-737e393633b6\" (UID: \"112ac477-caf1-4778-9161-737e393633b6\") " Mar 13 14:44:11 crc kubenswrapper[4898]: I0313 14:44:11.496545 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z75w9\" (UniqueName: \"kubernetes.io/projected/112ac477-caf1-4778-9161-737e393633b6-kube-api-access-z75w9\") pod \"112ac477-caf1-4778-9161-737e393633b6\" (UID: \"112ac477-caf1-4778-9161-737e393633b6\") " Mar 13 14:44:11 crc kubenswrapper[4898]: I0313 14:44:11.496581 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/112ac477-caf1-4778-9161-737e393633b6-catalog-content\") pod \"112ac477-caf1-4778-9161-737e393633b6\" (UID: \"112ac477-caf1-4778-9161-737e393633b6\") " Mar 13 14:44:11 crc kubenswrapper[4898]: I0313 14:44:11.497498 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/112ac477-caf1-4778-9161-737e393633b6-utilities" (OuterVolumeSpecName: "utilities") pod "112ac477-caf1-4778-9161-737e393633b6" (UID: "112ac477-caf1-4778-9161-737e393633b6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:44:11 crc kubenswrapper[4898]: I0313 14:44:11.510776 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/112ac477-caf1-4778-9161-737e393633b6-kube-api-access-z75w9" (OuterVolumeSpecName: "kube-api-access-z75w9") pod "112ac477-caf1-4778-9161-737e393633b6" (UID: "112ac477-caf1-4778-9161-737e393633b6"). InnerVolumeSpecName "kube-api-access-z75w9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:44:11 crc kubenswrapper[4898]: I0313 14:44:11.599369 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/112ac477-caf1-4778-9161-737e393633b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "112ac477-caf1-4778-9161-737e393633b6" (UID: "112ac477-caf1-4778-9161-737e393633b6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:44:11 crc kubenswrapper[4898]: I0313 14:44:11.601146 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/112ac477-caf1-4778-9161-737e393633b6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 14:44:11 crc kubenswrapper[4898]: I0313 14:44:11.601164 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/112ac477-caf1-4778-9161-737e393633b6-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 14:44:11 crc kubenswrapper[4898]: I0313 14:44:11.601173 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z75w9\" (UniqueName: \"kubernetes.io/projected/112ac477-caf1-4778-9161-737e393633b6-kube-api-access-z75w9\") on node \"crc\" DevicePath \"\"" Mar 13 14:44:12 crc kubenswrapper[4898]: I0313 14:44:12.240864 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nf9mj" event={"ID":"112ac477-caf1-4778-9161-737e393633b6","Type":"ContainerDied","Data":"cd6fb5a64e7e71f880fc652eb99feafe0af0462b20f85dff2e92b55a99558f6f"} Mar 13 14:44:12 crc kubenswrapper[4898]: I0313 14:44:12.240907 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nf9mj" Mar 13 14:44:12 crc kubenswrapper[4898]: I0313 14:44:12.241186 4898 scope.go:117] "RemoveContainer" containerID="fc60b3fabeb85294acc6203495a9c722283813bdd45f914886621b87378fb993" Mar 13 14:44:12 crc kubenswrapper[4898]: I0313 14:44:12.272637 4898 scope.go:117] "RemoveContainer" containerID="2e43cc9675fd8cbd58a911f4205da76807b5b902a7fc3bd0c4cb735298e250c5" Mar 13 14:44:12 crc kubenswrapper[4898]: I0313 14:44:12.275546 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nf9mj"] Mar 13 14:44:12 crc kubenswrapper[4898]: I0313 14:44:12.290242 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nf9mj"] Mar 13 14:44:12 crc kubenswrapper[4898]: I0313 14:44:12.303117 4898 scope.go:117] "RemoveContainer" containerID="2316a3e4d2fc964fff3bdea961936abc15de57cb9446d0c3ca366fa8840b5460" Mar 13 14:44:13 crc kubenswrapper[4898]: I0313 14:44:13.758996 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="112ac477-caf1-4778-9161-737e393633b6" path="/var/lib/kubelet/pods/112ac477-caf1-4778-9161-737e393633b6/volumes" Mar 13 14:44:19 crc kubenswrapper[4898]: I0313 14:44:19.135044 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:44:19 crc kubenswrapper[4898]: I0313 14:44:19.135961 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:44:22 crc kubenswrapper[4898]: 
I0313 14:44:22.102214 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jpqqv"] Mar 13 14:44:22 crc kubenswrapper[4898]: E0313 14:44:22.103694 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="112ac477-caf1-4778-9161-737e393633b6" containerName="registry-server" Mar 13 14:44:22 crc kubenswrapper[4898]: I0313 14:44:22.103716 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="112ac477-caf1-4778-9161-737e393633b6" containerName="registry-server" Mar 13 14:44:22 crc kubenswrapper[4898]: E0313 14:44:22.103740 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="112ac477-caf1-4778-9161-737e393633b6" containerName="extract-content" Mar 13 14:44:22 crc kubenswrapper[4898]: I0313 14:44:22.103751 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="112ac477-caf1-4778-9161-737e393633b6" containerName="extract-content" Mar 13 14:44:22 crc kubenswrapper[4898]: E0313 14:44:22.103779 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22d70d9e-a058-43a7-b692-19cd302d65ca" containerName="oc" Mar 13 14:44:22 crc kubenswrapper[4898]: I0313 14:44:22.103792 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="22d70d9e-a058-43a7-b692-19cd302d65ca" containerName="oc" Mar 13 14:44:22 crc kubenswrapper[4898]: E0313 14:44:22.103820 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="112ac477-caf1-4778-9161-737e393633b6" containerName="extract-utilities" Mar 13 14:44:22 crc kubenswrapper[4898]: I0313 14:44:22.103835 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="112ac477-caf1-4778-9161-737e393633b6" containerName="extract-utilities" Mar 13 14:44:22 crc kubenswrapper[4898]: I0313 14:44:22.104288 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="22d70d9e-a058-43a7-b692-19cd302d65ca" containerName="oc" Mar 13 14:44:22 crc kubenswrapper[4898]: I0313 14:44:22.104308 4898 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="112ac477-caf1-4778-9161-737e393633b6" containerName="registry-server" Mar 13 14:44:22 crc kubenswrapper[4898]: I0313 14:44:22.107236 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jpqqv" Mar 13 14:44:22 crc kubenswrapper[4898]: I0313 14:44:22.124426 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jpqqv"] Mar 13 14:44:22 crc kubenswrapper[4898]: I0313 14:44:22.181951 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff59b74b-017c-4c31-8171-8e2f6ee07a75-catalog-content\") pod \"redhat-operators-jpqqv\" (UID: \"ff59b74b-017c-4c31-8171-8e2f6ee07a75\") " pod="openshift-marketplace/redhat-operators-jpqqv" Mar 13 14:44:22 crc kubenswrapper[4898]: I0313 14:44:22.182120 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff59b74b-017c-4c31-8171-8e2f6ee07a75-utilities\") pod \"redhat-operators-jpqqv\" (UID: \"ff59b74b-017c-4c31-8171-8e2f6ee07a75\") " pod="openshift-marketplace/redhat-operators-jpqqv" Mar 13 14:44:22 crc kubenswrapper[4898]: I0313 14:44:22.182193 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnxp7\" (UniqueName: \"kubernetes.io/projected/ff59b74b-017c-4c31-8171-8e2f6ee07a75-kube-api-access-pnxp7\") pod \"redhat-operators-jpqqv\" (UID: \"ff59b74b-017c-4c31-8171-8e2f6ee07a75\") " pod="openshift-marketplace/redhat-operators-jpqqv" Mar 13 14:44:22 crc kubenswrapper[4898]: I0313 14:44:22.303318 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff59b74b-017c-4c31-8171-8e2f6ee07a75-catalog-content\") pod \"redhat-operators-jpqqv\" (UID: 
\"ff59b74b-017c-4c31-8171-8e2f6ee07a75\") " pod="openshift-marketplace/redhat-operators-jpqqv" Mar 13 14:44:22 crc kubenswrapper[4898]: I0313 14:44:22.303563 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff59b74b-017c-4c31-8171-8e2f6ee07a75-utilities\") pod \"redhat-operators-jpqqv\" (UID: \"ff59b74b-017c-4c31-8171-8e2f6ee07a75\") " pod="openshift-marketplace/redhat-operators-jpqqv" Mar 13 14:44:22 crc kubenswrapper[4898]: I0313 14:44:22.303678 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnxp7\" (UniqueName: \"kubernetes.io/projected/ff59b74b-017c-4c31-8171-8e2f6ee07a75-kube-api-access-pnxp7\") pod \"redhat-operators-jpqqv\" (UID: \"ff59b74b-017c-4c31-8171-8e2f6ee07a75\") " pod="openshift-marketplace/redhat-operators-jpqqv" Mar 13 14:44:22 crc kubenswrapper[4898]: I0313 14:44:22.303992 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff59b74b-017c-4c31-8171-8e2f6ee07a75-catalog-content\") pod \"redhat-operators-jpqqv\" (UID: \"ff59b74b-017c-4c31-8171-8e2f6ee07a75\") " pod="openshift-marketplace/redhat-operators-jpqqv" Mar 13 14:44:22 crc kubenswrapper[4898]: I0313 14:44:22.304124 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff59b74b-017c-4c31-8171-8e2f6ee07a75-utilities\") pod \"redhat-operators-jpqqv\" (UID: \"ff59b74b-017c-4c31-8171-8e2f6ee07a75\") " pod="openshift-marketplace/redhat-operators-jpqqv" Mar 13 14:44:22 crc kubenswrapper[4898]: I0313 14:44:22.330369 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnxp7\" (UniqueName: \"kubernetes.io/projected/ff59b74b-017c-4c31-8171-8e2f6ee07a75-kube-api-access-pnxp7\") pod \"redhat-operators-jpqqv\" (UID: \"ff59b74b-017c-4c31-8171-8e2f6ee07a75\") " 
pod="openshift-marketplace/redhat-operators-jpqqv" Mar 13 14:44:22 crc kubenswrapper[4898]: I0313 14:44:22.434374 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jpqqv" Mar 13 14:44:22 crc kubenswrapper[4898]: I0313 14:44:22.991676 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jpqqv"] Mar 13 14:44:23 crc kubenswrapper[4898]: I0313 14:44:23.383973 4898 generic.go:334] "Generic (PLEG): container finished" podID="acaa3912-3e27-4272-8e4a-3ab67fd34b92" containerID="70859fbfeb0fb1276f0e4311e36fe620c68c8fac7970aae8138f23a5b9f896be" exitCode=0 Mar 13 14:44:23 crc kubenswrapper[4898]: I0313 14:44:23.384040 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" event={"ID":"acaa3912-3e27-4272-8e4a-3ab67fd34b92","Type":"ContainerDied","Data":"70859fbfeb0fb1276f0e4311e36fe620c68c8fac7970aae8138f23a5b9f896be"} Mar 13 14:44:23 crc kubenswrapper[4898]: I0313 14:44:23.385789 4898 generic.go:334] "Generic (PLEG): container finished" podID="ff59b74b-017c-4c31-8171-8e2f6ee07a75" containerID="49661d26cda8c7fc0772102e8672bfdce259e32cdf6e7240a670328588f113ec" exitCode=0 Mar 13 14:44:23 crc kubenswrapper[4898]: I0313 14:44:23.385840 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jpqqv" event={"ID":"ff59b74b-017c-4c31-8171-8e2f6ee07a75","Type":"ContainerDied","Data":"49661d26cda8c7fc0772102e8672bfdce259e32cdf6e7240a670328588f113ec"} Mar 13 14:44:23 crc kubenswrapper[4898]: I0313 14:44:23.385890 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jpqqv" event={"ID":"ff59b74b-017c-4c31-8171-8e2f6ee07a75","Type":"ContainerStarted","Data":"934bdccbb64d24f7c4d0c481aec18c3675c14a5604b4ef1370a465435fcca680"} Mar 13 14:44:24 crc kubenswrapper[4898]: I0313 14:44:24.401940 4898 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-operators-jpqqv" event={"ID":"ff59b74b-017c-4c31-8171-8e2f6ee07a75","Type":"ContainerStarted","Data":"33a77b3f72c6613c240378bb6bf90de33fa04718f5308ea5a9802a4dbf9b904e"} Mar 13 14:44:24 crc kubenswrapper[4898]: I0313 14:44:24.927654 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:44:24 crc kubenswrapper[4898]: I0313 14:44:24.991835 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwz9x\" (UniqueName: \"kubernetes.io/projected/acaa3912-3e27-4272-8e4a-3ab67fd34b92-kube-api-access-kwz9x\") pod \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " Mar 13 14:44:24 crc kubenswrapper[4898]: I0313 14:44:24.992808 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-2\") pod \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " Mar 13 14:44:24 crc kubenswrapper[4898]: I0313 14:44:24.993092 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-3\") pod \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " Mar 13 14:44:24 crc kubenswrapper[4898]: I0313 14:44:24.993128 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-combined-ca-bundle\") pod \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " Mar 13 14:44:24 crc kubenswrapper[4898]: I0313 14:44:24.993262 4898 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-1\") pod \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " Mar 13 14:44:24 crc kubenswrapper[4898]: I0313 14:44:24.993299 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-migration-ssh-key-0\") pod \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " Mar 13 14:44:24 crc kubenswrapper[4898]: I0313 14:44:24.993324 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-extra-config-0\") pod \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " Mar 13 14:44:24 crc kubenswrapper[4898]: I0313 14:44:24.993457 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-0\") pod \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " Mar 13 14:44:24 crc kubenswrapper[4898]: I0313 14:44:24.993520 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-inventory\") pod \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " Mar 13 14:44:24 crc kubenswrapper[4898]: I0313 14:44:24.993549 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-migration-ssh-key-1\") pod \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " Mar 13 14:44:24 crc kubenswrapper[4898]: I0313 14:44:24.993612 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-ssh-key-openstack-edpm-ipam\") pod \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\" (UID: \"acaa3912-3e27-4272-8e4a-3ab67fd34b92\") " Mar 13 14:44:24 crc kubenswrapper[4898]: I0313 14:44:24.999829 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "acaa3912-3e27-4272-8e4a-3ab67fd34b92" (UID: "acaa3912-3e27-4272-8e4a-3ab67fd34b92"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.000460 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acaa3912-3e27-4272-8e4a-3ab67fd34b92-kube-api-access-kwz9x" (OuterVolumeSpecName: "kube-api-access-kwz9x") pod "acaa3912-3e27-4272-8e4a-3ab67fd34b92" (UID: "acaa3912-3e27-4272-8e4a-3ab67fd34b92"). InnerVolumeSpecName "kube-api-access-kwz9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.030797 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-inventory" (OuterVolumeSpecName: "inventory") pod "acaa3912-3e27-4272-8e4a-3ab67fd34b92" (UID: "acaa3912-3e27-4272-8e4a-3ab67fd34b92"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.041746 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "acaa3912-3e27-4272-8e4a-3ab67fd34b92" (UID: "acaa3912-3e27-4272-8e4a-3ab67fd34b92"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.043516 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "acaa3912-3e27-4272-8e4a-3ab67fd34b92" (UID: "acaa3912-3e27-4272-8e4a-3ab67fd34b92"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.054614 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "acaa3912-3e27-4272-8e4a-3ab67fd34b92" (UID: "acaa3912-3e27-4272-8e4a-3ab67fd34b92"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.056417 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "acaa3912-3e27-4272-8e4a-3ab67fd34b92" (UID: "acaa3912-3e27-4272-8e4a-3ab67fd34b92"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.063986 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "acaa3912-3e27-4272-8e4a-3ab67fd34b92" (UID: "acaa3912-3e27-4272-8e4a-3ab67fd34b92"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.067435 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "acaa3912-3e27-4272-8e4a-3ab67fd34b92" (UID: "acaa3912-3e27-4272-8e4a-3ab67fd34b92"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.069178 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "acaa3912-3e27-4272-8e4a-3ab67fd34b92" (UID: "acaa3912-3e27-4272-8e4a-3ab67fd34b92"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.075217 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "acaa3912-3e27-4272-8e4a-3ab67fd34b92" (UID: "acaa3912-3e27-4272-8e4a-3ab67fd34b92"). InnerVolumeSpecName "nova-cell1-compute-config-3". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.100715 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwz9x\" (UniqueName: \"kubernetes.io/projected/acaa3912-3e27-4272-8e4a-3ab67fd34b92-kube-api-access-kwz9x\") on node \"crc\" DevicePath \"\"" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.100746 4898 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.100757 4898 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.100766 4898 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.100775 4898 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.100784 4898 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.100792 4898 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.100802 4898 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.100810 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.100819 4898 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.100827 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/acaa3912-3e27-4272-8e4a-3ab67fd34b92-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.412190 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" event={"ID":"acaa3912-3e27-4272-8e4a-3ab67fd34b92","Type":"ContainerDied","Data":"2be41a54a106745535a45aba88fa9b87d13a15b26a46e0adacc9d9b51e76bede"} Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.412243 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2be41a54a106745535a45aba88fa9b87d13a15b26a46e0adacc9d9b51e76bede" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.412404 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-28xpg" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.529941 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk"] Mar 13 14:44:25 crc kubenswrapper[4898]: E0313 14:44:25.530417 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acaa3912-3e27-4272-8e4a-3ab67fd34b92" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.530435 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="acaa3912-3e27-4272-8e4a-3ab67fd34b92" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.530681 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="acaa3912-3e27-4272-8e4a-3ab67fd34b92" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.531532 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.535715 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.535846 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.537096 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.538400 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.538420 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zsddr" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.555291 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk"] Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.611061 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.611109 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ceilometer-compute-config-data-1\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.611130 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.611647 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.611967 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmwct\" (UniqueName: \"kubernetes.io/projected/9a62fd58-a586-4473-abfe-4e227cad9900-kube-api-access-vmwct\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.612074 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.612120 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.714780 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.714932 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmwct\" (UniqueName: \"kubernetes.io/projected/9a62fd58-a586-4473-abfe-4e227cad9900-kube-api-access-vmwct\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.715038 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.715104 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.715259 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.715312 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.715346 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.719420 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.719461 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.719636 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.720140 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.720533 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.722781 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.733640 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmwct\" (UniqueName: \"kubernetes.io/projected/9a62fd58-a586-4473-abfe-4e227cad9900-kube-api-access-vmwct\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:25 crc kubenswrapper[4898]: I0313 14:44:25.853649 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:44:26 crc kubenswrapper[4898]: I0313 14:44:26.417854 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk"] Mar 13 14:44:27 crc kubenswrapper[4898]: I0313 14:44:27.450233 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" event={"ID":"9a62fd58-a586-4473-abfe-4e227cad9900","Type":"ContainerStarted","Data":"eb475c12bddce2c97f158483e36ec6344049764d71c6a03112569a01e066938c"} Mar 13 14:44:28 crc kubenswrapper[4898]: I0313 14:44:28.473033 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" event={"ID":"9a62fd58-a586-4473-abfe-4e227cad9900","Type":"ContainerStarted","Data":"b68526abfbbd65229d0fa636d4985f81e76151523259cd0cea1b6513d33ed080"} Mar 13 14:44:28 crc kubenswrapper[4898]: I0313 14:44:28.513447 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" podStartSLOduration=2.838635402 podStartE2EDuration="3.513420596s" podCreationTimestamp="2026-03-13 14:44:25 +0000 UTC" firstStartedPulling="2026-03-13 14:44:26.426361908 +0000 UTC m=+2901.427950157" lastFinishedPulling="2026-03-13 14:44:27.101147092 +0000 UTC m=+2902.102735351" observedRunningTime="2026-03-13 14:44:28.507686359 +0000 UTC m=+2903.509274618" watchObservedRunningTime="2026-03-13 14:44:28.513420596 +0000 UTC m=+2903.515008845" Mar 13 14:44:29 crc kubenswrapper[4898]: I0313 14:44:29.498665 4898 generic.go:334] "Generic (PLEG): container finished" podID="ff59b74b-017c-4c31-8171-8e2f6ee07a75" containerID="33a77b3f72c6613c240378bb6bf90de33fa04718f5308ea5a9802a4dbf9b904e" exitCode=0 Mar 13 14:44:29 crc kubenswrapper[4898]: I0313 14:44:29.498734 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-jpqqv" event={"ID":"ff59b74b-017c-4c31-8171-8e2f6ee07a75","Type":"ContainerDied","Data":"33a77b3f72c6613c240378bb6bf90de33fa04718f5308ea5a9802a4dbf9b904e"} Mar 13 14:44:30 crc kubenswrapper[4898]: I0313 14:44:30.517032 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jpqqv" event={"ID":"ff59b74b-017c-4c31-8171-8e2f6ee07a75","Type":"ContainerStarted","Data":"5683c6073bc83bbe06ca94de9a1e4a7c969eadb55ff8f2c0ff39a667913c86af"} Mar 13 14:44:31 crc kubenswrapper[4898]: I0313 14:44:31.574359 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jpqqv" podStartSLOduration=3.02834831 podStartE2EDuration="9.574328953s" podCreationTimestamp="2026-03-13 14:44:22 +0000 UTC" firstStartedPulling="2026-03-13 14:44:23.387948016 +0000 UTC m=+2898.389536265" lastFinishedPulling="2026-03-13 14:44:29.933928649 +0000 UTC m=+2904.935516908" observedRunningTime="2026-03-13 14:44:31.558506678 +0000 UTC m=+2906.560094937" watchObservedRunningTime="2026-03-13 14:44:31.574328953 +0000 UTC m=+2906.575917262" Mar 13 14:44:32 crc kubenswrapper[4898]: I0313 14:44:32.436110 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jpqqv" Mar 13 14:44:32 crc kubenswrapper[4898]: I0313 14:44:32.436395 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jpqqv" Mar 13 14:44:33 crc kubenswrapper[4898]: I0313 14:44:33.490862 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jpqqv" podUID="ff59b74b-017c-4c31-8171-8e2f6ee07a75" containerName="registry-server" probeResult="failure" output=< Mar 13 14:44:33 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 14:44:33 crc kubenswrapper[4898]: > Mar 13 14:44:41 crc kubenswrapper[4898]: I0313 
14:44:41.263836 4898 scope.go:117] "RemoveContainer" containerID="9b860152e27ff03f1039fc3f0a1f9cf3aa08903cd38847b8eba8da6dc52d2b6e" Mar 13 14:44:42 crc kubenswrapper[4898]: I0313 14:44:42.535729 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jpqqv" Mar 13 14:44:42 crc kubenswrapper[4898]: I0313 14:44:42.632506 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jpqqv" Mar 13 14:44:42 crc kubenswrapper[4898]: I0313 14:44:42.787391 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jpqqv"] Mar 13 14:44:43 crc kubenswrapper[4898]: I0313 14:44:43.685405 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jpqqv" podUID="ff59b74b-017c-4c31-8171-8e2f6ee07a75" containerName="registry-server" containerID="cri-o://5683c6073bc83bbe06ca94de9a1e4a7c969eadb55ff8f2c0ff39a667913c86af" gracePeriod=2 Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.267966 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jpqqv" Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.376456 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnxp7\" (UniqueName: \"kubernetes.io/projected/ff59b74b-017c-4c31-8171-8e2f6ee07a75-kube-api-access-pnxp7\") pod \"ff59b74b-017c-4c31-8171-8e2f6ee07a75\" (UID: \"ff59b74b-017c-4c31-8171-8e2f6ee07a75\") " Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.376577 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff59b74b-017c-4c31-8171-8e2f6ee07a75-catalog-content\") pod \"ff59b74b-017c-4c31-8171-8e2f6ee07a75\" (UID: \"ff59b74b-017c-4c31-8171-8e2f6ee07a75\") " Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.376621 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff59b74b-017c-4c31-8171-8e2f6ee07a75-utilities\") pod \"ff59b74b-017c-4c31-8171-8e2f6ee07a75\" (UID: \"ff59b74b-017c-4c31-8171-8e2f6ee07a75\") " Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.377735 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff59b74b-017c-4c31-8171-8e2f6ee07a75-utilities" (OuterVolumeSpecName: "utilities") pod "ff59b74b-017c-4c31-8171-8e2f6ee07a75" (UID: "ff59b74b-017c-4c31-8171-8e2f6ee07a75"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.394155 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff59b74b-017c-4c31-8171-8e2f6ee07a75-kube-api-access-pnxp7" (OuterVolumeSpecName: "kube-api-access-pnxp7") pod "ff59b74b-017c-4c31-8171-8e2f6ee07a75" (UID: "ff59b74b-017c-4c31-8171-8e2f6ee07a75"). InnerVolumeSpecName "kube-api-access-pnxp7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.479887 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff59b74b-017c-4c31-8171-8e2f6ee07a75-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.479941 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnxp7\" (UniqueName: \"kubernetes.io/projected/ff59b74b-017c-4c31-8171-8e2f6ee07a75-kube-api-access-pnxp7\") on node \"crc\" DevicePath \"\"" Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.548743 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff59b74b-017c-4c31-8171-8e2f6ee07a75-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ff59b74b-017c-4c31-8171-8e2f6ee07a75" (UID: "ff59b74b-017c-4c31-8171-8e2f6ee07a75"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.582165 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff59b74b-017c-4c31-8171-8e2f6ee07a75-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.700830 4898 generic.go:334] "Generic (PLEG): container finished" podID="ff59b74b-017c-4c31-8171-8e2f6ee07a75" containerID="5683c6073bc83bbe06ca94de9a1e4a7c969eadb55ff8f2c0ff39a667913c86af" exitCode=0 Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.700878 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jpqqv" event={"ID":"ff59b74b-017c-4c31-8171-8e2f6ee07a75","Type":"ContainerDied","Data":"5683c6073bc83bbe06ca94de9a1e4a7c969eadb55ff8f2c0ff39a667913c86af"} Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.700997 4898 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-jpqqv" event={"ID":"ff59b74b-017c-4c31-8171-8e2f6ee07a75","Type":"ContainerDied","Data":"934bdccbb64d24f7c4d0c481aec18c3675c14a5604b4ef1370a465435fcca680"} Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.701031 4898 scope.go:117] "RemoveContainer" containerID="5683c6073bc83bbe06ca94de9a1e4a7c969eadb55ff8f2c0ff39a667913c86af" Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.701031 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jpqqv" Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.734972 4898 scope.go:117] "RemoveContainer" containerID="33a77b3f72c6613c240378bb6bf90de33fa04718f5308ea5a9802a4dbf9b904e" Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.763277 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jpqqv"] Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.771699 4898 scope.go:117] "RemoveContainer" containerID="49661d26cda8c7fc0772102e8672bfdce259e32cdf6e7240a670328588f113ec" Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.772092 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jpqqv"] Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.859733 4898 scope.go:117] "RemoveContainer" containerID="5683c6073bc83bbe06ca94de9a1e4a7c969eadb55ff8f2c0ff39a667913c86af" Mar 13 14:44:44 crc kubenswrapper[4898]: E0313 14:44:44.860345 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5683c6073bc83bbe06ca94de9a1e4a7c969eadb55ff8f2c0ff39a667913c86af\": container with ID starting with 5683c6073bc83bbe06ca94de9a1e4a7c969eadb55ff8f2c0ff39a667913c86af not found: ID does not exist" containerID="5683c6073bc83bbe06ca94de9a1e4a7c969eadb55ff8f2c0ff39a667913c86af" Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.860446 4898 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5683c6073bc83bbe06ca94de9a1e4a7c969eadb55ff8f2c0ff39a667913c86af"} err="failed to get container status \"5683c6073bc83bbe06ca94de9a1e4a7c969eadb55ff8f2c0ff39a667913c86af\": rpc error: code = NotFound desc = could not find container \"5683c6073bc83bbe06ca94de9a1e4a7c969eadb55ff8f2c0ff39a667913c86af\": container with ID starting with 5683c6073bc83bbe06ca94de9a1e4a7c969eadb55ff8f2c0ff39a667913c86af not found: ID does not exist" Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.860522 4898 scope.go:117] "RemoveContainer" containerID="33a77b3f72c6613c240378bb6bf90de33fa04718f5308ea5a9802a4dbf9b904e" Mar 13 14:44:44 crc kubenswrapper[4898]: E0313 14:44:44.861202 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33a77b3f72c6613c240378bb6bf90de33fa04718f5308ea5a9802a4dbf9b904e\": container with ID starting with 33a77b3f72c6613c240378bb6bf90de33fa04718f5308ea5a9802a4dbf9b904e not found: ID does not exist" containerID="33a77b3f72c6613c240378bb6bf90de33fa04718f5308ea5a9802a4dbf9b904e" Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.861295 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33a77b3f72c6613c240378bb6bf90de33fa04718f5308ea5a9802a4dbf9b904e"} err="failed to get container status \"33a77b3f72c6613c240378bb6bf90de33fa04718f5308ea5a9802a4dbf9b904e\": rpc error: code = NotFound desc = could not find container \"33a77b3f72c6613c240378bb6bf90de33fa04718f5308ea5a9802a4dbf9b904e\": container with ID starting with 33a77b3f72c6613c240378bb6bf90de33fa04718f5308ea5a9802a4dbf9b904e not found: ID does not exist" Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.861354 4898 scope.go:117] "RemoveContainer" containerID="49661d26cda8c7fc0772102e8672bfdce259e32cdf6e7240a670328588f113ec" Mar 13 14:44:44 crc kubenswrapper[4898]: E0313 
14:44:44.861828 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49661d26cda8c7fc0772102e8672bfdce259e32cdf6e7240a670328588f113ec\": container with ID starting with 49661d26cda8c7fc0772102e8672bfdce259e32cdf6e7240a670328588f113ec not found: ID does not exist" containerID="49661d26cda8c7fc0772102e8672bfdce259e32cdf6e7240a670328588f113ec" Mar 13 14:44:44 crc kubenswrapper[4898]: I0313 14:44:44.861858 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49661d26cda8c7fc0772102e8672bfdce259e32cdf6e7240a670328588f113ec"} err="failed to get container status \"49661d26cda8c7fc0772102e8672bfdce259e32cdf6e7240a670328588f113ec\": rpc error: code = NotFound desc = could not find container \"49661d26cda8c7fc0772102e8672bfdce259e32cdf6e7240a670328588f113ec\": container with ID starting with 49661d26cda8c7fc0772102e8672bfdce259e32cdf6e7240a670328588f113ec not found: ID does not exist" Mar 13 14:44:45 crc kubenswrapper[4898]: I0313 14:44:45.760646 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff59b74b-017c-4c31-8171-8e2f6ee07a75" path="/var/lib/kubelet/pods/ff59b74b-017c-4c31-8171-8e2f6ee07a75/volumes" Mar 13 14:44:49 crc kubenswrapper[4898]: I0313 14:44:49.134582 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:44:49 crc kubenswrapper[4898]: I0313 14:44:49.135404 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 13 14:44:51 crc kubenswrapper[4898]: I0313 14:44:51.946384 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ncdbz"] Mar 13 14:44:51 crc kubenswrapper[4898]: E0313 14:44:51.948092 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff59b74b-017c-4c31-8171-8e2f6ee07a75" containerName="registry-server" Mar 13 14:44:51 crc kubenswrapper[4898]: I0313 14:44:51.948120 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff59b74b-017c-4c31-8171-8e2f6ee07a75" containerName="registry-server" Mar 13 14:44:51 crc kubenswrapper[4898]: E0313 14:44:51.948183 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff59b74b-017c-4c31-8171-8e2f6ee07a75" containerName="extract-content" Mar 13 14:44:51 crc kubenswrapper[4898]: I0313 14:44:51.948196 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff59b74b-017c-4c31-8171-8e2f6ee07a75" containerName="extract-content" Mar 13 14:44:51 crc kubenswrapper[4898]: E0313 14:44:51.948236 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff59b74b-017c-4c31-8171-8e2f6ee07a75" containerName="extract-utilities" Mar 13 14:44:51 crc kubenswrapper[4898]: I0313 14:44:51.948252 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff59b74b-017c-4c31-8171-8e2f6ee07a75" containerName="extract-utilities" Mar 13 14:44:51 crc kubenswrapper[4898]: I0313 14:44:51.948680 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff59b74b-017c-4c31-8171-8e2f6ee07a75" containerName="registry-server" Mar 13 14:44:51 crc kubenswrapper[4898]: I0313 14:44:51.952045 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ncdbz" Mar 13 14:44:51 crc kubenswrapper[4898]: I0313 14:44:51.965893 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ncdbz"] Mar 13 14:44:52 crc kubenswrapper[4898]: I0313 14:44:52.007537 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4f5x\" (UniqueName: \"kubernetes.io/projected/3a91e6fb-a1ff-4dec-a854-024ff312a9b6-kube-api-access-q4f5x\") pod \"redhat-marketplace-ncdbz\" (UID: \"3a91e6fb-a1ff-4dec-a854-024ff312a9b6\") " pod="openshift-marketplace/redhat-marketplace-ncdbz" Mar 13 14:44:52 crc kubenswrapper[4898]: I0313 14:44:52.007817 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a91e6fb-a1ff-4dec-a854-024ff312a9b6-utilities\") pod \"redhat-marketplace-ncdbz\" (UID: \"3a91e6fb-a1ff-4dec-a854-024ff312a9b6\") " pod="openshift-marketplace/redhat-marketplace-ncdbz" Mar 13 14:44:52 crc kubenswrapper[4898]: I0313 14:44:52.008296 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a91e6fb-a1ff-4dec-a854-024ff312a9b6-catalog-content\") pod \"redhat-marketplace-ncdbz\" (UID: \"3a91e6fb-a1ff-4dec-a854-024ff312a9b6\") " pod="openshift-marketplace/redhat-marketplace-ncdbz" Mar 13 14:44:52 crc kubenswrapper[4898]: I0313 14:44:52.110925 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a91e6fb-a1ff-4dec-a854-024ff312a9b6-catalog-content\") pod \"redhat-marketplace-ncdbz\" (UID: \"3a91e6fb-a1ff-4dec-a854-024ff312a9b6\") " pod="openshift-marketplace/redhat-marketplace-ncdbz" Mar 13 14:44:52 crc kubenswrapper[4898]: I0313 14:44:52.111056 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-q4f5x\" (UniqueName: \"kubernetes.io/projected/3a91e6fb-a1ff-4dec-a854-024ff312a9b6-kube-api-access-q4f5x\") pod \"redhat-marketplace-ncdbz\" (UID: \"3a91e6fb-a1ff-4dec-a854-024ff312a9b6\") " pod="openshift-marketplace/redhat-marketplace-ncdbz" Mar 13 14:44:52 crc kubenswrapper[4898]: I0313 14:44:52.111524 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a91e6fb-a1ff-4dec-a854-024ff312a9b6-utilities\") pod \"redhat-marketplace-ncdbz\" (UID: \"3a91e6fb-a1ff-4dec-a854-024ff312a9b6\") " pod="openshift-marketplace/redhat-marketplace-ncdbz" Mar 13 14:44:52 crc kubenswrapper[4898]: I0313 14:44:52.111604 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a91e6fb-a1ff-4dec-a854-024ff312a9b6-catalog-content\") pod \"redhat-marketplace-ncdbz\" (UID: \"3a91e6fb-a1ff-4dec-a854-024ff312a9b6\") " pod="openshift-marketplace/redhat-marketplace-ncdbz" Mar 13 14:44:52 crc kubenswrapper[4898]: I0313 14:44:52.111928 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a91e6fb-a1ff-4dec-a854-024ff312a9b6-utilities\") pod \"redhat-marketplace-ncdbz\" (UID: \"3a91e6fb-a1ff-4dec-a854-024ff312a9b6\") " pod="openshift-marketplace/redhat-marketplace-ncdbz" Mar 13 14:44:52 crc kubenswrapper[4898]: I0313 14:44:52.140165 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4f5x\" (UniqueName: \"kubernetes.io/projected/3a91e6fb-a1ff-4dec-a854-024ff312a9b6-kube-api-access-q4f5x\") pod \"redhat-marketplace-ncdbz\" (UID: \"3a91e6fb-a1ff-4dec-a854-024ff312a9b6\") " pod="openshift-marketplace/redhat-marketplace-ncdbz" Mar 13 14:44:52 crc kubenswrapper[4898]: I0313 14:44:52.286317 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ncdbz" Mar 13 14:44:52 crc kubenswrapper[4898]: W0313 14:44:52.818102 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a91e6fb_a1ff_4dec_a854_024ff312a9b6.slice/crio-4affc25dada5bfb7785755f3b2b81aead858ebbb458092ae3682efd511112a6e WatchSource:0}: Error finding container 4affc25dada5bfb7785755f3b2b81aead858ebbb458092ae3682efd511112a6e: Status 404 returned error can't find the container with id 4affc25dada5bfb7785755f3b2b81aead858ebbb458092ae3682efd511112a6e Mar 13 14:44:52 crc kubenswrapper[4898]: I0313 14:44:52.821518 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ncdbz"] Mar 13 14:44:53 crc kubenswrapper[4898]: I0313 14:44:53.846063 4898 generic.go:334] "Generic (PLEG): container finished" podID="3a91e6fb-a1ff-4dec-a854-024ff312a9b6" containerID="9ff48a4e3b925be9aa523bb5c54c537c3e87325476999ae2aa5cff85960abc79" exitCode=0 Mar 13 14:44:53 crc kubenswrapper[4898]: I0313 14:44:53.846265 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ncdbz" event={"ID":"3a91e6fb-a1ff-4dec-a854-024ff312a9b6","Type":"ContainerDied","Data":"9ff48a4e3b925be9aa523bb5c54c537c3e87325476999ae2aa5cff85960abc79"} Mar 13 14:44:53 crc kubenswrapper[4898]: I0313 14:44:53.846567 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ncdbz" event={"ID":"3a91e6fb-a1ff-4dec-a854-024ff312a9b6","Type":"ContainerStarted","Data":"4affc25dada5bfb7785755f3b2b81aead858ebbb458092ae3682efd511112a6e"} Mar 13 14:44:54 crc kubenswrapper[4898]: I0313 14:44:54.857979 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ncdbz" 
event={"ID":"3a91e6fb-a1ff-4dec-a854-024ff312a9b6","Type":"ContainerStarted","Data":"dd734696b7f03a85bec3d008a2f4e9e12a1c13aee061ccea952c8ba25d5240d4"} Mar 13 14:44:55 crc kubenswrapper[4898]: I0313 14:44:55.874025 4898 generic.go:334] "Generic (PLEG): container finished" podID="3a91e6fb-a1ff-4dec-a854-024ff312a9b6" containerID="dd734696b7f03a85bec3d008a2f4e9e12a1c13aee061ccea952c8ba25d5240d4" exitCode=0 Mar 13 14:44:55 crc kubenswrapper[4898]: I0313 14:44:55.874054 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ncdbz" event={"ID":"3a91e6fb-a1ff-4dec-a854-024ff312a9b6","Type":"ContainerDied","Data":"dd734696b7f03a85bec3d008a2f4e9e12a1c13aee061ccea952c8ba25d5240d4"} Mar 13 14:44:56 crc kubenswrapper[4898]: I0313 14:44:56.885711 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ncdbz" event={"ID":"3a91e6fb-a1ff-4dec-a854-024ff312a9b6","Type":"ContainerStarted","Data":"f1cc536f9a14da4dcfb4c53f9da5f50d2401fc70bf15eef66b76fa00896de54a"} Mar 13 14:44:56 crc kubenswrapper[4898]: I0313 14:44:56.917489 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ncdbz" podStartSLOduration=3.4543903990000002 podStartE2EDuration="5.917471683s" podCreationTimestamp="2026-03-13 14:44:51 +0000 UTC" firstStartedPulling="2026-03-13 14:44:53.850697884 +0000 UTC m=+2928.852286133" lastFinishedPulling="2026-03-13 14:44:56.313779168 +0000 UTC m=+2931.315367417" observedRunningTime="2026-03-13 14:44:56.911080849 +0000 UTC m=+2931.912669098" watchObservedRunningTime="2026-03-13 14:44:56.917471683 +0000 UTC m=+2931.919059922" Mar 13 14:45:00 crc kubenswrapper[4898]: I0313 14:45:00.178338 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556885-ncmr6"] Mar 13 14:45:00 crc kubenswrapper[4898]: I0313 14:45:00.181484 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556885-ncmr6" Mar 13 14:45:00 crc kubenswrapper[4898]: I0313 14:45:00.185326 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 14:45:00 crc kubenswrapper[4898]: I0313 14:45:00.185884 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 14:45:00 crc kubenswrapper[4898]: I0313 14:45:00.215441 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556885-ncmr6"] Mar 13 14:45:00 crc kubenswrapper[4898]: I0313 14:45:00.325141 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1711d9ce-262c-4c6c-930a-4148e62fae9e-config-volume\") pod \"collect-profiles-29556885-ncmr6\" (UID: \"1711d9ce-262c-4c6c-930a-4148e62fae9e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556885-ncmr6" Mar 13 14:45:00 crc kubenswrapper[4898]: I0313 14:45:00.325496 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1711d9ce-262c-4c6c-930a-4148e62fae9e-secret-volume\") pod \"collect-profiles-29556885-ncmr6\" (UID: \"1711d9ce-262c-4c6c-930a-4148e62fae9e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556885-ncmr6" Mar 13 14:45:00 crc kubenswrapper[4898]: I0313 14:45:00.325722 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm77c\" (UniqueName: \"kubernetes.io/projected/1711d9ce-262c-4c6c-930a-4148e62fae9e-kube-api-access-tm77c\") pod \"collect-profiles-29556885-ncmr6\" (UID: \"1711d9ce-262c-4c6c-930a-4148e62fae9e\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29556885-ncmr6" Mar 13 14:45:00 crc kubenswrapper[4898]: I0313 14:45:00.427889 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1711d9ce-262c-4c6c-930a-4148e62fae9e-secret-volume\") pod \"collect-profiles-29556885-ncmr6\" (UID: \"1711d9ce-262c-4c6c-930a-4148e62fae9e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556885-ncmr6" Mar 13 14:45:00 crc kubenswrapper[4898]: I0313 14:45:00.428066 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tm77c\" (UniqueName: \"kubernetes.io/projected/1711d9ce-262c-4c6c-930a-4148e62fae9e-kube-api-access-tm77c\") pod \"collect-profiles-29556885-ncmr6\" (UID: \"1711d9ce-262c-4c6c-930a-4148e62fae9e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556885-ncmr6" Mar 13 14:45:00 crc kubenswrapper[4898]: I0313 14:45:00.428240 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1711d9ce-262c-4c6c-930a-4148e62fae9e-config-volume\") pod \"collect-profiles-29556885-ncmr6\" (UID: \"1711d9ce-262c-4c6c-930a-4148e62fae9e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556885-ncmr6" Mar 13 14:45:00 crc kubenswrapper[4898]: I0313 14:45:00.429124 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1711d9ce-262c-4c6c-930a-4148e62fae9e-config-volume\") pod \"collect-profiles-29556885-ncmr6\" (UID: \"1711d9ce-262c-4c6c-930a-4148e62fae9e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556885-ncmr6" Mar 13 14:45:00 crc kubenswrapper[4898]: I0313 14:45:00.437782 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/1711d9ce-262c-4c6c-930a-4148e62fae9e-secret-volume\") pod \"collect-profiles-29556885-ncmr6\" (UID: \"1711d9ce-262c-4c6c-930a-4148e62fae9e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556885-ncmr6" Mar 13 14:45:00 crc kubenswrapper[4898]: I0313 14:45:00.452644 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm77c\" (UniqueName: \"kubernetes.io/projected/1711d9ce-262c-4c6c-930a-4148e62fae9e-kube-api-access-tm77c\") pod \"collect-profiles-29556885-ncmr6\" (UID: \"1711d9ce-262c-4c6c-930a-4148e62fae9e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556885-ncmr6" Mar 13 14:45:00 crc kubenswrapper[4898]: I0313 14:45:00.512793 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556885-ncmr6" Mar 13 14:45:01 crc kubenswrapper[4898]: I0313 14:45:01.019841 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556885-ncmr6"] Mar 13 14:45:01 crc kubenswrapper[4898]: W0313 14:45:01.025793 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1711d9ce_262c_4c6c_930a_4148e62fae9e.slice/crio-62729d26c2d8567b0f26cb53313f6a0403258a8124e902c3a511817667c898c7 WatchSource:0}: Error finding container 62729d26c2d8567b0f26cb53313f6a0403258a8124e902c3a511817667c898c7: Status 404 returned error can't find the container with id 62729d26c2d8567b0f26cb53313f6a0403258a8124e902c3a511817667c898c7 Mar 13 14:45:01 crc kubenswrapper[4898]: I0313 14:45:01.955361 4898 generic.go:334] "Generic (PLEG): container finished" podID="1711d9ce-262c-4c6c-930a-4148e62fae9e" containerID="586dd830bc412bf8d165f328ec0120d6ccafcd1b6e8c6a0642a7f4464c15681b" exitCode=0 Mar 13 14:45:01 crc kubenswrapper[4898]: I0313 14:45:01.955465 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29556885-ncmr6" event={"ID":"1711d9ce-262c-4c6c-930a-4148e62fae9e","Type":"ContainerDied","Data":"586dd830bc412bf8d165f328ec0120d6ccafcd1b6e8c6a0642a7f4464c15681b"} Mar 13 14:45:01 crc kubenswrapper[4898]: I0313 14:45:01.957397 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556885-ncmr6" event={"ID":"1711d9ce-262c-4c6c-930a-4148e62fae9e","Type":"ContainerStarted","Data":"62729d26c2d8567b0f26cb53313f6a0403258a8124e902c3a511817667c898c7"} Mar 13 14:45:02 crc kubenswrapper[4898]: I0313 14:45:02.288343 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ncdbz" Mar 13 14:45:02 crc kubenswrapper[4898]: I0313 14:45:02.288742 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ncdbz" Mar 13 14:45:02 crc kubenswrapper[4898]: I0313 14:45:02.380550 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ncdbz" Mar 13 14:45:03 crc kubenswrapper[4898]: I0313 14:45:03.039878 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ncdbz" Mar 13 14:45:03 crc kubenswrapper[4898]: I0313 14:45:03.104248 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ncdbz"] Mar 13 14:45:03 crc kubenswrapper[4898]: I0313 14:45:03.439453 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556885-ncmr6" Mar 13 14:45:03 crc kubenswrapper[4898]: I0313 14:45:03.541119 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1711d9ce-262c-4c6c-930a-4148e62fae9e-secret-volume\") pod \"1711d9ce-262c-4c6c-930a-4148e62fae9e\" (UID: \"1711d9ce-262c-4c6c-930a-4148e62fae9e\") " Mar 13 14:45:03 crc kubenswrapper[4898]: I0313 14:45:03.541176 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tm77c\" (UniqueName: \"kubernetes.io/projected/1711d9ce-262c-4c6c-930a-4148e62fae9e-kube-api-access-tm77c\") pod \"1711d9ce-262c-4c6c-930a-4148e62fae9e\" (UID: \"1711d9ce-262c-4c6c-930a-4148e62fae9e\") " Mar 13 14:45:03 crc kubenswrapper[4898]: I0313 14:45:03.541411 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1711d9ce-262c-4c6c-930a-4148e62fae9e-config-volume\") pod \"1711d9ce-262c-4c6c-930a-4148e62fae9e\" (UID: \"1711d9ce-262c-4c6c-930a-4148e62fae9e\") " Mar 13 14:45:03 crc kubenswrapper[4898]: I0313 14:45:03.542079 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1711d9ce-262c-4c6c-930a-4148e62fae9e-config-volume" (OuterVolumeSpecName: "config-volume") pod "1711d9ce-262c-4c6c-930a-4148e62fae9e" (UID: "1711d9ce-262c-4c6c-930a-4148e62fae9e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 14:45:03 crc kubenswrapper[4898]: I0313 14:45:03.548982 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1711d9ce-262c-4c6c-930a-4148e62fae9e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1711d9ce-262c-4c6c-930a-4148e62fae9e" (UID: "1711d9ce-262c-4c6c-930a-4148e62fae9e"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:45:03 crc kubenswrapper[4898]: I0313 14:45:03.551220 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1711d9ce-262c-4c6c-930a-4148e62fae9e-kube-api-access-tm77c" (OuterVolumeSpecName: "kube-api-access-tm77c") pod "1711d9ce-262c-4c6c-930a-4148e62fae9e" (UID: "1711d9ce-262c-4c6c-930a-4148e62fae9e"). InnerVolumeSpecName "kube-api-access-tm77c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:45:03 crc kubenswrapper[4898]: I0313 14:45:03.645550 4898 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1711d9ce-262c-4c6c-930a-4148e62fae9e-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 14:45:03 crc kubenswrapper[4898]: I0313 14:45:03.645592 4898 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1711d9ce-262c-4c6c-930a-4148e62fae9e-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 14:45:03 crc kubenswrapper[4898]: I0313 14:45:03.645606 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tm77c\" (UniqueName: \"kubernetes.io/projected/1711d9ce-262c-4c6c-930a-4148e62fae9e-kube-api-access-tm77c\") on node \"crc\" DevicePath \"\"" Mar 13 14:45:03 crc kubenswrapper[4898]: I0313 14:45:03.995213 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556885-ncmr6" Mar 13 14:45:03 crc kubenswrapper[4898]: I0313 14:45:03.995835 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556885-ncmr6" event={"ID":"1711d9ce-262c-4c6c-930a-4148e62fae9e","Type":"ContainerDied","Data":"62729d26c2d8567b0f26cb53313f6a0403258a8124e902c3a511817667c898c7"} Mar 13 14:45:03 crc kubenswrapper[4898]: I0313 14:45:03.995871 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62729d26c2d8567b0f26cb53313f6a0403258a8124e902c3a511817667c898c7" Mar 13 14:45:04 crc kubenswrapper[4898]: I0313 14:45:04.541831 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556840-sfz5h"] Mar 13 14:45:04 crc kubenswrapper[4898]: I0313 14:45:04.560632 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556840-sfz5h"] Mar 13 14:45:05 crc kubenswrapper[4898]: I0313 14:45:05.010023 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ncdbz" podUID="3a91e6fb-a1ff-4dec-a854-024ff312a9b6" containerName="registry-server" containerID="cri-o://f1cc536f9a14da4dcfb4c53f9da5f50d2401fc70bf15eef66b76fa00896de54a" gracePeriod=2 Mar 13 14:45:05 crc kubenswrapper[4898]: I0313 14:45:05.669867 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ncdbz" Mar 13 14:45:05 crc kubenswrapper[4898]: I0313 14:45:05.756816 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c222126e-abe0-43e6-95c8-cc6946c967ae" path="/var/lib/kubelet/pods/c222126e-abe0-43e6-95c8-cc6946c967ae/volumes" Mar 13 14:45:05 crc kubenswrapper[4898]: I0313 14:45:05.804468 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4f5x\" (UniqueName: \"kubernetes.io/projected/3a91e6fb-a1ff-4dec-a854-024ff312a9b6-kube-api-access-q4f5x\") pod \"3a91e6fb-a1ff-4dec-a854-024ff312a9b6\" (UID: \"3a91e6fb-a1ff-4dec-a854-024ff312a9b6\") " Mar 13 14:45:05 crc kubenswrapper[4898]: I0313 14:45:05.804566 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a91e6fb-a1ff-4dec-a854-024ff312a9b6-catalog-content\") pod \"3a91e6fb-a1ff-4dec-a854-024ff312a9b6\" (UID: \"3a91e6fb-a1ff-4dec-a854-024ff312a9b6\") " Mar 13 14:45:05 crc kubenswrapper[4898]: I0313 14:45:05.804771 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a91e6fb-a1ff-4dec-a854-024ff312a9b6-utilities\") pod \"3a91e6fb-a1ff-4dec-a854-024ff312a9b6\" (UID: \"3a91e6fb-a1ff-4dec-a854-024ff312a9b6\") " Mar 13 14:45:05 crc kubenswrapper[4898]: I0313 14:45:05.805595 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a91e6fb-a1ff-4dec-a854-024ff312a9b6-utilities" (OuterVolumeSpecName: "utilities") pod "3a91e6fb-a1ff-4dec-a854-024ff312a9b6" (UID: "3a91e6fb-a1ff-4dec-a854-024ff312a9b6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:45:05 crc kubenswrapper[4898]: I0313 14:45:05.812321 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a91e6fb-a1ff-4dec-a854-024ff312a9b6-kube-api-access-q4f5x" (OuterVolumeSpecName: "kube-api-access-q4f5x") pod "3a91e6fb-a1ff-4dec-a854-024ff312a9b6" (UID: "3a91e6fb-a1ff-4dec-a854-024ff312a9b6"). InnerVolumeSpecName "kube-api-access-q4f5x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:45:05 crc kubenswrapper[4898]: I0313 14:45:05.908357 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a91e6fb-a1ff-4dec-a854-024ff312a9b6-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 14:45:05 crc kubenswrapper[4898]: I0313 14:45:05.908614 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4f5x\" (UniqueName: \"kubernetes.io/projected/3a91e6fb-a1ff-4dec-a854-024ff312a9b6-kube-api-access-q4f5x\") on node \"crc\" DevicePath \"\"" Mar 13 14:45:06 crc kubenswrapper[4898]: I0313 14:45:06.024389 4898 generic.go:334] "Generic (PLEG): container finished" podID="3a91e6fb-a1ff-4dec-a854-024ff312a9b6" containerID="f1cc536f9a14da4dcfb4c53f9da5f50d2401fc70bf15eef66b76fa00896de54a" exitCode=0 Mar 13 14:45:06 crc kubenswrapper[4898]: I0313 14:45:06.024433 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ncdbz" event={"ID":"3a91e6fb-a1ff-4dec-a854-024ff312a9b6","Type":"ContainerDied","Data":"f1cc536f9a14da4dcfb4c53f9da5f50d2401fc70bf15eef66b76fa00896de54a"} Mar 13 14:45:06 crc kubenswrapper[4898]: I0313 14:45:06.024493 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ncdbz" event={"ID":"3a91e6fb-a1ff-4dec-a854-024ff312a9b6","Type":"ContainerDied","Data":"4affc25dada5bfb7785755f3b2b81aead858ebbb458092ae3682efd511112a6e"} Mar 13 14:45:06 crc kubenswrapper[4898]: 
I0313 14:45:06.024515 4898 scope.go:117] "RemoveContainer" containerID="f1cc536f9a14da4dcfb4c53f9da5f50d2401fc70bf15eef66b76fa00896de54a" Mar 13 14:45:06 crc kubenswrapper[4898]: I0313 14:45:06.024525 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ncdbz" Mar 13 14:45:06 crc kubenswrapper[4898]: I0313 14:45:06.051820 4898 scope.go:117] "RemoveContainer" containerID="dd734696b7f03a85bec3d008a2f4e9e12a1c13aee061ccea952c8ba25d5240d4" Mar 13 14:45:06 crc kubenswrapper[4898]: I0313 14:45:06.081148 4898 scope.go:117] "RemoveContainer" containerID="9ff48a4e3b925be9aa523bb5c54c537c3e87325476999ae2aa5cff85960abc79" Mar 13 14:45:06 crc kubenswrapper[4898]: I0313 14:45:06.159630 4898 scope.go:117] "RemoveContainer" containerID="f1cc536f9a14da4dcfb4c53f9da5f50d2401fc70bf15eef66b76fa00896de54a" Mar 13 14:45:06 crc kubenswrapper[4898]: E0313 14:45:06.160288 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1cc536f9a14da4dcfb4c53f9da5f50d2401fc70bf15eef66b76fa00896de54a\": container with ID starting with f1cc536f9a14da4dcfb4c53f9da5f50d2401fc70bf15eef66b76fa00896de54a not found: ID does not exist" containerID="f1cc536f9a14da4dcfb4c53f9da5f50d2401fc70bf15eef66b76fa00896de54a" Mar 13 14:45:06 crc kubenswrapper[4898]: I0313 14:45:06.160316 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1cc536f9a14da4dcfb4c53f9da5f50d2401fc70bf15eef66b76fa00896de54a"} err="failed to get container status \"f1cc536f9a14da4dcfb4c53f9da5f50d2401fc70bf15eef66b76fa00896de54a\": rpc error: code = NotFound desc = could not find container \"f1cc536f9a14da4dcfb4c53f9da5f50d2401fc70bf15eef66b76fa00896de54a\": container with ID starting with f1cc536f9a14da4dcfb4c53f9da5f50d2401fc70bf15eef66b76fa00896de54a not found: ID does not exist" Mar 13 14:45:06 crc kubenswrapper[4898]: I0313 14:45:06.160336 4898 
scope.go:117] "RemoveContainer" containerID="dd734696b7f03a85bec3d008a2f4e9e12a1c13aee061ccea952c8ba25d5240d4" Mar 13 14:45:06 crc kubenswrapper[4898]: E0313 14:45:06.160813 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd734696b7f03a85bec3d008a2f4e9e12a1c13aee061ccea952c8ba25d5240d4\": container with ID starting with dd734696b7f03a85bec3d008a2f4e9e12a1c13aee061ccea952c8ba25d5240d4 not found: ID does not exist" containerID="dd734696b7f03a85bec3d008a2f4e9e12a1c13aee061ccea952c8ba25d5240d4" Mar 13 14:45:06 crc kubenswrapper[4898]: I0313 14:45:06.160880 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd734696b7f03a85bec3d008a2f4e9e12a1c13aee061ccea952c8ba25d5240d4"} err="failed to get container status \"dd734696b7f03a85bec3d008a2f4e9e12a1c13aee061ccea952c8ba25d5240d4\": rpc error: code = NotFound desc = could not find container \"dd734696b7f03a85bec3d008a2f4e9e12a1c13aee061ccea952c8ba25d5240d4\": container with ID starting with dd734696b7f03a85bec3d008a2f4e9e12a1c13aee061ccea952c8ba25d5240d4 not found: ID does not exist" Mar 13 14:45:06 crc kubenswrapper[4898]: I0313 14:45:06.160945 4898 scope.go:117] "RemoveContainer" containerID="9ff48a4e3b925be9aa523bb5c54c537c3e87325476999ae2aa5cff85960abc79" Mar 13 14:45:06 crc kubenswrapper[4898]: E0313 14:45:06.165409 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ff48a4e3b925be9aa523bb5c54c537c3e87325476999ae2aa5cff85960abc79\": container with ID starting with 9ff48a4e3b925be9aa523bb5c54c537c3e87325476999ae2aa5cff85960abc79 not found: ID does not exist" containerID="9ff48a4e3b925be9aa523bb5c54c537c3e87325476999ae2aa5cff85960abc79" Mar 13 14:45:06 crc kubenswrapper[4898]: I0313 14:45:06.165494 4898 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9ff48a4e3b925be9aa523bb5c54c537c3e87325476999ae2aa5cff85960abc79"} err="failed to get container status \"9ff48a4e3b925be9aa523bb5c54c537c3e87325476999ae2aa5cff85960abc79\": rpc error: code = NotFound desc = could not find container \"9ff48a4e3b925be9aa523bb5c54c537c3e87325476999ae2aa5cff85960abc79\": container with ID starting with 9ff48a4e3b925be9aa523bb5c54c537c3e87325476999ae2aa5cff85960abc79 not found: ID does not exist" Mar 13 14:45:06 crc kubenswrapper[4898]: I0313 14:45:06.205271 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a91e6fb-a1ff-4dec-a854-024ff312a9b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3a91e6fb-a1ff-4dec-a854-024ff312a9b6" (UID: "3a91e6fb-a1ff-4dec-a854-024ff312a9b6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:45:06 crc kubenswrapper[4898]: I0313 14:45:06.216822 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a91e6fb-a1ff-4dec-a854-024ff312a9b6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 14:45:06 crc kubenswrapper[4898]: I0313 14:45:06.376273 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ncdbz"] Mar 13 14:45:06 crc kubenswrapper[4898]: I0313 14:45:06.395091 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ncdbz"] Mar 13 14:45:07 crc kubenswrapper[4898]: I0313 14:45:07.767981 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a91e6fb-a1ff-4dec-a854-024ff312a9b6" path="/var/lib/kubelet/pods/3a91e6fb-a1ff-4dec-a854-024ff312a9b6/volumes" Mar 13 14:45:19 crc kubenswrapper[4898]: I0313 14:45:19.134831 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:45:19 crc kubenswrapper[4898]: I0313 14:45:19.135383 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:45:19 crc kubenswrapper[4898]: I0313 14:45:19.135432 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 14:45:19 crc kubenswrapper[4898]: I0313 14:45:19.136454 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5"} pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 14:45:19 crc kubenswrapper[4898]: I0313 14:45:19.136518 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" containerID="cri-o://3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" gracePeriod=600 Mar 13 14:45:19 crc kubenswrapper[4898]: E0313 14:45:19.282236 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:45:20 crc kubenswrapper[4898]: I0313 14:45:20.230177 4898 generic.go:334] "Generic (PLEG): container finished" podID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" exitCode=0 Mar 13 14:45:20 crc kubenswrapper[4898]: I0313 14:45:20.230415 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerDied","Data":"3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5"} Mar 13 14:45:20 crc kubenswrapper[4898]: I0313 14:45:20.230572 4898 scope.go:117] "RemoveContainer" containerID="610faaa089f666c18409ed2be816ac54810449ed44b96f98a2303a40ac9c9836" Mar 13 14:45:20 crc kubenswrapper[4898]: I0313 14:45:20.231686 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:45:20 crc kubenswrapper[4898]: E0313 14:45:20.232252 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:45:31 crc kubenswrapper[4898]: I0313 14:45:31.740293 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:45:31 crc kubenswrapper[4898]: E0313 14:45:31.743839 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:45:41 crc kubenswrapper[4898]: I0313 14:45:41.384418 4898 scope.go:117] "RemoveContainer" containerID="dac7072f1900557a02d6c49c5a63ec387e9ac4b9e0b548d071acd63216fda826" Mar 13 14:45:45 crc kubenswrapper[4898]: I0313 14:45:45.746807 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:45:45 crc kubenswrapper[4898]: E0313 14:45:45.747829 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:45:54 crc kubenswrapper[4898]: I0313 14:45:54.972328 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2rfzq"] Mar 13 14:45:54 crc kubenswrapper[4898]: E0313 14:45:54.973828 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a91e6fb-a1ff-4dec-a854-024ff312a9b6" containerName="extract-content" Mar 13 14:45:54 crc kubenswrapper[4898]: I0313 14:45:54.973850 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a91e6fb-a1ff-4dec-a854-024ff312a9b6" containerName="extract-content" Mar 13 14:45:54 crc kubenswrapper[4898]: E0313 14:45:54.973878 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1711d9ce-262c-4c6c-930a-4148e62fae9e" containerName="collect-profiles" Mar 13 14:45:54 crc kubenswrapper[4898]: I0313 14:45:54.973956 4898 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1711d9ce-262c-4c6c-930a-4148e62fae9e" containerName="collect-profiles" Mar 13 14:45:54 crc kubenswrapper[4898]: E0313 14:45:54.974007 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a91e6fb-a1ff-4dec-a854-024ff312a9b6" containerName="registry-server" Mar 13 14:45:54 crc kubenswrapper[4898]: I0313 14:45:54.974021 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a91e6fb-a1ff-4dec-a854-024ff312a9b6" containerName="registry-server" Mar 13 14:45:54 crc kubenswrapper[4898]: E0313 14:45:54.974051 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a91e6fb-a1ff-4dec-a854-024ff312a9b6" containerName="extract-utilities" Mar 13 14:45:54 crc kubenswrapper[4898]: I0313 14:45:54.974064 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a91e6fb-a1ff-4dec-a854-024ff312a9b6" containerName="extract-utilities" Mar 13 14:45:54 crc kubenswrapper[4898]: I0313 14:45:54.974504 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a91e6fb-a1ff-4dec-a854-024ff312a9b6" containerName="registry-server" Mar 13 14:45:54 crc kubenswrapper[4898]: I0313 14:45:54.974525 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="1711d9ce-262c-4c6c-930a-4148e62fae9e" containerName="collect-profiles" Mar 13 14:45:54 crc kubenswrapper[4898]: I0313 14:45:54.977723 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2rfzq" Mar 13 14:45:54 crc kubenswrapper[4898]: I0313 14:45:54.992019 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2rfzq"] Mar 13 14:45:55 crc kubenswrapper[4898]: I0313 14:45:55.084805 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95sdh\" (UniqueName: \"kubernetes.io/projected/c6f2443d-86d2-440c-8039-b04fb5eeeeb3-kube-api-access-95sdh\") pod \"certified-operators-2rfzq\" (UID: \"c6f2443d-86d2-440c-8039-b04fb5eeeeb3\") " pod="openshift-marketplace/certified-operators-2rfzq" Mar 13 14:45:55 crc kubenswrapper[4898]: I0313 14:45:55.085029 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6f2443d-86d2-440c-8039-b04fb5eeeeb3-catalog-content\") pod \"certified-operators-2rfzq\" (UID: \"c6f2443d-86d2-440c-8039-b04fb5eeeeb3\") " pod="openshift-marketplace/certified-operators-2rfzq" Mar 13 14:45:55 crc kubenswrapper[4898]: I0313 14:45:55.085080 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6f2443d-86d2-440c-8039-b04fb5eeeeb3-utilities\") pod \"certified-operators-2rfzq\" (UID: \"c6f2443d-86d2-440c-8039-b04fb5eeeeb3\") " pod="openshift-marketplace/certified-operators-2rfzq" Mar 13 14:45:55 crc kubenswrapper[4898]: I0313 14:45:55.187722 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6f2443d-86d2-440c-8039-b04fb5eeeeb3-catalog-content\") pod \"certified-operators-2rfzq\" (UID: \"c6f2443d-86d2-440c-8039-b04fb5eeeeb3\") " pod="openshift-marketplace/certified-operators-2rfzq" Mar 13 14:45:55 crc kubenswrapper[4898]: I0313 14:45:55.187799 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6f2443d-86d2-440c-8039-b04fb5eeeeb3-utilities\") pod \"certified-operators-2rfzq\" (UID: \"c6f2443d-86d2-440c-8039-b04fb5eeeeb3\") " pod="openshift-marketplace/certified-operators-2rfzq" Mar 13 14:45:55 crc kubenswrapper[4898]: I0313 14:45:55.187982 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95sdh\" (UniqueName: \"kubernetes.io/projected/c6f2443d-86d2-440c-8039-b04fb5eeeeb3-kube-api-access-95sdh\") pod \"certified-operators-2rfzq\" (UID: \"c6f2443d-86d2-440c-8039-b04fb5eeeeb3\") " pod="openshift-marketplace/certified-operators-2rfzq" Mar 13 14:45:55 crc kubenswrapper[4898]: I0313 14:45:55.188310 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6f2443d-86d2-440c-8039-b04fb5eeeeb3-catalog-content\") pod \"certified-operators-2rfzq\" (UID: \"c6f2443d-86d2-440c-8039-b04fb5eeeeb3\") " pod="openshift-marketplace/certified-operators-2rfzq" Mar 13 14:45:55 crc kubenswrapper[4898]: I0313 14:45:55.188480 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6f2443d-86d2-440c-8039-b04fb5eeeeb3-utilities\") pod \"certified-operators-2rfzq\" (UID: \"c6f2443d-86d2-440c-8039-b04fb5eeeeb3\") " pod="openshift-marketplace/certified-operators-2rfzq" Mar 13 14:45:55 crc kubenswrapper[4898]: I0313 14:45:55.211992 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95sdh\" (UniqueName: \"kubernetes.io/projected/c6f2443d-86d2-440c-8039-b04fb5eeeeb3-kube-api-access-95sdh\") pod \"certified-operators-2rfzq\" (UID: \"c6f2443d-86d2-440c-8039-b04fb5eeeeb3\") " pod="openshift-marketplace/certified-operators-2rfzq" Mar 13 14:45:55 crc kubenswrapper[4898]: I0313 14:45:55.317457 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2rfzq" Mar 13 14:45:55 crc kubenswrapper[4898]: I0313 14:45:55.831514 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2rfzq"] Mar 13 14:45:56 crc kubenswrapper[4898]: I0313 14:45:56.774055 4898 generic.go:334] "Generic (PLEG): container finished" podID="c6f2443d-86d2-440c-8039-b04fb5eeeeb3" containerID="77f0e252c7f4f4d769c583e3fab1155192864e78322af8bd2a7b2a9258102547" exitCode=0 Mar 13 14:45:56 crc kubenswrapper[4898]: I0313 14:45:56.774128 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2rfzq" event={"ID":"c6f2443d-86d2-440c-8039-b04fb5eeeeb3","Type":"ContainerDied","Data":"77f0e252c7f4f4d769c583e3fab1155192864e78322af8bd2a7b2a9258102547"} Mar 13 14:45:56 crc kubenswrapper[4898]: I0313 14:45:56.774510 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2rfzq" event={"ID":"c6f2443d-86d2-440c-8039-b04fb5eeeeb3","Type":"ContainerStarted","Data":"90b947e908accbf4ac70479773da53012e339d540d41d5d9eb2140bc0236f4bf"} Mar 13 14:45:58 crc kubenswrapper[4898]: I0313 14:45:58.740117 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:45:58 crc kubenswrapper[4898]: E0313 14:45:58.741259 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:45:58 crc kubenswrapper[4898]: I0313 14:45:58.814119 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2rfzq" 
event={"ID":"c6f2443d-86d2-440c-8039-b04fb5eeeeb3","Type":"ContainerStarted","Data":"782ae3224eb428acf74866a5aa992fcc458d6183436c34435e21bcbf134ec057"} Mar 13 14:45:59 crc kubenswrapper[4898]: I0313 14:45:59.829379 4898 generic.go:334] "Generic (PLEG): container finished" podID="c6f2443d-86d2-440c-8039-b04fb5eeeeb3" containerID="782ae3224eb428acf74866a5aa992fcc458d6183436c34435e21bcbf134ec057" exitCode=0 Mar 13 14:45:59 crc kubenswrapper[4898]: I0313 14:45:59.829472 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2rfzq" event={"ID":"c6f2443d-86d2-440c-8039-b04fb5eeeeb3","Type":"ContainerDied","Data":"782ae3224eb428acf74866a5aa992fcc458d6183436c34435e21bcbf134ec057"} Mar 13 14:46:00 crc kubenswrapper[4898]: I0313 14:46:00.161309 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556886-vxgm7"] Mar 13 14:46:00 crc kubenswrapper[4898]: I0313 14:46:00.164847 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556886-vxgm7" Mar 13 14:46:00 crc kubenswrapper[4898]: I0313 14:46:00.169632 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:46:00 crc kubenswrapper[4898]: I0313 14:46:00.171683 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:46:00 crc kubenswrapper[4898]: I0313 14:46:00.174442 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:46:00 crc kubenswrapper[4898]: I0313 14:46:00.185065 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556886-vxgm7"] Mar 13 14:46:00 crc kubenswrapper[4898]: I0313 14:46:00.250537 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gklq7\" (UniqueName: \"kubernetes.io/projected/7e6f3996-1b26-4a53-8c2d-f74aa89ef944-kube-api-access-gklq7\") pod \"auto-csr-approver-29556886-vxgm7\" (UID: \"7e6f3996-1b26-4a53-8c2d-f74aa89ef944\") " pod="openshift-infra/auto-csr-approver-29556886-vxgm7" Mar 13 14:46:00 crc kubenswrapper[4898]: I0313 14:46:00.353234 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gklq7\" (UniqueName: \"kubernetes.io/projected/7e6f3996-1b26-4a53-8c2d-f74aa89ef944-kube-api-access-gklq7\") pod \"auto-csr-approver-29556886-vxgm7\" (UID: \"7e6f3996-1b26-4a53-8c2d-f74aa89ef944\") " pod="openshift-infra/auto-csr-approver-29556886-vxgm7" Mar 13 14:46:00 crc kubenswrapper[4898]: I0313 14:46:00.374477 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gklq7\" (UniqueName: \"kubernetes.io/projected/7e6f3996-1b26-4a53-8c2d-f74aa89ef944-kube-api-access-gklq7\") pod \"auto-csr-approver-29556886-vxgm7\" (UID: \"7e6f3996-1b26-4a53-8c2d-f74aa89ef944\") " 
pod="openshift-infra/auto-csr-approver-29556886-vxgm7" Mar 13 14:46:00 crc kubenswrapper[4898]: I0313 14:46:00.501946 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556886-vxgm7" Mar 13 14:46:00 crc kubenswrapper[4898]: I0313 14:46:00.841042 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2rfzq" event={"ID":"c6f2443d-86d2-440c-8039-b04fb5eeeeb3","Type":"ContainerStarted","Data":"fe9f6664ec90867b4d3d9688a1bf7932bcc8ee4c22fa6c8ca8f930fb37179b6d"} Mar 13 14:46:00 crc kubenswrapper[4898]: I0313 14:46:00.865034 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2rfzq" podStartSLOduration=3.359263488 podStartE2EDuration="6.865015662s" podCreationTimestamp="2026-03-13 14:45:54 +0000 UTC" firstStartedPulling="2026-03-13 14:45:56.77751347 +0000 UTC m=+2991.779101719" lastFinishedPulling="2026-03-13 14:46:00.283265644 +0000 UTC m=+2995.284853893" observedRunningTime="2026-03-13 14:46:00.856861502 +0000 UTC m=+2995.858449741" watchObservedRunningTime="2026-03-13 14:46:00.865015662 +0000 UTC m=+2995.866603901" Mar 13 14:46:01 crc kubenswrapper[4898]: I0313 14:46:01.017678 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556886-vxgm7"] Mar 13 14:46:01 crc kubenswrapper[4898]: I0313 14:46:01.858650 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556886-vxgm7" event={"ID":"7e6f3996-1b26-4a53-8c2d-f74aa89ef944","Type":"ContainerStarted","Data":"2babf4d42215345ab890f111f267fb619ebc479d5a10a4bdbcb5c63efa902834"} Mar 13 14:46:02 crc kubenswrapper[4898]: I0313 14:46:02.872333 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556886-vxgm7" 
event={"ID":"7e6f3996-1b26-4a53-8c2d-f74aa89ef944","Type":"ContainerStarted","Data":"c50d215e83c79e2c4ba98ed3d21204bd33821f4c3e54e5173055aa94bae3e42e"} Mar 13 14:46:02 crc kubenswrapper[4898]: I0313 14:46:02.915237 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556886-vxgm7" podStartSLOduration=1.592971773 podStartE2EDuration="2.915217128s" podCreationTimestamp="2026-03-13 14:46:00 +0000 UTC" firstStartedPulling="2026-03-13 14:46:01.019800065 +0000 UTC m=+2996.021388304" lastFinishedPulling="2026-03-13 14:46:02.34204539 +0000 UTC m=+2997.343633659" observedRunningTime="2026-03-13 14:46:02.892771265 +0000 UTC m=+2997.894359524" watchObservedRunningTime="2026-03-13 14:46:02.915217128 +0000 UTC m=+2997.916805377" Mar 13 14:46:03 crc kubenswrapper[4898]: I0313 14:46:03.884184 4898 generic.go:334] "Generic (PLEG): container finished" podID="7e6f3996-1b26-4a53-8c2d-f74aa89ef944" containerID="c50d215e83c79e2c4ba98ed3d21204bd33821f4c3e54e5173055aa94bae3e42e" exitCode=0 Mar 13 14:46:03 crc kubenswrapper[4898]: I0313 14:46:03.884255 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556886-vxgm7" event={"ID":"7e6f3996-1b26-4a53-8c2d-f74aa89ef944","Type":"ContainerDied","Data":"c50d215e83c79e2c4ba98ed3d21204bd33821f4c3e54e5173055aa94bae3e42e"} Mar 13 14:46:05 crc kubenswrapper[4898]: I0313 14:46:05.320033 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2rfzq" Mar 13 14:46:05 crc kubenswrapper[4898]: I0313 14:46:05.321367 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2rfzq" Mar 13 14:46:05 crc kubenswrapper[4898]: I0313 14:46:05.428667 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2rfzq" Mar 13 14:46:05 crc kubenswrapper[4898]: I0313 14:46:05.697805 
4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556886-vxgm7" Mar 13 14:46:05 crc kubenswrapper[4898]: I0313 14:46:05.805141 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gklq7\" (UniqueName: \"kubernetes.io/projected/7e6f3996-1b26-4a53-8c2d-f74aa89ef944-kube-api-access-gklq7\") pod \"7e6f3996-1b26-4a53-8c2d-f74aa89ef944\" (UID: \"7e6f3996-1b26-4a53-8c2d-f74aa89ef944\") " Mar 13 14:46:05 crc kubenswrapper[4898]: I0313 14:46:05.810962 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e6f3996-1b26-4a53-8c2d-f74aa89ef944-kube-api-access-gklq7" (OuterVolumeSpecName: "kube-api-access-gklq7") pod "7e6f3996-1b26-4a53-8c2d-f74aa89ef944" (UID: "7e6f3996-1b26-4a53-8c2d-f74aa89ef944"). InnerVolumeSpecName "kube-api-access-gklq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:46:05 crc kubenswrapper[4898]: I0313 14:46:05.910137 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gklq7\" (UniqueName: \"kubernetes.io/projected/7e6f3996-1b26-4a53-8c2d-f74aa89ef944-kube-api-access-gklq7\") on node \"crc\" DevicePath \"\"" Mar 13 14:46:05 crc kubenswrapper[4898]: I0313 14:46:05.916639 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556886-vxgm7" event={"ID":"7e6f3996-1b26-4a53-8c2d-f74aa89ef944","Type":"ContainerDied","Data":"2babf4d42215345ab890f111f267fb619ebc479d5a10a4bdbcb5c63efa902834"} Mar 13 14:46:05 crc kubenswrapper[4898]: I0313 14:46:05.916710 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2babf4d42215345ab890f111f267fb619ebc479d5a10a4bdbcb5c63efa902834" Mar 13 14:46:05 crc kubenswrapper[4898]: I0313 14:46:05.917013 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556886-vxgm7" Mar 13 14:46:05 crc kubenswrapper[4898]: I0313 14:46:05.964134 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556880-2rlcl"] Mar 13 14:46:05 crc kubenswrapper[4898]: I0313 14:46:05.975563 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2rfzq" Mar 13 14:46:06 crc kubenswrapper[4898]: I0313 14:46:06.004154 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556880-2rlcl"] Mar 13 14:46:06 crc kubenswrapper[4898]: I0313 14:46:06.029189 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2rfzq"] Mar 13 14:46:07 crc kubenswrapper[4898]: I0313 14:46:07.768668 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23d4e2ed-7457-458a-9c76-dcf8f3aadd99" path="/var/lib/kubelet/pods/23d4e2ed-7457-458a-9c76-dcf8f3aadd99/volumes" Mar 13 14:46:07 crc kubenswrapper[4898]: I0313 14:46:07.941132 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2rfzq" podUID="c6f2443d-86d2-440c-8039-b04fb5eeeeb3" containerName="registry-server" containerID="cri-o://fe9f6664ec90867b4d3d9688a1bf7932bcc8ee4c22fa6c8ca8f930fb37179b6d" gracePeriod=2 Mar 13 14:46:08 crc kubenswrapper[4898]: I0313 14:46:08.569402 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2rfzq" Mar 13 14:46:08 crc kubenswrapper[4898]: I0313 14:46:08.686693 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6f2443d-86d2-440c-8039-b04fb5eeeeb3-catalog-content\") pod \"c6f2443d-86d2-440c-8039-b04fb5eeeeb3\" (UID: \"c6f2443d-86d2-440c-8039-b04fb5eeeeb3\") " Mar 13 14:46:08 crc kubenswrapper[4898]: I0313 14:46:08.686775 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6f2443d-86d2-440c-8039-b04fb5eeeeb3-utilities\") pod \"c6f2443d-86d2-440c-8039-b04fb5eeeeb3\" (UID: \"c6f2443d-86d2-440c-8039-b04fb5eeeeb3\") " Mar 13 14:46:08 crc kubenswrapper[4898]: I0313 14:46:08.686800 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95sdh\" (UniqueName: \"kubernetes.io/projected/c6f2443d-86d2-440c-8039-b04fb5eeeeb3-kube-api-access-95sdh\") pod \"c6f2443d-86d2-440c-8039-b04fb5eeeeb3\" (UID: \"c6f2443d-86d2-440c-8039-b04fb5eeeeb3\") " Mar 13 14:46:08 crc kubenswrapper[4898]: I0313 14:46:08.687784 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6f2443d-86d2-440c-8039-b04fb5eeeeb3-utilities" (OuterVolumeSpecName: "utilities") pod "c6f2443d-86d2-440c-8039-b04fb5eeeeb3" (UID: "c6f2443d-86d2-440c-8039-b04fb5eeeeb3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:46:08 crc kubenswrapper[4898]: I0313 14:46:08.693262 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6f2443d-86d2-440c-8039-b04fb5eeeeb3-kube-api-access-95sdh" (OuterVolumeSpecName: "kube-api-access-95sdh") pod "c6f2443d-86d2-440c-8039-b04fb5eeeeb3" (UID: "c6f2443d-86d2-440c-8039-b04fb5eeeeb3"). InnerVolumeSpecName "kube-api-access-95sdh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:46:08 crc kubenswrapper[4898]: I0313 14:46:08.771116 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6f2443d-86d2-440c-8039-b04fb5eeeeb3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c6f2443d-86d2-440c-8039-b04fb5eeeeb3" (UID: "c6f2443d-86d2-440c-8039-b04fb5eeeeb3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:46:08 crc kubenswrapper[4898]: I0313 14:46:08.789375 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6f2443d-86d2-440c-8039-b04fb5eeeeb3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 14:46:08 crc kubenswrapper[4898]: I0313 14:46:08.789402 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6f2443d-86d2-440c-8039-b04fb5eeeeb3-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 14:46:08 crc kubenswrapper[4898]: I0313 14:46:08.789412 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95sdh\" (UniqueName: \"kubernetes.io/projected/c6f2443d-86d2-440c-8039-b04fb5eeeeb3-kube-api-access-95sdh\") on node \"crc\" DevicePath \"\"" Mar 13 14:46:08 crc kubenswrapper[4898]: I0313 14:46:08.960117 4898 generic.go:334] "Generic (PLEG): container finished" podID="c6f2443d-86d2-440c-8039-b04fb5eeeeb3" containerID="fe9f6664ec90867b4d3d9688a1bf7932bcc8ee4c22fa6c8ca8f930fb37179b6d" exitCode=0 Mar 13 14:46:08 crc kubenswrapper[4898]: I0313 14:46:08.960220 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2rfzq" Mar 13 14:46:08 crc kubenswrapper[4898]: I0313 14:46:08.960212 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2rfzq" event={"ID":"c6f2443d-86d2-440c-8039-b04fb5eeeeb3","Type":"ContainerDied","Data":"fe9f6664ec90867b4d3d9688a1bf7932bcc8ee4c22fa6c8ca8f930fb37179b6d"} Mar 13 14:46:08 crc kubenswrapper[4898]: I0313 14:46:08.960512 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2rfzq" event={"ID":"c6f2443d-86d2-440c-8039-b04fb5eeeeb3","Type":"ContainerDied","Data":"90b947e908accbf4ac70479773da53012e339d540d41d5d9eb2140bc0236f4bf"} Mar 13 14:46:08 crc kubenswrapper[4898]: I0313 14:46:08.960544 4898 scope.go:117] "RemoveContainer" containerID="fe9f6664ec90867b4d3d9688a1bf7932bcc8ee4c22fa6c8ca8f930fb37179b6d" Mar 13 14:46:08 crc kubenswrapper[4898]: I0313 14:46:08.999832 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2rfzq"] Mar 13 14:46:09 crc kubenswrapper[4898]: I0313 14:46:08.999980 4898 scope.go:117] "RemoveContainer" containerID="782ae3224eb428acf74866a5aa992fcc458d6183436c34435e21bcbf134ec057" Mar 13 14:46:09 crc kubenswrapper[4898]: I0313 14:46:09.012190 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2rfzq"] Mar 13 14:46:09 crc kubenswrapper[4898]: I0313 14:46:09.037367 4898 scope.go:117] "RemoveContainer" containerID="77f0e252c7f4f4d769c583e3fab1155192864e78322af8bd2a7b2a9258102547" Mar 13 14:46:09 crc kubenswrapper[4898]: I0313 14:46:09.132303 4898 scope.go:117] "RemoveContainer" containerID="fe9f6664ec90867b4d3d9688a1bf7932bcc8ee4c22fa6c8ca8f930fb37179b6d" Mar 13 14:46:09 crc kubenswrapper[4898]: E0313 14:46:09.132838 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"fe9f6664ec90867b4d3d9688a1bf7932bcc8ee4c22fa6c8ca8f930fb37179b6d\": container with ID starting with fe9f6664ec90867b4d3d9688a1bf7932bcc8ee4c22fa6c8ca8f930fb37179b6d not found: ID does not exist" containerID="fe9f6664ec90867b4d3d9688a1bf7932bcc8ee4c22fa6c8ca8f930fb37179b6d" Mar 13 14:46:09 crc kubenswrapper[4898]: I0313 14:46:09.132877 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe9f6664ec90867b4d3d9688a1bf7932bcc8ee4c22fa6c8ca8f930fb37179b6d"} err="failed to get container status \"fe9f6664ec90867b4d3d9688a1bf7932bcc8ee4c22fa6c8ca8f930fb37179b6d\": rpc error: code = NotFound desc = could not find container \"fe9f6664ec90867b4d3d9688a1bf7932bcc8ee4c22fa6c8ca8f930fb37179b6d\": container with ID starting with fe9f6664ec90867b4d3d9688a1bf7932bcc8ee4c22fa6c8ca8f930fb37179b6d not found: ID does not exist" Mar 13 14:46:09 crc kubenswrapper[4898]: I0313 14:46:09.132918 4898 scope.go:117] "RemoveContainer" containerID="782ae3224eb428acf74866a5aa992fcc458d6183436c34435e21bcbf134ec057" Mar 13 14:46:09 crc kubenswrapper[4898]: E0313 14:46:09.133202 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"782ae3224eb428acf74866a5aa992fcc458d6183436c34435e21bcbf134ec057\": container with ID starting with 782ae3224eb428acf74866a5aa992fcc458d6183436c34435e21bcbf134ec057 not found: ID does not exist" containerID="782ae3224eb428acf74866a5aa992fcc458d6183436c34435e21bcbf134ec057" Mar 13 14:46:09 crc kubenswrapper[4898]: I0313 14:46:09.133232 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"782ae3224eb428acf74866a5aa992fcc458d6183436c34435e21bcbf134ec057"} err="failed to get container status \"782ae3224eb428acf74866a5aa992fcc458d6183436c34435e21bcbf134ec057\": rpc error: code = NotFound desc = could not find container \"782ae3224eb428acf74866a5aa992fcc458d6183436c34435e21bcbf134ec057\": container with ID 
starting with 782ae3224eb428acf74866a5aa992fcc458d6183436c34435e21bcbf134ec057 not found: ID does not exist" Mar 13 14:46:09 crc kubenswrapper[4898]: I0313 14:46:09.133249 4898 scope.go:117] "RemoveContainer" containerID="77f0e252c7f4f4d769c583e3fab1155192864e78322af8bd2a7b2a9258102547" Mar 13 14:46:09 crc kubenswrapper[4898]: E0313 14:46:09.133559 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77f0e252c7f4f4d769c583e3fab1155192864e78322af8bd2a7b2a9258102547\": container with ID starting with 77f0e252c7f4f4d769c583e3fab1155192864e78322af8bd2a7b2a9258102547 not found: ID does not exist" containerID="77f0e252c7f4f4d769c583e3fab1155192864e78322af8bd2a7b2a9258102547" Mar 13 14:46:09 crc kubenswrapper[4898]: I0313 14:46:09.133587 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77f0e252c7f4f4d769c583e3fab1155192864e78322af8bd2a7b2a9258102547"} err="failed to get container status \"77f0e252c7f4f4d769c583e3fab1155192864e78322af8bd2a7b2a9258102547\": rpc error: code = NotFound desc = could not find container \"77f0e252c7f4f4d769c583e3fab1155192864e78322af8bd2a7b2a9258102547\": container with ID starting with 77f0e252c7f4f4d769c583e3fab1155192864e78322af8bd2a7b2a9258102547 not found: ID does not exist" Mar 13 14:46:09 crc kubenswrapper[4898]: I0313 14:46:09.766805 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6f2443d-86d2-440c-8039-b04fb5eeeeb3" path="/var/lib/kubelet/pods/c6f2443d-86d2-440c-8039-b04fb5eeeeb3/volumes" Mar 13 14:46:12 crc kubenswrapper[4898]: I0313 14:46:12.741244 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:46:12 crc kubenswrapper[4898]: E0313 14:46:12.742237 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:46:25 crc kubenswrapper[4898]: I0313 14:46:25.740503 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:46:25 crc kubenswrapper[4898]: E0313 14:46:25.742235 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:46:40 crc kubenswrapper[4898]: I0313 14:46:40.740154 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:46:40 crc kubenswrapper[4898]: E0313 14:46:40.740941 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:46:41 crc kubenswrapper[4898]: I0313 14:46:41.527386 4898 scope.go:117] "RemoveContainer" containerID="a4b672dd5f62f7db5f72a2ba461417e4b17e1ad5affad388a08bfa992e5aa45e" Mar 13 14:46:51 crc kubenswrapper[4898]: I0313 14:46:51.740513 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:46:51 crc 
kubenswrapper[4898]: E0313 14:46:51.741565 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:46:58 crc kubenswrapper[4898]: I0313 14:46:58.678388 4898 generic.go:334] "Generic (PLEG): container finished" podID="9a62fd58-a586-4473-abfe-4e227cad9900" containerID="b68526abfbbd65229d0fa636d4985f81e76151523259cd0cea1b6513d33ed080" exitCode=0 Mar 13 14:46:58 crc kubenswrapper[4898]: I0313 14:46:58.678823 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" event={"ID":"9a62fd58-a586-4473-abfe-4e227cad9900","Type":"ContainerDied","Data":"b68526abfbbd65229d0fa636d4985f81e76151523259cd0cea1b6513d33ed080"} Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.336566 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.438383 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-inventory\") pod \"9a62fd58-a586-4473-abfe-4e227cad9900\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.438722 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ssh-key-openstack-edpm-ipam\") pod \"9a62fd58-a586-4473-abfe-4e227cad9900\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.438766 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ceilometer-compute-config-data-1\") pod \"9a62fd58-a586-4473-abfe-4e227cad9900\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.438829 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-telemetry-combined-ca-bundle\") pod \"9a62fd58-a586-4473-abfe-4e227cad9900\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.438958 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ceilometer-compute-config-data-0\") pod \"9a62fd58-a586-4473-abfe-4e227cad9900\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " Mar 13 14:47:00 crc 
kubenswrapper[4898]: I0313 14:47:00.438987 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmwct\" (UniqueName: \"kubernetes.io/projected/9a62fd58-a586-4473-abfe-4e227cad9900-kube-api-access-vmwct\") pod \"9a62fd58-a586-4473-abfe-4e227cad9900\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.439076 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ceilometer-compute-config-data-2\") pod \"9a62fd58-a586-4473-abfe-4e227cad9900\" (UID: \"9a62fd58-a586-4473-abfe-4e227cad9900\") " Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.444184 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "9a62fd58-a586-4473-abfe-4e227cad9900" (UID: "9a62fd58-a586-4473-abfe-4e227cad9900"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.446829 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a62fd58-a586-4473-abfe-4e227cad9900-kube-api-access-vmwct" (OuterVolumeSpecName: "kube-api-access-vmwct") pod "9a62fd58-a586-4473-abfe-4e227cad9900" (UID: "9a62fd58-a586-4473-abfe-4e227cad9900"). InnerVolumeSpecName "kube-api-access-vmwct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.474237 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "9a62fd58-a586-4473-abfe-4e227cad9900" (UID: "9a62fd58-a586-4473-abfe-4e227cad9900"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.476707 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-inventory" (OuterVolumeSpecName: "inventory") pod "9a62fd58-a586-4473-abfe-4e227cad9900" (UID: "9a62fd58-a586-4473-abfe-4e227cad9900"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.476781 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9a62fd58-a586-4473-abfe-4e227cad9900" (UID: "9a62fd58-a586-4473-abfe-4e227cad9900"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.500551 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "9a62fd58-a586-4473-abfe-4e227cad9900" (UID: "9a62fd58-a586-4473-abfe-4e227cad9900"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.510663 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "9a62fd58-a586-4473-abfe-4e227cad9900" (UID: "9a62fd58-a586-4473-abfe-4e227cad9900"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.542416 4898 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.542451 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmwct\" (UniqueName: \"kubernetes.io/projected/9a62fd58-a586-4473-abfe-4e227cad9900-kube-api-access-vmwct\") on node \"crc\" DevicePath \"\"" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.542463 4898 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.542474 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.542484 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 14:47:00 crc 
kubenswrapper[4898]: I0313 14:47:00.542493 4898 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.542503 4898 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a62fd58-a586-4473-abfe-4e227cad9900-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.708454 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" event={"ID":"9a62fd58-a586-4473-abfe-4e227cad9900","Type":"ContainerDied","Data":"eb475c12bddce2c97f158483e36ec6344049764d71c6a03112569a01e066938c"} Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.708511 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb475c12bddce2c97f158483e36ec6344049764d71c6a03112569a01e066938c" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.708607 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.837195 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6"] Mar 13 14:47:00 crc kubenswrapper[4898]: E0313 14:47:00.837739 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e6f3996-1b26-4a53-8c2d-f74aa89ef944" containerName="oc" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.837755 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e6f3996-1b26-4a53-8c2d-f74aa89ef944" containerName="oc" Mar 13 14:47:00 crc kubenswrapper[4898]: E0313 14:47:00.837772 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6f2443d-86d2-440c-8039-b04fb5eeeeb3" containerName="registry-server" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.837779 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6f2443d-86d2-440c-8039-b04fb5eeeeb3" containerName="registry-server" Mar 13 14:47:00 crc kubenswrapper[4898]: E0313 14:47:00.837795 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6f2443d-86d2-440c-8039-b04fb5eeeeb3" containerName="extract-utilities" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.837802 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6f2443d-86d2-440c-8039-b04fb5eeeeb3" containerName="extract-utilities" Mar 13 14:47:00 crc kubenswrapper[4898]: E0313 14:47:00.837819 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6f2443d-86d2-440c-8039-b04fb5eeeeb3" containerName="extract-content" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.837825 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6f2443d-86d2-440c-8039-b04fb5eeeeb3" containerName="extract-content" Mar 13 14:47:00 crc kubenswrapper[4898]: E0313 14:47:00.837849 4898 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9a62fd58-a586-4473-abfe-4e227cad9900" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.837857 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a62fd58-a586-4473-abfe-4e227cad9900" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.838114 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e6f3996-1b26-4a53-8c2d-f74aa89ef944" containerName="oc" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.838149 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a62fd58-a586-4473-abfe-4e227cad9900" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.838167 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6f2443d-86d2-440c-8039-b04fb5eeeeb3" containerName="registry-server" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.838984 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.841804 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.842152 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zsddr" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.842376 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-ipmi-config-data" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.843394 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.844063 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.847896 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6"] Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.963694 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.963981 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-inventory\") pod 
\"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.964561 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ssh-key-openstack-edpm-ipam\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.964627 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg5fp\" (UniqueName: \"kubernetes.io/projected/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-kube-api-access-lg5fp\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.964666 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.964745 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ceilometer-ipmi-config-data-2\") 
pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:00 crc kubenswrapper[4898]: I0313 14:47:00.964927 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:01 crc kubenswrapper[4898]: I0313 14:47:01.067000 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:01 crc kubenswrapper[4898]: I0313 14:47:01.067090 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:01 crc kubenswrapper[4898]: I0313 14:47:01.067143 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-inventory\") pod 
\"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:01 crc kubenswrapper[4898]: I0313 14:47:01.067292 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ssh-key-openstack-edpm-ipam\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:01 crc kubenswrapper[4898]: I0313 14:47:01.067324 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lg5fp\" (UniqueName: \"kubernetes.io/projected/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-kube-api-access-lg5fp\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:01 crc kubenswrapper[4898]: I0313 14:47:01.067350 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:01 crc kubenswrapper[4898]: I0313 14:47:01.067390 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6\" 
(UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:01 crc kubenswrapper[4898]: I0313 14:47:01.072757 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:01 crc kubenswrapper[4898]: I0313 14:47:01.073195 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:01 crc kubenswrapper[4898]: I0313 14:47:01.073401 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:01 crc kubenswrapper[4898]: I0313 14:47:01.074522 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " 
pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:01 crc kubenswrapper[4898]: I0313 14:47:01.074607 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:01 crc kubenswrapper[4898]: I0313 14:47:01.074687 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ssh-key-openstack-edpm-ipam\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:01 crc kubenswrapper[4898]: I0313 14:47:01.093881 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg5fp\" (UniqueName: \"kubernetes.io/projected/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-kube-api-access-lg5fp\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:01 crc kubenswrapper[4898]: I0313 14:47:01.169666 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:47:01 crc kubenswrapper[4898]: I0313 14:47:01.926674 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6"] Mar 13 14:47:02 crc kubenswrapper[4898]: I0313 14:47:02.736013 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" event={"ID":"5139c85e-1d3d-4fe7-94aa-8efde03b43e0","Type":"ContainerStarted","Data":"3f68a521805ceaebc4381fbbd273e0827e9decfdbc53a23464118ffd0aba9594"} Mar 13 14:47:02 crc kubenswrapper[4898]: I0313 14:47:02.739427 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:47:02 crc kubenswrapper[4898]: E0313 14:47:02.739807 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:47:03 crc kubenswrapper[4898]: I0313 14:47:03.785061 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" event={"ID":"5139c85e-1d3d-4fe7-94aa-8efde03b43e0","Type":"ContainerStarted","Data":"fa89130040e3f48f6b09e015edf3ef67fd27d3f545d629365a77745abc0aef24"} Mar 13 14:47:03 crc kubenswrapper[4898]: I0313 14:47:03.812479 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" podStartSLOduration=3.2589876 podStartE2EDuration="3.812457792s" podCreationTimestamp="2026-03-13 14:47:00 
+0000 UTC" firstStartedPulling="2026-03-13 14:47:01.9215601 +0000 UTC m=+3056.923148379" lastFinishedPulling="2026-03-13 14:47:02.475030292 +0000 UTC m=+3057.476618571" observedRunningTime="2026-03-13 14:47:03.798810435 +0000 UTC m=+3058.800398684" watchObservedRunningTime="2026-03-13 14:47:03.812457792 +0000 UTC m=+3058.814046041" Mar 13 14:47:14 crc kubenswrapper[4898]: I0313 14:47:14.740096 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:47:14 crc kubenswrapper[4898]: E0313 14:47:14.741220 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:47:27 crc kubenswrapper[4898]: I0313 14:47:27.763353 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:47:27 crc kubenswrapper[4898]: E0313 14:47:27.764614 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:47:38 crc kubenswrapper[4898]: I0313 14:47:38.741091 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:47:38 crc kubenswrapper[4898]: E0313 14:47:38.743856 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:47:51 crc kubenswrapper[4898]: I0313 14:47:51.740600 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:47:51 crc kubenswrapper[4898]: E0313 14:47:51.741763 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:48:00 crc kubenswrapper[4898]: I0313 14:48:00.168721 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556888-gn68v"] Mar 13 14:48:00 crc kubenswrapper[4898]: I0313 14:48:00.174293 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556888-gn68v" Mar 13 14:48:00 crc kubenswrapper[4898]: I0313 14:48:00.183292 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556888-gn68v"] Mar 13 14:48:00 crc kubenswrapper[4898]: I0313 14:48:00.183680 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:48:00 crc kubenswrapper[4898]: I0313 14:48:00.183933 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:48:00 crc kubenswrapper[4898]: I0313 14:48:00.184079 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:48:00 crc kubenswrapper[4898]: I0313 14:48:00.256473 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh2vk\" (UniqueName: \"kubernetes.io/projected/2813f8b3-81f9-48a3-9a55-173dead5d7a7-kube-api-access-dh2vk\") pod \"auto-csr-approver-29556888-gn68v\" (UID: \"2813f8b3-81f9-48a3-9a55-173dead5d7a7\") " pod="openshift-infra/auto-csr-approver-29556888-gn68v" Mar 13 14:48:00 crc kubenswrapper[4898]: I0313 14:48:00.360049 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh2vk\" (UniqueName: \"kubernetes.io/projected/2813f8b3-81f9-48a3-9a55-173dead5d7a7-kube-api-access-dh2vk\") pod \"auto-csr-approver-29556888-gn68v\" (UID: \"2813f8b3-81f9-48a3-9a55-173dead5d7a7\") " pod="openshift-infra/auto-csr-approver-29556888-gn68v" Mar 13 14:48:00 crc kubenswrapper[4898]: I0313 14:48:00.380427 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh2vk\" (UniqueName: \"kubernetes.io/projected/2813f8b3-81f9-48a3-9a55-173dead5d7a7-kube-api-access-dh2vk\") pod \"auto-csr-approver-29556888-gn68v\" (UID: \"2813f8b3-81f9-48a3-9a55-173dead5d7a7\") " 
pod="openshift-infra/auto-csr-approver-29556888-gn68v" Mar 13 14:48:00 crc kubenswrapper[4898]: I0313 14:48:00.504824 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556888-gn68v" Mar 13 14:48:01 crc kubenswrapper[4898]: I0313 14:48:01.008438 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556888-gn68v"] Mar 13 14:48:01 crc kubenswrapper[4898]: I0313 14:48:01.484655 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556888-gn68v" event={"ID":"2813f8b3-81f9-48a3-9a55-173dead5d7a7","Type":"ContainerStarted","Data":"6c71d25075e5d652a353986d1a9cd606119869eb24adf85caa486d10c30b2418"} Mar 13 14:48:03 crc kubenswrapper[4898]: I0313 14:48:03.514895 4898 generic.go:334] "Generic (PLEG): container finished" podID="2813f8b3-81f9-48a3-9a55-173dead5d7a7" containerID="916c6a5f721d9ac559010e8682bf9fc9eec602069822f7f9c7c610579bc40a49" exitCode=0 Mar 13 14:48:03 crc kubenswrapper[4898]: I0313 14:48:03.515193 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556888-gn68v" event={"ID":"2813f8b3-81f9-48a3-9a55-173dead5d7a7","Type":"ContainerDied","Data":"916c6a5f721d9ac559010e8682bf9fc9eec602069822f7f9c7c610579bc40a49"} Mar 13 14:48:03 crc kubenswrapper[4898]: I0313 14:48:03.739837 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:48:03 crc kubenswrapper[4898]: E0313 14:48:03.740339 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" 
Mar 13 14:48:05 crc kubenswrapper[4898]: I0313 14:48:05.004221 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556888-gn68v" Mar 13 14:48:05 crc kubenswrapper[4898]: I0313 14:48:05.087394 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dh2vk\" (UniqueName: \"kubernetes.io/projected/2813f8b3-81f9-48a3-9a55-173dead5d7a7-kube-api-access-dh2vk\") pod \"2813f8b3-81f9-48a3-9a55-173dead5d7a7\" (UID: \"2813f8b3-81f9-48a3-9a55-173dead5d7a7\") " Mar 13 14:48:05 crc kubenswrapper[4898]: I0313 14:48:05.094193 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2813f8b3-81f9-48a3-9a55-173dead5d7a7-kube-api-access-dh2vk" (OuterVolumeSpecName: "kube-api-access-dh2vk") pod "2813f8b3-81f9-48a3-9a55-173dead5d7a7" (UID: "2813f8b3-81f9-48a3-9a55-173dead5d7a7"). InnerVolumeSpecName "kube-api-access-dh2vk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:48:05 crc kubenswrapper[4898]: I0313 14:48:05.190780 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dh2vk\" (UniqueName: \"kubernetes.io/projected/2813f8b3-81f9-48a3-9a55-173dead5d7a7-kube-api-access-dh2vk\") on node \"crc\" DevicePath \"\"" Mar 13 14:48:05 crc kubenswrapper[4898]: I0313 14:48:05.541351 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556888-gn68v" event={"ID":"2813f8b3-81f9-48a3-9a55-173dead5d7a7","Type":"ContainerDied","Data":"6c71d25075e5d652a353986d1a9cd606119869eb24adf85caa486d10c30b2418"} Mar 13 14:48:05 crc kubenswrapper[4898]: I0313 14:48:05.541390 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c71d25075e5d652a353986d1a9cd606119869eb24adf85caa486d10c30b2418" Mar 13 14:48:05 crc kubenswrapper[4898]: I0313 14:48:05.541984 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556888-gn68v" Mar 13 14:48:06 crc kubenswrapper[4898]: I0313 14:48:06.486263 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556882-6vdgp"] Mar 13 14:48:06 crc kubenswrapper[4898]: I0313 14:48:06.501332 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556882-6vdgp"] Mar 13 14:48:07 crc kubenswrapper[4898]: I0313 14:48:07.757859 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46dff21f-c9aa-443a-b1c7-988721788744" path="/var/lib/kubelet/pods/46dff21f-c9aa-443a-b1c7-988721788744/volumes" Mar 13 14:48:16 crc kubenswrapper[4898]: I0313 14:48:16.739612 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:48:16 crc kubenswrapper[4898]: E0313 14:48:16.740431 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:48:27 crc kubenswrapper[4898]: I0313 14:48:27.740130 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:48:27 crc kubenswrapper[4898]: E0313 14:48:27.741418 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" 
podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:48:39 crc kubenswrapper[4898]: I0313 14:48:39.740364 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:48:39 crc kubenswrapper[4898]: E0313 14:48:39.741442 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:48:41 crc kubenswrapper[4898]: I0313 14:48:41.676969 4898 scope.go:117] "RemoveContainer" containerID="5dfdf7dc37e2c03d23dcf11c2bda6721f5e0189a55bee1c413509ed3a8808306" Mar 13 14:48:53 crc kubenswrapper[4898]: I0313 14:48:53.740857 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:48:53 crc kubenswrapper[4898]: E0313 14:48:53.742266 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:49:05 crc kubenswrapper[4898]: I0313 14:49:04.739785 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:49:05 crc kubenswrapper[4898]: E0313 14:49:04.741236 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:49:06 crc kubenswrapper[4898]: I0313 14:49:06.362307 4898 generic.go:334] "Generic (PLEG): container finished" podID="5139c85e-1d3d-4fe7-94aa-8efde03b43e0" containerID="fa89130040e3f48f6b09e015edf3ef67fd27d3f545d629365a77745abc0aef24" exitCode=0 Mar 13 14:49:06 crc kubenswrapper[4898]: I0313 14:49:06.362412 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" event={"ID":"5139c85e-1d3d-4fe7-94aa-8efde03b43e0","Type":"ContainerDied","Data":"fa89130040e3f48f6b09e015edf3ef67fd27d3f545d629365a77745abc0aef24"} Mar 13 14:49:07 crc kubenswrapper[4898]: I0313 14:49:07.951868 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.053415 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-telemetry-power-monitoring-combined-ca-bundle\") pod \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.053494 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ceilometer-ipmi-config-data-2\") pod \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.053669 4898 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ssh-key-openstack-edpm-ipam\") pod \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.053719 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ceilometer-ipmi-config-data-0\") pod \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.053819 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-inventory\") pod \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.053873 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lg5fp\" (UniqueName: \"kubernetes.io/projected/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-kube-api-access-lg5fp\") pod \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.053933 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ceilometer-ipmi-config-data-1\") pod \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\" (UID: \"5139c85e-1d3d-4fe7-94aa-8efde03b43e0\") " Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.059884 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-telemetry-power-monitoring-combined-ca-bundle" 
(OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "5139c85e-1d3d-4fe7-94aa-8efde03b43e0" (UID: "5139c85e-1d3d-4fe7-94aa-8efde03b43e0"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.059917 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-kube-api-access-lg5fp" (OuterVolumeSpecName: "kube-api-access-lg5fp") pod "5139c85e-1d3d-4fe7-94aa-8efde03b43e0" (UID: "5139c85e-1d3d-4fe7-94aa-8efde03b43e0"). InnerVolumeSpecName "kube-api-access-lg5fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.084076 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ceilometer-ipmi-config-data-0" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-0") pod "5139c85e-1d3d-4fe7-94aa-8efde03b43e0" (UID: "5139c85e-1d3d-4fe7-94aa-8efde03b43e0"). InnerVolumeSpecName "ceilometer-ipmi-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.094993 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ceilometer-ipmi-config-data-2" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-2") pod "5139c85e-1d3d-4fe7-94aa-8efde03b43e0" (UID: "5139c85e-1d3d-4fe7-94aa-8efde03b43e0"). InnerVolumeSpecName "ceilometer-ipmi-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.096346 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ceilometer-ipmi-config-data-1" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-1") pod "5139c85e-1d3d-4fe7-94aa-8efde03b43e0" (UID: "5139c85e-1d3d-4fe7-94aa-8efde03b43e0"). InnerVolumeSpecName "ceilometer-ipmi-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.099833 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-inventory" (OuterVolumeSpecName: "inventory") pod "5139c85e-1d3d-4fe7-94aa-8efde03b43e0" (UID: "5139c85e-1d3d-4fe7-94aa-8efde03b43e0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.105087 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5139c85e-1d3d-4fe7-94aa-8efde03b43e0" (UID: "5139c85e-1d3d-4fe7-94aa-8efde03b43e0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.157381 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.157409 4898 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ceilometer-ipmi-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.157421 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.157430 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lg5fp\" (UniqueName: \"kubernetes.io/projected/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-kube-api-access-lg5fp\") on node \"crc\" DevicePath \"\"" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.157439 4898 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ceilometer-ipmi-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.157448 4898 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.157461 4898 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: 
\"kubernetes.io/secret/5139c85e-1d3d-4fe7-94aa-8efde03b43e0-ceilometer-ipmi-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.394363 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" event={"ID":"5139c85e-1d3d-4fe7-94aa-8efde03b43e0","Type":"ContainerDied","Data":"3f68a521805ceaebc4381fbbd273e0827e9decfdbc53a23464118ffd0aba9594"} Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.394713 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f68a521805ceaebc4381fbbd273e0827e9decfdbc53a23464118ffd0aba9594" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.394586 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.530750 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885"] Mar 13 14:49:08 crc kubenswrapper[4898]: E0313 14:49:08.531315 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5139c85e-1d3d-4fe7-94aa-8efde03b43e0" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.531337 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="5139c85e-1d3d-4fe7-94aa-8efde03b43e0" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Mar 13 14:49:08 crc kubenswrapper[4898]: E0313 14:49:08.531366 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2813f8b3-81f9-48a3-9a55-173dead5d7a7" containerName="oc" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.531375 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="2813f8b3-81f9-48a3-9a55-173dead5d7a7" containerName="oc" Mar 13 14:49:08 crc 
kubenswrapper[4898]: I0313 14:49:08.531637 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="2813f8b3-81f9-48a3-9a55-173dead5d7a7" containerName="oc" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.531664 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="5139c85e-1d3d-4fe7-94aa-8efde03b43e0" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.535412 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.541162 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"logging-compute-config-data" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.541954 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.542576 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.545055 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zsddr" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.550273 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.555460 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885"] Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.568479 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-tw885\" (UID: \"efff948d-3073-4635-bc2c-2a8fc746c6b8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.568818 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-tw885\" (UID: \"efff948d-3073-4635-bc2c-2a8fc746c6b8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.568991 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-tw885\" (UID: \"efff948d-3073-4635-bc2c-2a8fc746c6b8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.569139 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-ssh-key-openstack-edpm-ipam\") pod \"logging-edpm-deployment-openstack-edpm-ipam-tw885\" (UID: \"efff948d-3073-4635-bc2c-2a8fc746c6b8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.569259 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww6vk\" (UniqueName: \"kubernetes.io/projected/efff948d-3073-4635-bc2c-2a8fc746c6b8-kube-api-access-ww6vk\") pod 
\"logging-edpm-deployment-openstack-edpm-ipam-tw885\" (UID: \"efff948d-3073-4635-bc2c-2a8fc746c6b8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.671182 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-tw885\" (UID: \"efff948d-3073-4635-bc2c-2a8fc746c6b8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.671270 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-tw885\" (UID: \"efff948d-3073-4635-bc2c-2a8fc746c6b8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.671368 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-ssh-key-openstack-edpm-ipam\") pod \"logging-edpm-deployment-openstack-edpm-ipam-tw885\" (UID: \"efff948d-3073-4635-bc2c-2a8fc746c6b8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.671458 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww6vk\" (UniqueName: \"kubernetes.io/projected/efff948d-3073-4635-bc2c-2a8fc746c6b8-kube-api-access-ww6vk\") pod \"logging-edpm-deployment-openstack-edpm-ipam-tw885\" (UID: \"efff948d-3073-4635-bc2c-2a8fc746c6b8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 
14:49:08.671643 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-tw885\" (UID: \"efff948d-3073-4635-bc2c-2a8fc746c6b8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.690987 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-tw885\" (UID: \"efff948d-3073-4635-bc2c-2a8fc746c6b8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.691001 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-tw885\" (UID: \"efff948d-3073-4635-bc2c-2a8fc746c6b8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.691194 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-tw885\" (UID: \"efff948d-3073-4635-bc2c-2a8fc746c6b8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.691204 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-ssh-key-openstack-edpm-ipam\") pod 
\"logging-edpm-deployment-openstack-edpm-ipam-tw885\" (UID: \"efff948d-3073-4635-bc2c-2a8fc746c6b8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.694274 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww6vk\" (UniqueName: \"kubernetes.io/projected/efff948d-3073-4635-bc2c-2a8fc746c6b8-kube-api-access-ww6vk\") pod \"logging-edpm-deployment-openstack-edpm-ipam-tw885\" (UID: \"efff948d-3073-4635-bc2c-2a8fc746c6b8\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" Mar 13 14:49:08 crc kubenswrapper[4898]: I0313 14:49:08.854875 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" Mar 13 14:49:09 crc kubenswrapper[4898]: I0313 14:49:09.443128 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885"] Mar 13 14:49:09 crc kubenswrapper[4898]: W0313 14:49:09.447302 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podefff948d_3073_4635_bc2c_2a8fc746c6b8.slice/crio-0e670a0eb159f93d979313e41f2d96d2d655eb9fa6cd260efb1423c9ebf3ac66 WatchSource:0}: Error finding container 0e670a0eb159f93d979313e41f2d96d2d655eb9fa6cd260efb1423c9ebf3ac66: Status 404 returned error can't find the container with id 0e670a0eb159f93d979313e41f2d96d2d655eb9fa6cd260efb1423c9ebf3ac66 Mar 13 14:49:09 crc kubenswrapper[4898]: I0313 14:49:09.452267 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 14:49:10 crc kubenswrapper[4898]: I0313 14:49:10.417553 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" 
event={"ID":"efff948d-3073-4635-bc2c-2a8fc746c6b8","Type":"ContainerStarted","Data":"56b8fdcde2f4108f5197df0d9ca2d30c280b616e640941f05782549d26cd35cd"} Mar 13 14:49:10 crc kubenswrapper[4898]: I0313 14:49:10.417992 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" event={"ID":"efff948d-3073-4635-bc2c-2a8fc746c6b8","Type":"ContainerStarted","Data":"0e670a0eb159f93d979313e41f2d96d2d655eb9fa6cd260efb1423c9ebf3ac66"} Mar 13 14:49:10 crc kubenswrapper[4898]: I0313 14:49:10.460662 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" podStartSLOduration=1.9194945159999999 podStartE2EDuration="2.460647655s" podCreationTimestamp="2026-03-13 14:49:08 +0000 UTC" firstStartedPulling="2026-03-13 14:49:09.452016912 +0000 UTC m=+3184.453605141" lastFinishedPulling="2026-03-13 14:49:09.993170001 +0000 UTC m=+3184.994758280" observedRunningTime="2026-03-13 14:49:10.45397294 +0000 UTC m=+3185.455561179" watchObservedRunningTime="2026-03-13 14:49:10.460647655 +0000 UTC m=+3185.462235894" Mar 13 14:49:16 crc kubenswrapper[4898]: I0313 14:49:16.739952 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:49:16 crc kubenswrapper[4898]: E0313 14:49:16.741240 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:49:25 crc kubenswrapper[4898]: I0313 14:49:25.660244 4898 generic.go:334] "Generic (PLEG): container finished" podID="efff948d-3073-4635-bc2c-2a8fc746c6b8" 
containerID="56b8fdcde2f4108f5197df0d9ca2d30c280b616e640941f05782549d26cd35cd" exitCode=0 Mar 13 14:49:25 crc kubenswrapper[4898]: I0313 14:49:25.660365 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" event={"ID":"efff948d-3073-4635-bc2c-2a8fc746c6b8","Type":"ContainerDied","Data":"56b8fdcde2f4108f5197df0d9ca2d30c280b616e640941f05782549d26cd35cd"} Mar 13 14:49:27 crc kubenswrapper[4898]: I0313 14:49:27.239567 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" Mar 13 14:49:27 crc kubenswrapper[4898]: I0313 14:49:27.412356 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-logging-compute-config-data-1\") pod \"efff948d-3073-4635-bc2c-2a8fc746c6b8\" (UID: \"efff948d-3073-4635-bc2c-2a8fc746c6b8\") " Mar 13 14:49:27 crc kubenswrapper[4898]: I0313 14:49:27.412455 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-logging-compute-config-data-0\") pod \"efff948d-3073-4635-bc2c-2a8fc746c6b8\" (UID: \"efff948d-3073-4635-bc2c-2a8fc746c6b8\") " Mar 13 14:49:27 crc kubenswrapper[4898]: I0313 14:49:27.412517 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-ssh-key-openstack-edpm-ipam\") pod \"efff948d-3073-4635-bc2c-2a8fc746c6b8\" (UID: \"efff948d-3073-4635-bc2c-2a8fc746c6b8\") " Mar 13 14:49:27 crc kubenswrapper[4898]: I0313 14:49:27.412641 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-inventory\") pod \"efff948d-3073-4635-bc2c-2a8fc746c6b8\" (UID: \"efff948d-3073-4635-bc2c-2a8fc746c6b8\") " Mar 13 14:49:27 crc kubenswrapper[4898]: I0313 14:49:27.412763 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ww6vk\" (UniqueName: \"kubernetes.io/projected/efff948d-3073-4635-bc2c-2a8fc746c6b8-kube-api-access-ww6vk\") pod \"efff948d-3073-4635-bc2c-2a8fc746c6b8\" (UID: \"efff948d-3073-4635-bc2c-2a8fc746c6b8\") " Mar 13 14:49:27 crc kubenswrapper[4898]: I0313 14:49:27.422180 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efff948d-3073-4635-bc2c-2a8fc746c6b8-kube-api-access-ww6vk" (OuterVolumeSpecName: "kube-api-access-ww6vk") pod "efff948d-3073-4635-bc2c-2a8fc746c6b8" (UID: "efff948d-3073-4635-bc2c-2a8fc746c6b8"). InnerVolumeSpecName "kube-api-access-ww6vk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:49:27 crc kubenswrapper[4898]: I0313 14:49:27.465136 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "efff948d-3073-4635-bc2c-2a8fc746c6b8" (UID: "efff948d-3073-4635-bc2c-2a8fc746c6b8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:49:27 crc kubenswrapper[4898]: I0313 14:49:27.467033 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-logging-compute-config-data-0" (OuterVolumeSpecName: "logging-compute-config-data-0") pod "efff948d-3073-4635-bc2c-2a8fc746c6b8" (UID: "efff948d-3073-4635-bc2c-2a8fc746c6b8"). InnerVolumeSpecName "logging-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:49:27 crc kubenswrapper[4898]: I0313 14:49:27.467915 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-logging-compute-config-data-1" (OuterVolumeSpecName: "logging-compute-config-data-1") pod "efff948d-3073-4635-bc2c-2a8fc746c6b8" (UID: "efff948d-3073-4635-bc2c-2a8fc746c6b8"). InnerVolumeSpecName "logging-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:49:27 crc kubenswrapper[4898]: I0313 14:49:27.475480 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-inventory" (OuterVolumeSpecName: "inventory") pod "efff948d-3073-4635-bc2c-2a8fc746c6b8" (UID: "efff948d-3073-4635-bc2c-2a8fc746c6b8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 14:49:27 crc kubenswrapper[4898]: I0313 14:49:27.516968 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ww6vk\" (UniqueName: \"kubernetes.io/projected/efff948d-3073-4635-bc2c-2a8fc746c6b8-kube-api-access-ww6vk\") on node \"crc\" DevicePath \"\"" Mar 13 14:49:27 crc kubenswrapper[4898]: I0313 14:49:27.517010 4898 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-logging-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 13 14:49:27 crc kubenswrapper[4898]: I0313 14:49:27.517024 4898 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-logging-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 13 14:49:27 crc kubenswrapper[4898]: I0313 14:49:27.517040 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 14:49:27 crc kubenswrapper[4898]: I0313 14:49:27.517052 4898 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/efff948d-3073-4635-bc2c-2a8fc746c6b8-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 14:49:27 crc kubenswrapper[4898]: I0313 14:49:27.690488 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" event={"ID":"efff948d-3073-4635-bc2c-2a8fc746c6b8","Type":"ContainerDied","Data":"0e670a0eb159f93d979313e41f2d96d2d655eb9fa6cd260efb1423c9ebf3ac66"} Mar 13 14:49:27 crc kubenswrapper[4898]: I0313 14:49:27.690529 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e670a0eb159f93d979313e41f2d96d2d655eb9fa6cd260efb1423c9ebf3ac66" Mar 13 14:49:27 crc kubenswrapper[4898]: I0313 14:49:27.690557 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-tw885" Mar 13 14:49:29 crc kubenswrapper[4898]: I0313 14:49:29.740503 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:49:29 crc kubenswrapper[4898]: E0313 14:49:29.741615 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:49:42 crc kubenswrapper[4898]: I0313 14:49:42.739985 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:49:42 crc kubenswrapper[4898]: E0313 14:49:42.742769 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:49:56 crc kubenswrapper[4898]: I0313 14:49:56.741528 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:49:56 crc kubenswrapper[4898]: E0313 14:49:56.742775 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:50:00 crc kubenswrapper[4898]: I0313 14:50:00.172670 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556890-vl72z"] Mar 13 14:50:00 crc kubenswrapper[4898]: E0313 14:50:00.174311 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efff948d-3073-4635-bc2c-2a8fc746c6b8" containerName="logging-edpm-deployment-openstack-edpm-ipam" Mar 13 14:50:00 crc kubenswrapper[4898]: I0313 14:50:00.174350 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="efff948d-3073-4635-bc2c-2a8fc746c6b8" containerName="logging-edpm-deployment-openstack-edpm-ipam" Mar 13 14:50:00 crc kubenswrapper[4898]: I0313 14:50:00.175169 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="efff948d-3073-4635-bc2c-2a8fc746c6b8" containerName="logging-edpm-deployment-openstack-edpm-ipam" Mar 13 14:50:00 crc kubenswrapper[4898]: I0313 14:50:00.177310 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556890-vl72z" Mar 13 14:50:00 crc kubenswrapper[4898]: I0313 14:50:00.184706 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:50:00 crc kubenswrapper[4898]: I0313 14:50:00.184973 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:50:00 crc kubenswrapper[4898]: I0313 14:50:00.185368 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:50:00 crc kubenswrapper[4898]: I0313 14:50:00.190317 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556890-vl72z"] Mar 13 14:50:00 crc kubenswrapper[4898]: I0313 14:50:00.341478 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pgxx\" (UniqueName: \"kubernetes.io/projected/5795e677-fa9a-4235-9a30-a040ac18eebd-kube-api-access-9pgxx\") pod \"auto-csr-approver-29556890-vl72z\" (UID: \"5795e677-fa9a-4235-9a30-a040ac18eebd\") " pod="openshift-infra/auto-csr-approver-29556890-vl72z" Mar 13 14:50:00 crc kubenswrapper[4898]: I0313 14:50:00.444954 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pgxx\" (UniqueName: \"kubernetes.io/projected/5795e677-fa9a-4235-9a30-a040ac18eebd-kube-api-access-9pgxx\") pod \"auto-csr-approver-29556890-vl72z\" (UID: \"5795e677-fa9a-4235-9a30-a040ac18eebd\") " pod="openshift-infra/auto-csr-approver-29556890-vl72z" Mar 13 14:50:00 crc kubenswrapper[4898]: I0313 14:50:00.487913 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pgxx\" (UniqueName: \"kubernetes.io/projected/5795e677-fa9a-4235-9a30-a040ac18eebd-kube-api-access-9pgxx\") pod \"auto-csr-approver-29556890-vl72z\" (UID: \"5795e677-fa9a-4235-9a30-a040ac18eebd\") " 
pod="openshift-infra/auto-csr-approver-29556890-vl72z" Mar 13 14:50:00 crc kubenswrapper[4898]: I0313 14:50:00.522172 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556890-vl72z" Mar 13 14:50:01 crc kubenswrapper[4898]: I0313 14:50:01.032040 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556890-vl72z"] Mar 13 14:50:01 crc kubenswrapper[4898]: W0313 14:50:01.040037 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5795e677_fa9a_4235_9a30_a040ac18eebd.slice/crio-0940c7aea401f92f6b87b35355fc8bb2b356dd83a404368f92cf4cfa1c5617b9 WatchSource:0}: Error finding container 0940c7aea401f92f6b87b35355fc8bb2b356dd83a404368f92cf4cfa1c5617b9: Status 404 returned error can't find the container with id 0940c7aea401f92f6b87b35355fc8bb2b356dd83a404368f92cf4cfa1c5617b9 Mar 13 14:50:01 crc kubenswrapper[4898]: I0313 14:50:01.141948 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556890-vl72z" event={"ID":"5795e677-fa9a-4235-9a30-a040ac18eebd","Type":"ContainerStarted","Data":"0940c7aea401f92f6b87b35355fc8bb2b356dd83a404368f92cf4cfa1c5617b9"} Mar 13 14:50:03 crc kubenswrapper[4898]: I0313 14:50:03.186503 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556890-vl72z" event={"ID":"5795e677-fa9a-4235-9a30-a040ac18eebd","Type":"ContainerStarted","Data":"5a392b03ee567f4c66c2416d5b9e2dae2dde45c43076100867e0284651bd3e3b"} Mar 13 14:50:03 crc kubenswrapper[4898]: I0313 14:50:03.204569 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556890-vl72z" podStartSLOduration=1.6491670360000001 podStartE2EDuration="3.204546254s" podCreationTimestamp="2026-03-13 14:50:00 +0000 UTC" firstStartedPulling="2026-03-13 14:50:01.043833487 +0000 UTC 
m=+3236.045421726" lastFinishedPulling="2026-03-13 14:50:02.599212665 +0000 UTC m=+3237.600800944" observedRunningTime="2026-03-13 14:50:03.200631268 +0000 UTC m=+3238.202219517" watchObservedRunningTime="2026-03-13 14:50:03.204546254 +0000 UTC m=+3238.206134513" Mar 13 14:50:04 crc kubenswrapper[4898]: I0313 14:50:04.202014 4898 generic.go:334] "Generic (PLEG): container finished" podID="5795e677-fa9a-4235-9a30-a040ac18eebd" containerID="5a392b03ee567f4c66c2416d5b9e2dae2dde45c43076100867e0284651bd3e3b" exitCode=0 Mar 13 14:50:04 crc kubenswrapper[4898]: I0313 14:50:04.202150 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556890-vl72z" event={"ID":"5795e677-fa9a-4235-9a30-a040ac18eebd","Type":"ContainerDied","Data":"5a392b03ee567f4c66c2416d5b9e2dae2dde45c43076100867e0284651bd3e3b"} Mar 13 14:50:05 crc kubenswrapper[4898]: I0313 14:50:05.991261 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556890-vl72z" Mar 13 14:50:06 crc kubenswrapper[4898]: I0313 14:50:06.153634 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pgxx\" (UniqueName: \"kubernetes.io/projected/5795e677-fa9a-4235-9a30-a040ac18eebd-kube-api-access-9pgxx\") pod \"5795e677-fa9a-4235-9a30-a040ac18eebd\" (UID: \"5795e677-fa9a-4235-9a30-a040ac18eebd\") " Mar 13 14:50:06 crc kubenswrapper[4898]: I0313 14:50:06.161213 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5795e677-fa9a-4235-9a30-a040ac18eebd-kube-api-access-9pgxx" (OuterVolumeSpecName: "kube-api-access-9pgxx") pod "5795e677-fa9a-4235-9a30-a040ac18eebd" (UID: "5795e677-fa9a-4235-9a30-a040ac18eebd"). InnerVolumeSpecName "kube-api-access-9pgxx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:50:06 crc kubenswrapper[4898]: I0313 14:50:06.237191 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556890-vl72z" event={"ID":"5795e677-fa9a-4235-9a30-a040ac18eebd","Type":"ContainerDied","Data":"0940c7aea401f92f6b87b35355fc8bb2b356dd83a404368f92cf4cfa1c5617b9"} Mar 13 14:50:06 crc kubenswrapper[4898]: I0313 14:50:06.237247 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0940c7aea401f92f6b87b35355fc8bb2b356dd83a404368f92cf4cfa1c5617b9" Mar 13 14:50:06 crc kubenswrapper[4898]: I0313 14:50:06.237339 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556890-vl72z" Mar 13 14:50:06 crc kubenswrapper[4898]: I0313 14:50:06.261483 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pgxx\" (UniqueName: \"kubernetes.io/projected/5795e677-fa9a-4235-9a30-a040ac18eebd-kube-api-access-9pgxx\") on node \"crc\" DevicePath \"\"" Mar 13 14:50:06 crc kubenswrapper[4898]: I0313 14:50:06.604287 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556884-wvc75"] Mar 13 14:50:06 crc kubenswrapper[4898]: I0313 14:50:06.624441 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556884-wvc75"] Mar 13 14:50:07 crc kubenswrapper[4898]: I0313 14:50:07.763568 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22d70d9e-a058-43a7-b692-19cd302d65ca" path="/var/lib/kubelet/pods/22d70d9e-a058-43a7-b692-19cd302d65ca/volumes" Mar 13 14:50:09 crc kubenswrapper[4898]: I0313 14:50:09.740588 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:50:09 crc kubenswrapper[4898]: E0313 14:50:09.742130 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:50:24 crc kubenswrapper[4898]: I0313 14:50:24.739980 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:50:25 crc kubenswrapper[4898]: I0313 14:50:25.591216 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"14bad097122012834e04d7733e2c56b7cb22ecdcbbe4dbb2cc5f1822098d6e46"} Mar 13 14:50:41 crc kubenswrapper[4898]: I0313 14:50:41.828189 4898 scope.go:117] "RemoveContainer" containerID="668ec92e00de6e908be7a4b238e021ba041b8ee50f571d510eea90e125398f41" Mar 13 14:52:00 crc kubenswrapper[4898]: I0313 14:52:00.203255 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556892-gg8mm"] Mar 13 14:52:00 crc kubenswrapper[4898]: E0313 14:52:00.204385 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5795e677-fa9a-4235-9a30-a040ac18eebd" containerName="oc" Mar 13 14:52:00 crc kubenswrapper[4898]: I0313 14:52:00.204401 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="5795e677-fa9a-4235-9a30-a040ac18eebd" containerName="oc" Mar 13 14:52:00 crc kubenswrapper[4898]: I0313 14:52:00.204718 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="5795e677-fa9a-4235-9a30-a040ac18eebd" containerName="oc" Mar 13 14:52:00 crc kubenswrapper[4898]: I0313 14:52:00.205772 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556892-gg8mm" Mar 13 14:52:00 crc kubenswrapper[4898]: I0313 14:52:00.211882 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:52:00 crc kubenswrapper[4898]: I0313 14:52:00.215259 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:52:00 crc kubenswrapper[4898]: I0313 14:52:00.215492 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:52:00 crc kubenswrapper[4898]: I0313 14:52:00.217759 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556892-gg8mm"] Mar 13 14:52:00 crc kubenswrapper[4898]: I0313 14:52:00.316001 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt7z2\" (UniqueName: \"kubernetes.io/projected/bba89630-e09c-4d6d-b7c3-89aecad3889f-kube-api-access-wt7z2\") pod \"auto-csr-approver-29556892-gg8mm\" (UID: \"bba89630-e09c-4d6d-b7c3-89aecad3889f\") " pod="openshift-infra/auto-csr-approver-29556892-gg8mm" Mar 13 14:52:00 crc kubenswrapper[4898]: I0313 14:52:00.419137 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt7z2\" (UniqueName: \"kubernetes.io/projected/bba89630-e09c-4d6d-b7c3-89aecad3889f-kube-api-access-wt7z2\") pod \"auto-csr-approver-29556892-gg8mm\" (UID: \"bba89630-e09c-4d6d-b7c3-89aecad3889f\") " pod="openshift-infra/auto-csr-approver-29556892-gg8mm" Mar 13 14:52:00 crc kubenswrapper[4898]: I0313 14:52:00.446738 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt7z2\" (UniqueName: \"kubernetes.io/projected/bba89630-e09c-4d6d-b7c3-89aecad3889f-kube-api-access-wt7z2\") pod \"auto-csr-approver-29556892-gg8mm\" (UID: \"bba89630-e09c-4d6d-b7c3-89aecad3889f\") " 
pod="openshift-infra/auto-csr-approver-29556892-gg8mm" Mar 13 14:52:00 crc kubenswrapper[4898]: I0313 14:52:00.560965 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556892-gg8mm" Mar 13 14:52:01 crc kubenswrapper[4898]: I0313 14:52:01.020398 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556892-gg8mm"] Mar 13 14:52:01 crc kubenswrapper[4898]: I0313 14:52:01.336700 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556892-gg8mm" event={"ID":"bba89630-e09c-4d6d-b7c3-89aecad3889f","Type":"ContainerStarted","Data":"7d7b019ddbf4d1930bed24662580f143f3fdb04fdd12047fb71e8598751eaea3"} Mar 13 14:52:03 crc kubenswrapper[4898]: I0313 14:52:03.364260 4898 generic.go:334] "Generic (PLEG): container finished" podID="bba89630-e09c-4d6d-b7c3-89aecad3889f" containerID="168730dbf03823de4c3d331997904d8f68ac2eebf4c5ae0fee8108df4bd5aa88" exitCode=0 Mar 13 14:52:03 crc kubenswrapper[4898]: I0313 14:52:03.364488 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556892-gg8mm" event={"ID":"bba89630-e09c-4d6d-b7c3-89aecad3889f","Type":"ContainerDied","Data":"168730dbf03823de4c3d331997904d8f68ac2eebf4c5ae0fee8108df4bd5aa88"} Mar 13 14:52:04 crc kubenswrapper[4898]: I0313 14:52:04.922683 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556892-gg8mm" Mar 13 14:52:05 crc kubenswrapper[4898]: I0313 14:52:05.105200 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wt7z2\" (UniqueName: \"kubernetes.io/projected/bba89630-e09c-4d6d-b7c3-89aecad3889f-kube-api-access-wt7z2\") pod \"bba89630-e09c-4d6d-b7c3-89aecad3889f\" (UID: \"bba89630-e09c-4d6d-b7c3-89aecad3889f\") " Mar 13 14:52:05 crc kubenswrapper[4898]: I0313 14:52:05.112385 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bba89630-e09c-4d6d-b7c3-89aecad3889f-kube-api-access-wt7z2" (OuterVolumeSpecName: "kube-api-access-wt7z2") pod "bba89630-e09c-4d6d-b7c3-89aecad3889f" (UID: "bba89630-e09c-4d6d-b7c3-89aecad3889f"). InnerVolumeSpecName "kube-api-access-wt7z2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:52:05 crc kubenswrapper[4898]: I0313 14:52:05.210228 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wt7z2\" (UniqueName: \"kubernetes.io/projected/bba89630-e09c-4d6d-b7c3-89aecad3889f-kube-api-access-wt7z2\") on node \"crc\" DevicePath \"\"" Mar 13 14:52:05 crc kubenswrapper[4898]: I0313 14:52:05.389527 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556892-gg8mm" event={"ID":"bba89630-e09c-4d6d-b7c3-89aecad3889f","Type":"ContainerDied","Data":"7d7b019ddbf4d1930bed24662580f143f3fdb04fdd12047fb71e8598751eaea3"} Mar 13 14:52:05 crc kubenswrapper[4898]: I0313 14:52:05.389584 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d7b019ddbf4d1930bed24662580f143f3fdb04fdd12047fb71e8598751eaea3" Mar 13 14:52:05 crc kubenswrapper[4898]: I0313 14:52:05.389632 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556892-gg8mm" Mar 13 14:52:06 crc kubenswrapper[4898]: I0313 14:52:06.006859 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556886-vxgm7"] Mar 13 14:52:06 crc kubenswrapper[4898]: I0313 14:52:06.020492 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556886-vxgm7"] Mar 13 14:52:07 crc kubenswrapper[4898]: I0313 14:52:07.763442 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e6f3996-1b26-4a53-8c2d-f74aa89ef944" path="/var/lib/kubelet/pods/7e6f3996-1b26-4a53-8c2d-f74aa89ef944/volumes" Mar 13 14:52:41 crc kubenswrapper[4898]: I0313 14:52:41.941354 4898 scope.go:117] "RemoveContainer" containerID="c50d215e83c79e2c4ba98ed3d21204bd33821f4c3e54e5173055aa94bae3e42e" Mar 13 14:52:49 crc kubenswrapper[4898]: I0313 14:52:49.134413 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:52:49 crc kubenswrapper[4898]: I0313 14:52:49.135052 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:53:19 crc kubenswrapper[4898]: I0313 14:53:19.134181 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:53:19 crc kubenswrapper[4898]: 
I0313 14:53:19.134622 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:53:49 crc kubenswrapper[4898]: I0313 14:53:49.134676 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:53:49 crc kubenswrapper[4898]: I0313 14:53:49.135294 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:53:49 crc kubenswrapper[4898]: I0313 14:53:49.135351 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 14:53:49 crc kubenswrapper[4898]: I0313 14:53:49.136133 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"14bad097122012834e04d7733e2c56b7cb22ecdcbbe4dbb2cc5f1822098d6e46"} pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 14:53:49 crc kubenswrapper[4898]: I0313 14:53:49.136204 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" 
containerName="machine-config-daemon" containerID="cri-o://14bad097122012834e04d7733e2c56b7cb22ecdcbbe4dbb2cc5f1822098d6e46" gracePeriod=600 Mar 13 14:53:49 crc kubenswrapper[4898]: I0313 14:53:49.698513 4898 generic.go:334] "Generic (PLEG): container finished" podID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerID="14bad097122012834e04d7733e2c56b7cb22ecdcbbe4dbb2cc5f1822098d6e46" exitCode=0 Mar 13 14:53:49 crc kubenswrapper[4898]: I0313 14:53:49.698586 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerDied","Data":"14bad097122012834e04d7733e2c56b7cb22ecdcbbe4dbb2cc5f1822098d6e46"} Mar 13 14:53:49 crc kubenswrapper[4898]: I0313 14:53:49.699704 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da"} Mar 13 14:53:49 crc kubenswrapper[4898]: I0313 14:53:49.699752 4898 scope.go:117] "RemoveContainer" containerID="3380a5581d362c5b3dc0ed7955d410611389b7b46492a3c8939f1604d630d7f5" Mar 13 14:54:00 crc kubenswrapper[4898]: I0313 14:54:00.163989 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556894-4f5n2"] Mar 13 14:54:00 crc kubenswrapper[4898]: E0313 14:54:00.165582 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bba89630-e09c-4d6d-b7c3-89aecad3889f" containerName="oc" Mar 13 14:54:00 crc kubenswrapper[4898]: I0313 14:54:00.165779 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="bba89630-e09c-4d6d-b7c3-89aecad3889f" containerName="oc" Mar 13 14:54:00 crc kubenswrapper[4898]: I0313 14:54:00.166408 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="bba89630-e09c-4d6d-b7c3-89aecad3889f" containerName="oc" Mar 13 14:54:00 
crc kubenswrapper[4898]: I0313 14:54:00.167883 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556894-4f5n2" Mar 13 14:54:00 crc kubenswrapper[4898]: I0313 14:54:00.170443 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:54:00 crc kubenswrapper[4898]: I0313 14:54:00.170613 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:54:00 crc kubenswrapper[4898]: I0313 14:54:00.173243 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:54:00 crc kubenswrapper[4898]: I0313 14:54:00.183572 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556894-4f5n2"] Mar 13 14:54:00 crc kubenswrapper[4898]: I0313 14:54:00.243706 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jx9w\" (UniqueName: \"kubernetes.io/projected/d9bfc1e4-be1f-4495-a7de-2b4f94e901d8-kube-api-access-2jx9w\") pod \"auto-csr-approver-29556894-4f5n2\" (UID: \"d9bfc1e4-be1f-4495-a7de-2b4f94e901d8\") " pod="openshift-infra/auto-csr-approver-29556894-4f5n2" Mar 13 14:54:00 crc kubenswrapper[4898]: I0313 14:54:00.346091 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jx9w\" (UniqueName: \"kubernetes.io/projected/d9bfc1e4-be1f-4495-a7de-2b4f94e901d8-kube-api-access-2jx9w\") pod \"auto-csr-approver-29556894-4f5n2\" (UID: \"d9bfc1e4-be1f-4495-a7de-2b4f94e901d8\") " pod="openshift-infra/auto-csr-approver-29556894-4f5n2" Mar 13 14:54:00 crc kubenswrapper[4898]: I0313 14:54:00.370800 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jx9w\" (UniqueName: \"kubernetes.io/projected/d9bfc1e4-be1f-4495-a7de-2b4f94e901d8-kube-api-access-2jx9w\") 
pod \"auto-csr-approver-29556894-4f5n2\" (UID: \"d9bfc1e4-be1f-4495-a7de-2b4f94e901d8\") " pod="openshift-infra/auto-csr-approver-29556894-4f5n2" Mar 13 14:54:00 crc kubenswrapper[4898]: I0313 14:54:00.498309 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556894-4f5n2" Mar 13 14:54:01 crc kubenswrapper[4898]: I0313 14:54:01.038421 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556894-4f5n2"] Mar 13 14:54:01 crc kubenswrapper[4898]: I0313 14:54:01.897677 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556894-4f5n2" event={"ID":"d9bfc1e4-be1f-4495-a7de-2b4f94e901d8","Type":"ContainerStarted","Data":"9b2e72bacd3af95a2707b1008042c42f07b142a4c03b6c85b2fb65b533aed3e6"} Mar 13 14:54:02 crc kubenswrapper[4898]: I0313 14:54:02.907808 4898 generic.go:334] "Generic (PLEG): container finished" podID="d9bfc1e4-be1f-4495-a7de-2b4f94e901d8" containerID="508333f96d89e6fb34c9fb0fe392b0bcdb91535cabc45655a886e6d88f90fef5" exitCode=0 Mar 13 14:54:02 crc kubenswrapper[4898]: I0313 14:54:02.907960 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556894-4f5n2" event={"ID":"d9bfc1e4-be1f-4495-a7de-2b4f94e901d8","Type":"ContainerDied","Data":"508333f96d89e6fb34c9fb0fe392b0bcdb91535cabc45655a886e6d88f90fef5"} Mar 13 14:54:04 crc kubenswrapper[4898]: I0313 14:54:04.395868 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556894-4f5n2" Mar 13 14:54:04 crc kubenswrapper[4898]: I0313 14:54:04.448206 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jx9w\" (UniqueName: \"kubernetes.io/projected/d9bfc1e4-be1f-4495-a7de-2b4f94e901d8-kube-api-access-2jx9w\") pod \"d9bfc1e4-be1f-4495-a7de-2b4f94e901d8\" (UID: \"d9bfc1e4-be1f-4495-a7de-2b4f94e901d8\") " Mar 13 14:54:04 crc kubenswrapper[4898]: I0313 14:54:04.467322 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9bfc1e4-be1f-4495-a7de-2b4f94e901d8-kube-api-access-2jx9w" (OuterVolumeSpecName: "kube-api-access-2jx9w") pod "d9bfc1e4-be1f-4495-a7de-2b4f94e901d8" (UID: "d9bfc1e4-be1f-4495-a7de-2b4f94e901d8"). InnerVolumeSpecName "kube-api-access-2jx9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:54:04 crc kubenswrapper[4898]: I0313 14:54:04.556178 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jx9w\" (UniqueName: \"kubernetes.io/projected/d9bfc1e4-be1f-4495-a7de-2b4f94e901d8-kube-api-access-2jx9w\") on node \"crc\" DevicePath \"\"" Mar 13 14:54:04 crc kubenswrapper[4898]: I0313 14:54:04.935885 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556894-4f5n2" event={"ID":"d9bfc1e4-be1f-4495-a7de-2b4f94e901d8","Type":"ContainerDied","Data":"9b2e72bacd3af95a2707b1008042c42f07b142a4c03b6c85b2fb65b533aed3e6"} Mar 13 14:54:04 crc kubenswrapper[4898]: I0313 14:54:04.935981 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b2e72bacd3af95a2707b1008042c42f07b142a4c03b6c85b2fb65b533aed3e6" Mar 13 14:54:04 crc kubenswrapper[4898]: I0313 14:54:04.935999 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556894-4f5n2" Mar 13 14:54:05 crc kubenswrapper[4898]: I0313 14:54:05.467060 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556888-gn68v"] Mar 13 14:54:05 crc kubenswrapper[4898]: I0313 14:54:05.475326 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556888-gn68v"] Mar 13 14:54:05 crc kubenswrapper[4898]: I0313 14:54:05.765726 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2813f8b3-81f9-48a3-9a55-173dead5d7a7" path="/var/lib/kubelet/pods/2813f8b3-81f9-48a3-9a55-173dead5d7a7/volumes" Mar 13 14:54:32 crc kubenswrapper[4898]: I0313 14:54:32.803714 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kg5bc"] Mar 13 14:54:32 crc kubenswrapper[4898]: E0313 14:54:32.805012 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9bfc1e4-be1f-4495-a7de-2b4f94e901d8" containerName="oc" Mar 13 14:54:32 crc kubenswrapper[4898]: I0313 14:54:32.805033 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9bfc1e4-be1f-4495-a7de-2b4f94e901d8" containerName="oc" Mar 13 14:54:32 crc kubenswrapper[4898]: I0313 14:54:32.805319 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9bfc1e4-be1f-4495-a7de-2b4f94e901d8" containerName="oc" Mar 13 14:54:32 crc kubenswrapper[4898]: I0313 14:54:32.807473 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kg5bc" Mar 13 14:54:32 crc kubenswrapper[4898]: I0313 14:54:32.816167 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kg5bc"] Mar 13 14:54:32 crc kubenswrapper[4898]: I0313 14:54:32.991282 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2447a834-934b-4e95-a373-2f98aa976716-utilities\") pod \"redhat-operators-kg5bc\" (UID: \"2447a834-934b-4e95-a373-2f98aa976716\") " pod="openshift-marketplace/redhat-operators-kg5bc" Mar 13 14:54:32 crc kubenswrapper[4898]: I0313 14:54:32.992065 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2447a834-934b-4e95-a373-2f98aa976716-catalog-content\") pod \"redhat-operators-kg5bc\" (UID: \"2447a834-934b-4e95-a373-2f98aa976716\") " pod="openshift-marketplace/redhat-operators-kg5bc" Mar 13 14:54:32 crc kubenswrapper[4898]: I0313 14:54:32.992212 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfgvn\" (UniqueName: \"kubernetes.io/projected/2447a834-934b-4e95-a373-2f98aa976716-kube-api-access-dfgvn\") pod \"redhat-operators-kg5bc\" (UID: \"2447a834-934b-4e95-a373-2f98aa976716\") " pod="openshift-marketplace/redhat-operators-kg5bc" Mar 13 14:54:33 crc kubenswrapper[4898]: I0313 14:54:33.094325 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2447a834-934b-4e95-a373-2f98aa976716-catalog-content\") pod \"redhat-operators-kg5bc\" (UID: \"2447a834-934b-4e95-a373-2f98aa976716\") " pod="openshift-marketplace/redhat-operators-kg5bc" Mar 13 14:54:33 crc kubenswrapper[4898]: I0313 14:54:33.094711 4898 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-dfgvn\" (UniqueName: \"kubernetes.io/projected/2447a834-934b-4e95-a373-2f98aa976716-kube-api-access-dfgvn\") pod \"redhat-operators-kg5bc\" (UID: \"2447a834-934b-4e95-a373-2f98aa976716\") " pod="openshift-marketplace/redhat-operators-kg5bc" Mar 13 14:54:33 crc kubenswrapper[4898]: I0313 14:54:33.094822 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2447a834-934b-4e95-a373-2f98aa976716-utilities\") pod \"redhat-operators-kg5bc\" (UID: \"2447a834-934b-4e95-a373-2f98aa976716\") " pod="openshift-marketplace/redhat-operators-kg5bc" Mar 13 14:54:33 crc kubenswrapper[4898]: I0313 14:54:33.094980 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2447a834-934b-4e95-a373-2f98aa976716-catalog-content\") pod \"redhat-operators-kg5bc\" (UID: \"2447a834-934b-4e95-a373-2f98aa976716\") " pod="openshift-marketplace/redhat-operators-kg5bc" Mar 13 14:54:33 crc kubenswrapper[4898]: I0313 14:54:33.095484 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2447a834-934b-4e95-a373-2f98aa976716-utilities\") pod \"redhat-operators-kg5bc\" (UID: \"2447a834-934b-4e95-a373-2f98aa976716\") " pod="openshift-marketplace/redhat-operators-kg5bc" Mar 13 14:54:33 crc kubenswrapper[4898]: I0313 14:54:33.116833 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfgvn\" (UniqueName: \"kubernetes.io/projected/2447a834-934b-4e95-a373-2f98aa976716-kube-api-access-dfgvn\") pod \"redhat-operators-kg5bc\" (UID: \"2447a834-934b-4e95-a373-2f98aa976716\") " pod="openshift-marketplace/redhat-operators-kg5bc" Mar 13 14:54:33 crc kubenswrapper[4898]: I0313 14:54:33.153858 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kg5bc" Mar 13 14:54:33 crc kubenswrapper[4898]: I0313 14:54:33.663307 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kg5bc"] Mar 13 14:54:34 crc kubenswrapper[4898]: I0313 14:54:34.326834 4898 generic.go:334] "Generic (PLEG): container finished" podID="2447a834-934b-4e95-a373-2f98aa976716" containerID="211c6e55c9c729300f3eeeaf49cc57df8d6100a6437b61cd51516a756fb14922" exitCode=0 Mar 13 14:54:34 crc kubenswrapper[4898]: I0313 14:54:34.326933 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kg5bc" event={"ID":"2447a834-934b-4e95-a373-2f98aa976716","Type":"ContainerDied","Data":"211c6e55c9c729300f3eeeaf49cc57df8d6100a6437b61cd51516a756fb14922"} Mar 13 14:54:34 crc kubenswrapper[4898]: I0313 14:54:34.327113 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kg5bc" event={"ID":"2447a834-934b-4e95-a373-2f98aa976716","Type":"ContainerStarted","Data":"477d5672b30264827c27894fe05766aedc024459c1dfcbc7f3e7ee7ffe289e7d"} Mar 13 14:54:34 crc kubenswrapper[4898]: I0313 14:54:34.329140 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 14:54:35 crc kubenswrapper[4898]: I0313 14:54:35.340531 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kg5bc" event={"ID":"2447a834-934b-4e95-a373-2f98aa976716","Type":"ContainerStarted","Data":"f6b89b05518578a7184c50d07ceb43b7188c31361928503d2b5c6491b1d4f558"} Mar 13 14:54:39 crc kubenswrapper[4898]: I0313 14:54:39.392330 4898 generic.go:334] "Generic (PLEG): container finished" podID="2447a834-934b-4e95-a373-2f98aa976716" containerID="f6b89b05518578a7184c50d07ceb43b7188c31361928503d2b5c6491b1d4f558" exitCode=0 Mar 13 14:54:39 crc kubenswrapper[4898]: I0313 14:54:39.392715 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-kg5bc" event={"ID":"2447a834-934b-4e95-a373-2f98aa976716","Type":"ContainerDied","Data":"f6b89b05518578a7184c50d07ceb43b7188c31361928503d2b5c6491b1d4f558"} Mar 13 14:54:40 crc kubenswrapper[4898]: I0313 14:54:40.410391 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kg5bc" event={"ID":"2447a834-934b-4e95-a373-2f98aa976716","Type":"ContainerStarted","Data":"89cb8536dc0d3dd98ed9fc9c57ad307ad0709d2f015c3a7da74c48cf64ad5a80"} Mar 13 14:54:40 crc kubenswrapper[4898]: I0313 14:54:40.444735 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kg5bc" podStartSLOduration=2.946857204 podStartE2EDuration="8.444713976s" podCreationTimestamp="2026-03-13 14:54:32 +0000 UTC" firstStartedPulling="2026-03-13 14:54:34.328856108 +0000 UTC m=+3509.330444347" lastFinishedPulling="2026-03-13 14:54:39.82671287 +0000 UTC m=+3514.828301119" observedRunningTime="2026-03-13 14:54:40.430205297 +0000 UTC m=+3515.431793546" watchObservedRunningTime="2026-03-13 14:54:40.444713976 +0000 UTC m=+3515.446302225" Mar 13 14:54:42 crc kubenswrapper[4898]: I0313 14:54:42.107395 4898 scope.go:117] "RemoveContainer" containerID="916c6a5f721d9ac559010e8682bf9fc9eec602069822f7f9c7c610579bc40a49" Mar 13 14:54:42 crc kubenswrapper[4898]: I0313 14:54:42.977125 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4b5mh"] Mar 13 14:54:42 crc kubenswrapper[4898]: I0313 14:54:42.980550 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4b5mh" Mar 13 14:54:42 crc kubenswrapper[4898]: I0313 14:54:42.989503 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4b5mh"] Mar 13 14:54:43 crc kubenswrapper[4898]: I0313 14:54:43.154755 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kg5bc" Mar 13 14:54:43 crc kubenswrapper[4898]: I0313 14:54:43.154813 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kg5bc" Mar 13 14:54:43 crc kubenswrapper[4898]: I0313 14:54:43.167284 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcsnj\" (UniqueName: \"kubernetes.io/projected/cd506070-c0f3-404f-9d20-fe9dd29cb86d-kube-api-access-wcsnj\") pod \"community-operators-4b5mh\" (UID: \"cd506070-c0f3-404f-9d20-fe9dd29cb86d\") " pod="openshift-marketplace/community-operators-4b5mh" Mar 13 14:54:43 crc kubenswrapper[4898]: I0313 14:54:43.167359 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd506070-c0f3-404f-9d20-fe9dd29cb86d-catalog-content\") pod \"community-operators-4b5mh\" (UID: \"cd506070-c0f3-404f-9d20-fe9dd29cb86d\") " pod="openshift-marketplace/community-operators-4b5mh" Mar 13 14:54:43 crc kubenswrapper[4898]: I0313 14:54:43.167400 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd506070-c0f3-404f-9d20-fe9dd29cb86d-utilities\") pod \"community-operators-4b5mh\" (UID: \"cd506070-c0f3-404f-9d20-fe9dd29cb86d\") " pod="openshift-marketplace/community-operators-4b5mh" Mar 13 14:54:43 crc kubenswrapper[4898]: I0313 14:54:43.269647 4898 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd506070-c0f3-404f-9d20-fe9dd29cb86d-catalog-content\") pod \"community-operators-4b5mh\" (UID: \"cd506070-c0f3-404f-9d20-fe9dd29cb86d\") " pod="openshift-marketplace/community-operators-4b5mh" Mar 13 14:54:43 crc kubenswrapper[4898]: I0313 14:54:43.270070 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd506070-c0f3-404f-9d20-fe9dd29cb86d-catalog-content\") pod \"community-operators-4b5mh\" (UID: \"cd506070-c0f3-404f-9d20-fe9dd29cb86d\") " pod="openshift-marketplace/community-operators-4b5mh" Mar 13 14:54:43 crc kubenswrapper[4898]: I0313 14:54:43.270366 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd506070-c0f3-404f-9d20-fe9dd29cb86d-utilities\") pod \"community-operators-4b5mh\" (UID: \"cd506070-c0f3-404f-9d20-fe9dd29cb86d\") " pod="openshift-marketplace/community-operators-4b5mh" Mar 13 14:54:43 crc kubenswrapper[4898]: I0313 14:54:43.270160 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd506070-c0f3-404f-9d20-fe9dd29cb86d-utilities\") pod \"community-operators-4b5mh\" (UID: \"cd506070-c0f3-404f-9d20-fe9dd29cb86d\") " pod="openshift-marketplace/community-operators-4b5mh" Mar 13 14:54:43 crc kubenswrapper[4898]: I0313 14:54:43.271549 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcsnj\" (UniqueName: \"kubernetes.io/projected/cd506070-c0f3-404f-9d20-fe9dd29cb86d-kube-api-access-wcsnj\") pod \"community-operators-4b5mh\" (UID: \"cd506070-c0f3-404f-9d20-fe9dd29cb86d\") " pod="openshift-marketplace/community-operators-4b5mh" Mar 13 14:54:43 crc kubenswrapper[4898]: I0313 14:54:43.294725 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcsnj\" 
(UniqueName: \"kubernetes.io/projected/cd506070-c0f3-404f-9d20-fe9dd29cb86d-kube-api-access-wcsnj\") pod \"community-operators-4b5mh\" (UID: \"cd506070-c0f3-404f-9d20-fe9dd29cb86d\") " pod="openshift-marketplace/community-operators-4b5mh" Mar 13 14:54:43 crc kubenswrapper[4898]: I0313 14:54:43.307998 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4b5mh" Mar 13 14:54:43 crc kubenswrapper[4898]: I0313 14:54:43.865049 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4b5mh"] Mar 13 14:54:44 crc kubenswrapper[4898]: I0313 14:54:44.206491 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kg5bc" podUID="2447a834-934b-4e95-a373-2f98aa976716" containerName="registry-server" probeResult="failure" output=< Mar 13 14:54:44 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 14:54:44 crc kubenswrapper[4898]: > Mar 13 14:54:44 crc kubenswrapper[4898]: I0313 14:54:44.452923 4898 generic.go:334] "Generic (PLEG): container finished" podID="cd506070-c0f3-404f-9d20-fe9dd29cb86d" containerID="b2cddcf430a5d8bfc6ff84fa02a0b4ab44edbd4aad2d2e2fd193c7067b3a67f9" exitCode=0 Mar 13 14:54:44 crc kubenswrapper[4898]: I0313 14:54:44.453013 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4b5mh" event={"ID":"cd506070-c0f3-404f-9d20-fe9dd29cb86d","Type":"ContainerDied","Data":"b2cddcf430a5d8bfc6ff84fa02a0b4ab44edbd4aad2d2e2fd193c7067b3a67f9"} Mar 13 14:54:44 crc kubenswrapper[4898]: I0313 14:54:44.453263 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4b5mh" event={"ID":"cd506070-c0f3-404f-9d20-fe9dd29cb86d","Type":"ContainerStarted","Data":"923a3f0dc6bb0acd6ca081740083507d0787a55d0525314456f8d61b9d354939"} Mar 13 14:54:45 crc kubenswrapper[4898]: I0313 14:54:45.467259 4898 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4b5mh" event={"ID":"cd506070-c0f3-404f-9d20-fe9dd29cb86d","Type":"ContainerStarted","Data":"e32e7cc94c1deeabc6f4817c936e65f006d0e3daea1cc97a8088c99bc717b8d8"} Mar 13 14:54:47 crc kubenswrapper[4898]: I0313 14:54:47.502651 4898 generic.go:334] "Generic (PLEG): container finished" podID="cd506070-c0f3-404f-9d20-fe9dd29cb86d" containerID="e32e7cc94c1deeabc6f4817c936e65f006d0e3daea1cc97a8088c99bc717b8d8" exitCode=0 Mar 13 14:54:47 crc kubenswrapper[4898]: I0313 14:54:47.502745 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4b5mh" event={"ID":"cd506070-c0f3-404f-9d20-fe9dd29cb86d","Type":"ContainerDied","Data":"e32e7cc94c1deeabc6f4817c936e65f006d0e3daea1cc97a8088c99bc717b8d8"} Mar 13 14:54:48 crc kubenswrapper[4898]: I0313 14:54:48.516667 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4b5mh" event={"ID":"cd506070-c0f3-404f-9d20-fe9dd29cb86d","Type":"ContainerStarted","Data":"5dfed7035501ce3768e690b6d346aa74c4e01f07dd784c02a16c24bb8ef0bb4c"} Mar 13 14:54:48 crc kubenswrapper[4898]: I0313 14:54:48.542325 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4b5mh" podStartSLOduration=2.83453774 podStartE2EDuration="6.542304109s" podCreationTimestamp="2026-03-13 14:54:42 +0000 UTC" firstStartedPulling="2026-03-13 14:54:44.456064264 +0000 UTC m=+3519.457652513" lastFinishedPulling="2026-03-13 14:54:48.163830633 +0000 UTC m=+3523.165418882" observedRunningTime="2026-03-13 14:54:48.537307445 +0000 UTC m=+3523.538895704" watchObservedRunningTime="2026-03-13 14:54:48.542304109 +0000 UTC m=+3523.543892348" Mar 13 14:54:53 crc kubenswrapper[4898]: I0313 14:54:53.309187 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4b5mh" Mar 13 14:54:53 crc 
kubenswrapper[4898]: I0313 14:54:53.309727 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4b5mh" Mar 13 14:54:53 crc kubenswrapper[4898]: I0313 14:54:53.383157 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4b5mh" Mar 13 14:54:53 crc kubenswrapper[4898]: I0313 14:54:53.632418 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4b5mh" Mar 13 14:54:54 crc kubenswrapper[4898]: I0313 14:54:54.251336 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kg5bc" podUID="2447a834-934b-4e95-a373-2f98aa976716" containerName="registry-server" probeResult="failure" output=< Mar 13 14:54:54 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 14:54:54 crc kubenswrapper[4898]: > Mar 13 14:54:54 crc kubenswrapper[4898]: I0313 14:54:54.545916 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4b5mh"] Mar 13 14:54:55 crc kubenswrapper[4898]: I0313 14:54:55.620436 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4b5mh" podUID="cd506070-c0f3-404f-9d20-fe9dd29cb86d" containerName="registry-server" containerID="cri-o://5dfed7035501ce3768e690b6d346aa74c4e01f07dd784c02a16c24bb8ef0bb4c" gracePeriod=2 Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.219380 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4b5mh" Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.393102 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcsnj\" (UniqueName: \"kubernetes.io/projected/cd506070-c0f3-404f-9d20-fe9dd29cb86d-kube-api-access-wcsnj\") pod \"cd506070-c0f3-404f-9d20-fe9dd29cb86d\" (UID: \"cd506070-c0f3-404f-9d20-fe9dd29cb86d\") " Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.393542 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd506070-c0f3-404f-9d20-fe9dd29cb86d-catalog-content\") pod \"cd506070-c0f3-404f-9d20-fe9dd29cb86d\" (UID: \"cd506070-c0f3-404f-9d20-fe9dd29cb86d\") " Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.393681 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd506070-c0f3-404f-9d20-fe9dd29cb86d-utilities\") pod \"cd506070-c0f3-404f-9d20-fe9dd29cb86d\" (UID: \"cd506070-c0f3-404f-9d20-fe9dd29cb86d\") " Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.394886 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd506070-c0f3-404f-9d20-fe9dd29cb86d-utilities" (OuterVolumeSpecName: "utilities") pod "cd506070-c0f3-404f-9d20-fe9dd29cb86d" (UID: "cd506070-c0f3-404f-9d20-fe9dd29cb86d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.406950 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd506070-c0f3-404f-9d20-fe9dd29cb86d-kube-api-access-wcsnj" (OuterVolumeSpecName: "kube-api-access-wcsnj") pod "cd506070-c0f3-404f-9d20-fe9dd29cb86d" (UID: "cd506070-c0f3-404f-9d20-fe9dd29cb86d"). InnerVolumeSpecName "kube-api-access-wcsnj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.452678 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd506070-c0f3-404f-9d20-fe9dd29cb86d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cd506070-c0f3-404f-9d20-fe9dd29cb86d" (UID: "cd506070-c0f3-404f-9d20-fe9dd29cb86d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.496321 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcsnj\" (UniqueName: \"kubernetes.io/projected/cd506070-c0f3-404f-9d20-fe9dd29cb86d-kube-api-access-wcsnj\") on node \"crc\" DevicePath \"\"" Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.496550 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd506070-c0f3-404f-9d20-fe9dd29cb86d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.496583 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd506070-c0f3-404f-9d20-fe9dd29cb86d-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.631533 4898 generic.go:334] "Generic (PLEG): container finished" podID="cd506070-c0f3-404f-9d20-fe9dd29cb86d" containerID="5dfed7035501ce3768e690b6d346aa74c4e01f07dd784c02a16c24bb8ef0bb4c" exitCode=0 Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.632602 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4b5mh" event={"ID":"cd506070-c0f3-404f-9d20-fe9dd29cb86d","Type":"ContainerDied","Data":"5dfed7035501ce3768e690b6d346aa74c4e01f07dd784c02a16c24bb8ef0bb4c"} Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.632694 4898 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-4b5mh" event={"ID":"cd506070-c0f3-404f-9d20-fe9dd29cb86d","Type":"ContainerDied","Data":"923a3f0dc6bb0acd6ca081740083507d0787a55d0525314456f8d61b9d354939"} Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.632765 4898 scope.go:117] "RemoveContainer" containerID="5dfed7035501ce3768e690b6d346aa74c4e01f07dd784c02a16c24bb8ef0bb4c" Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.632954 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4b5mh" Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.678152 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4b5mh"] Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.682421 4898 scope.go:117] "RemoveContainer" containerID="e32e7cc94c1deeabc6f4817c936e65f006d0e3daea1cc97a8088c99bc717b8d8" Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.687913 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4b5mh"] Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.710070 4898 scope.go:117] "RemoveContainer" containerID="b2cddcf430a5d8bfc6ff84fa02a0b4ab44edbd4aad2d2e2fd193c7067b3a67f9" Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.782792 4898 scope.go:117] "RemoveContainer" containerID="5dfed7035501ce3768e690b6d346aa74c4e01f07dd784c02a16c24bb8ef0bb4c" Mar 13 14:54:56 crc kubenswrapper[4898]: E0313 14:54:56.783302 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dfed7035501ce3768e690b6d346aa74c4e01f07dd784c02a16c24bb8ef0bb4c\": container with ID starting with 5dfed7035501ce3768e690b6d346aa74c4e01f07dd784c02a16c24bb8ef0bb4c not found: ID does not exist" containerID="5dfed7035501ce3768e690b6d346aa74c4e01f07dd784c02a16c24bb8ef0bb4c" Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 
14:54:56.783383 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dfed7035501ce3768e690b6d346aa74c4e01f07dd784c02a16c24bb8ef0bb4c"} err="failed to get container status \"5dfed7035501ce3768e690b6d346aa74c4e01f07dd784c02a16c24bb8ef0bb4c\": rpc error: code = NotFound desc = could not find container \"5dfed7035501ce3768e690b6d346aa74c4e01f07dd784c02a16c24bb8ef0bb4c\": container with ID starting with 5dfed7035501ce3768e690b6d346aa74c4e01f07dd784c02a16c24bb8ef0bb4c not found: ID does not exist" Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.783409 4898 scope.go:117] "RemoveContainer" containerID="e32e7cc94c1deeabc6f4817c936e65f006d0e3daea1cc97a8088c99bc717b8d8" Mar 13 14:54:56 crc kubenswrapper[4898]: E0313 14:54:56.783700 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e32e7cc94c1deeabc6f4817c936e65f006d0e3daea1cc97a8088c99bc717b8d8\": container with ID starting with e32e7cc94c1deeabc6f4817c936e65f006d0e3daea1cc97a8088c99bc717b8d8 not found: ID does not exist" containerID="e32e7cc94c1deeabc6f4817c936e65f006d0e3daea1cc97a8088c99bc717b8d8" Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.783718 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e32e7cc94c1deeabc6f4817c936e65f006d0e3daea1cc97a8088c99bc717b8d8"} err="failed to get container status \"e32e7cc94c1deeabc6f4817c936e65f006d0e3daea1cc97a8088c99bc717b8d8\": rpc error: code = NotFound desc = could not find container \"e32e7cc94c1deeabc6f4817c936e65f006d0e3daea1cc97a8088c99bc717b8d8\": container with ID starting with e32e7cc94c1deeabc6f4817c936e65f006d0e3daea1cc97a8088c99bc717b8d8 not found: ID does not exist" Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.783730 4898 scope.go:117] "RemoveContainer" containerID="b2cddcf430a5d8bfc6ff84fa02a0b4ab44edbd4aad2d2e2fd193c7067b3a67f9" Mar 13 14:54:56 crc 
kubenswrapper[4898]: E0313 14:54:56.785105 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2cddcf430a5d8bfc6ff84fa02a0b4ab44edbd4aad2d2e2fd193c7067b3a67f9\": container with ID starting with b2cddcf430a5d8bfc6ff84fa02a0b4ab44edbd4aad2d2e2fd193c7067b3a67f9 not found: ID does not exist" containerID="b2cddcf430a5d8bfc6ff84fa02a0b4ab44edbd4aad2d2e2fd193c7067b3a67f9" Mar 13 14:54:56 crc kubenswrapper[4898]: I0313 14:54:56.785166 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2cddcf430a5d8bfc6ff84fa02a0b4ab44edbd4aad2d2e2fd193c7067b3a67f9"} err="failed to get container status \"b2cddcf430a5d8bfc6ff84fa02a0b4ab44edbd4aad2d2e2fd193c7067b3a67f9\": rpc error: code = NotFound desc = could not find container \"b2cddcf430a5d8bfc6ff84fa02a0b4ab44edbd4aad2d2e2fd193c7067b3a67f9\": container with ID starting with b2cddcf430a5d8bfc6ff84fa02a0b4ab44edbd4aad2d2e2fd193c7067b3a67f9 not found: ID does not exist" Mar 13 14:54:57 crc kubenswrapper[4898]: I0313 14:54:57.757747 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd506070-c0f3-404f-9d20-fe9dd29cb86d" path="/var/lib/kubelet/pods/cd506070-c0f3-404f-9d20-fe9dd29cb86d/volumes" Mar 13 14:55:03 crc kubenswrapper[4898]: I0313 14:55:03.224593 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kg5bc" Mar 13 14:55:03 crc kubenswrapper[4898]: I0313 14:55:03.294152 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kg5bc" Mar 13 14:55:04 crc kubenswrapper[4898]: I0313 14:55:04.007340 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kg5bc"] Mar 13 14:55:04 crc kubenswrapper[4898]: I0313 14:55:04.731013 4898 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-kg5bc" podUID="2447a834-934b-4e95-a373-2f98aa976716" containerName="registry-server" containerID="cri-o://89cb8536dc0d3dd98ed9fc9c57ad307ad0709d2f015c3a7da74c48cf64ad5a80" gracePeriod=2 Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.217096 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kg5bc" Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.273738 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2447a834-934b-4e95-a373-2f98aa976716-catalog-content\") pod \"2447a834-934b-4e95-a373-2f98aa976716\" (UID: \"2447a834-934b-4e95-a373-2f98aa976716\") " Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.273876 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfgvn\" (UniqueName: \"kubernetes.io/projected/2447a834-934b-4e95-a373-2f98aa976716-kube-api-access-dfgvn\") pod \"2447a834-934b-4e95-a373-2f98aa976716\" (UID: \"2447a834-934b-4e95-a373-2f98aa976716\") " Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.279413 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2447a834-934b-4e95-a373-2f98aa976716-kube-api-access-dfgvn" (OuterVolumeSpecName: "kube-api-access-dfgvn") pod "2447a834-934b-4e95-a373-2f98aa976716" (UID: "2447a834-934b-4e95-a373-2f98aa976716"). InnerVolumeSpecName "kube-api-access-dfgvn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.375881 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2447a834-934b-4e95-a373-2f98aa976716-utilities\") pod \"2447a834-934b-4e95-a373-2f98aa976716\" (UID: \"2447a834-934b-4e95-a373-2f98aa976716\") " Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.376575 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfgvn\" (UniqueName: \"kubernetes.io/projected/2447a834-934b-4e95-a373-2f98aa976716-kube-api-access-dfgvn\") on node \"crc\" DevicePath \"\"" Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.376686 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2447a834-934b-4e95-a373-2f98aa976716-utilities" (OuterVolumeSpecName: "utilities") pod "2447a834-934b-4e95-a373-2f98aa976716" (UID: "2447a834-934b-4e95-a373-2f98aa976716"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.415615 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2447a834-934b-4e95-a373-2f98aa976716-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2447a834-934b-4e95-a373-2f98aa976716" (UID: "2447a834-934b-4e95-a373-2f98aa976716"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.486410 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2447a834-934b-4e95-a373-2f98aa976716-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.486446 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2447a834-934b-4e95-a373-2f98aa976716-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.744594 4898 generic.go:334] "Generic (PLEG): container finished" podID="2447a834-934b-4e95-a373-2f98aa976716" containerID="89cb8536dc0d3dd98ed9fc9c57ad307ad0709d2f015c3a7da74c48cf64ad5a80" exitCode=0 Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.749967 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kg5bc" Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.760357 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kg5bc" event={"ID":"2447a834-934b-4e95-a373-2f98aa976716","Type":"ContainerDied","Data":"89cb8536dc0d3dd98ed9fc9c57ad307ad0709d2f015c3a7da74c48cf64ad5a80"} Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.760433 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kg5bc" event={"ID":"2447a834-934b-4e95-a373-2f98aa976716","Type":"ContainerDied","Data":"477d5672b30264827c27894fe05766aedc024459c1dfcbc7f3e7ee7ffe289e7d"} Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.760480 4898 scope.go:117] "RemoveContainer" containerID="89cb8536dc0d3dd98ed9fc9c57ad307ad0709d2f015c3a7da74c48cf64ad5a80" Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.804184 4898 scope.go:117] "RemoveContainer" 
containerID="f6b89b05518578a7184c50d07ceb43b7188c31361928503d2b5c6491b1d4f558" Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.847497 4898 scope.go:117] "RemoveContainer" containerID="211c6e55c9c729300f3eeeaf49cc57df8d6100a6437b61cd51516a756fb14922" Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.848664 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kg5bc"] Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.862791 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kg5bc"] Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.896409 4898 scope.go:117] "RemoveContainer" containerID="89cb8536dc0d3dd98ed9fc9c57ad307ad0709d2f015c3a7da74c48cf64ad5a80" Mar 13 14:55:05 crc kubenswrapper[4898]: E0313 14:55:05.896779 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89cb8536dc0d3dd98ed9fc9c57ad307ad0709d2f015c3a7da74c48cf64ad5a80\": container with ID starting with 89cb8536dc0d3dd98ed9fc9c57ad307ad0709d2f015c3a7da74c48cf64ad5a80 not found: ID does not exist" containerID="89cb8536dc0d3dd98ed9fc9c57ad307ad0709d2f015c3a7da74c48cf64ad5a80" Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.896814 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89cb8536dc0d3dd98ed9fc9c57ad307ad0709d2f015c3a7da74c48cf64ad5a80"} err="failed to get container status \"89cb8536dc0d3dd98ed9fc9c57ad307ad0709d2f015c3a7da74c48cf64ad5a80\": rpc error: code = NotFound desc = could not find container \"89cb8536dc0d3dd98ed9fc9c57ad307ad0709d2f015c3a7da74c48cf64ad5a80\": container with ID starting with 89cb8536dc0d3dd98ed9fc9c57ad307ad0709d2f015c3a7da74c48cf64ad5a80 not found: ID does not exist" Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.896837 4898 scope.go:117] "RemoveContainer" 
containerID="f6b89b05518578a7184c50d07ceb43b7188c31361928503d2b5c6491b1d4f558" Mar 13 14:55:05 crc kubenswrapper[4898]: E0313 14:55:05.897068 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6b89b05518578a7184c50d07ceb43b7188c31361928503d2b5c6491b1d4f558\": container with ID starting with f6b89b05518578a7184c50d07ceb43b7188c31361928503d2b5c6491b1d4f558 not found: ID does not exist" containerID="f6b89b05518578a7184c50d07ceb43b7188c31361928503d2b5c6491b1d4f558" Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.897086 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6b89b05518578a7184c50d07ceb43b7188c31361928503d2b5c6491b1d4f558"} err="failed to get container status \"f6b89b05518578a7184c50d07ceb43b7188c31361928503d2b5c6491b1d4f558\": rpc error: code = NotFound desc = could not find container \"f6b89b05518578a7184c50d07ceb43b7188c31361928503d2b5c6491b1d4f558\": container with ID starting with f6b89b05518578a7184c50d07ceb43b7188c31361928503d2b5c6491b1d4f558 not found: ID does not exist" Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.897097 4898 scope.go:117] "RemoveContainer" containerID="211c6e55c9c729300f3eeeaf49cc57df8d6100a6437b61cd51516a756fb14922" Mar 13 14:55:05 crc kubenswrapper[4898]: E0313 14:55:05.898436 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"211c6e55c9c729300f3eeeaf49cc57df8d6100a6437b61cd51516a756fb14922\": container with ID starting with 211c6e55c9c729300f3eeeaf49cc57df8d6100a6437b61cd51516a756fb14922 not found: ID does not exist" containerID="211c6e55c9c729300f3eeeaf49cc57df8d6100a6437b61cd51516a756fb14922" Mar 13 14:55:05 crc kubenswrapper[4898]: I0313 14:55:05.898456 4898 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"211c6e55c9c729300f3eeeaf49cc57df8d6100a6437b61cd51516a756fb14922"} err="failed to get container status \"211c6e55c9c729300f3eeeaf49cc57df8d6100a6437b61cd51516a756fb14922\": rpc error: code = NotFound desc = could not find container \"211c6e55c9c729300f3eeeaf49cc57df8d6100a6437b61cd51516a756fb14922\": container with ID starting with 211c6e55c9c729300f3eeeaf49cc57df8d6100a6437b61cd51516a756fb14922 not found: ID does not exist" Mar 13 14:55:07 crc kubenswrapper[4898]: I0313 14:55:07.766595 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2447a834-934b-4e95-a373-2f98aa976716" path="/var/lib/kubelet/pods/2447a834-934b-4e95-a373-2f98aa976716/volumes" Mar 13 14:55:13 crc kubenswrapper[4898]: E0313 14:55:13.806561 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice/crio-477d5672b30264827c27894fe05766aedc024459c1dfcbc7f3e7ee7ffe289e7d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice\": RecentStats: unable to find data in memory cache]" Mar 13 14:55:14 crc kubenswrapper[4898]: E0313 14:55:14.997973 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice/crio-477d5672b30264827c27894fe05766aedc024459c1dfcbc7f3e7ee7ffe289e7d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice\": RecentStats: unable to find data in memory cache]" Mar 13 14:55:25 crc kubenswrapper[4898]: E0313 14:55:25.343364 4898 cadvisor_stats_provider.go:516] "Partial failure issuing 
cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice/crio-477d5672b30264827c27894fe05766aedc024459c1dfcbc7f3e7ee7ffe289e7d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice\": RecentStats: unable to find data in memory cache]" Mar 13 14:55:28 crc kubenswrapper[4898]: I0313 14:55:28.507551 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sq9wv"] Mar 13 14:55:28 crc kubenswrapper[4898]: E0313 14:55:28.509743 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2447a834-934b-4e95-a373-2f98aa976716" containerName="extract-utilities" Mar 13 14:55:28 crc kubenswrapper[4898]: I0313 14:55:28.509797 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="2447a834-934b-4e95-a373-2f98aa976716" containerName="extract-utilities" Mar 13 14:55:28 crc kubenswrapper[4898]: E0313 14:55:28.509883 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2447a834-934b-4e95-a373-2f98aa976716" containerName="registry-server" Mar 13 14:55:28 crc kubenswrapper[4898]: I0313 14:55:28.509977 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="2447a834-934b-4e95-a373-2f98aa976716" containerName="registry-server" Mar 13 14:55:28 crc kubenswrapper[4898]: E0313 14:55:28.510015 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd506070-c0f3-404f-9d20-fe9dd29cb86d" containerName="extract-content" Mar 13 14:55:28 crc kubenswrapper[4898]: I0313 14:55:28.510034 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd506070-c0f3-404f-9d20-fe9dd29cb86d" containerName="extract-content" Mar 13 14:55:28 crc kubenswrapper[4898]: E0313 14:55:28.510093 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2447a834-934b-4e95-a373-2f98aa976716" 
containerName="extract-content" Mar 13 14:55:28 crc kubenswrapper[4898]: I0313 14:55:28.510114 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="2447a834-934b-4e95-a373-2f98aa976716" containerName="extract-content" Mar 13 14:55:28 crc kubenswrapper[4898]: E0313 14:55:28.510140 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd506070-c0f3-404f-9d20-fe9dd29cb86d" containerName="registry-server" Mar 13 14:55:28 crc kubenswrapper[4898]: I0313 14:55:28.510157 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd506070-c0f3-404f-9d20-fe9dd29cb86d" containerName="registry-server" Mar 13 14:55:28 crc kubenswrapper[4898]: E0313 14:55:28.510206 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd506070-c0f3-404f-9d20-fe9dd29cb86d" containerName="extract-utilities" Mar 13 14:55:28 crc kubenswrapper[4898]: I0313 14:55:28.510226 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd506070-c0f3-404f-9d20-fe9dd29cb86d" containerName="extract-utilities" Mar 13 14:55:28 crc kubenswrapper[4898]: I0313 14:55:28.510861 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd506070-c0f3-404f-9d20-fe9dd29cb86d" containerName="registry-server" Mar 13 14:55:28 crc kubenswrapper[4898]: I0313 14:55:28.510971 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="2447a834-934b-4e95-a373-2f98aa976716" containerName="registry-server" Mar 13 14:55:28 crc kubenswrapper[4898]: I0313 14:55:28.515428 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sq9wv" Mar 13 14:55:28 crc kubenswrapper[4898]: I0313 14:55:28.524216 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sq9wv"] Mar 13 14:55:28 crc kubenswrapper[4898]: E0313 14:55:28.594787 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice/crio-477d5672b30264827c27894fe05766aedc024459c1dfcbc7f3e7ee7ffe289e7d\": RecentStats: unable to find data in memory cache]" Mar 13 14:55:28 crc kubenswrapper[4898]: I0313 14:55:28.641496 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e00d9a59-c597-436d-ab88-9b3ecdf169f5-utilities\") pod \"redhat-marketplace-sq9wv\" (UID: \"e00d9a59-c597-436d-ab88-9b3ecdf169f5\") " pod="openshift-marketplace/redhat-marketplace-sq9wv" Mar 13 14:55:28 crc kubenswrapper[4898]: I0313 14:55:28.641590 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e00d9a59-c597-436d-ab88-9b3ecdf169f5-catalog-content\") pod \"redhat-marketplace-sq9wv\" (UID: \"e00d9a59-c597-436d-ab88-9b3ecdf169f5\") " pod="openshift-marketplace/redhat-marketplace-sq9wv" Mar 13 14:55:28 crc kubenswrapper[4898]: I0313 14:55:28.642044 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ghrw\" (UniqueName: \"kubernetes.io/projected/e00d9a59-c597-436d-ab88-9b3ecdf169f5-kube-api-access-5ghrw\") pod \"redhat-marketplace-sq9wv\" (UID: 
\"e00d9a59-c597-436d-ab88-9b3ecdf169f5\") " pod="openshift-marketplace/redhat-marketplace-sq9wv" Mar 13 14:55:28 crc kubenswrapper[4898]: I0313 14:55:28.744152 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ghrw\" (UniqueName: \"kubernetes.io/projected/e00d9a59-c597-436d-ab88-9b3ecdf169f5-kube-api-access-5ghrw\") pod \"redhat-marketplace-sq9wv\" (UID: \"e00d9a59-c597-436d-ab88-9b3ecdf169f5\") " pod="openshift-marketplace/redhat-marketplace-sq9wv" Mar 13 14:55:28 crc kubenswrapper[4898]: I0313 14:55:28.744279 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e00d9a59-c597-436d-ab88-9b3ecdf169f5-utilities\") pod \"redhat-marketplace-sq9wv\" (UID: \"e00d9a59-c597-436d-ab88-9b3ecdf169f5\") " pod="openshift-marketplace/redhat-marketplace-sq9wv" Mar 13 14:55:28 crc kubenswrapper[4898]: I0313 14:55:28.744350 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e00d9a59-c597-436d-ab88-9b3ecdf169f5-catalog-content\") pod \"redhat-marketplace-sq9wv\" (UID: \"e00d9a59-c597-436d-ab88-9b3ecdf169f5\") " pod="openshift-marketplace/redhat-marketplace-sq9wv" Mar 13 14:55:28 crc kubenswrapper[4898]: I0313 14:55:28.744950 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e00d9a59-c597-436d-ab88-9b3ecdf169f5-catalog-content\") pod \"redhat-marketplace-sq9wv\" (UID: \"e00d9a59-c597-436d-ab88-9b3ecdf169f5\") " pod="openshift-marketplace/redhat-marketplace-sq9wv" Mar 13 14:55:28 crc kubenswrapper[4898]: I0313 14:55:28.745243 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e00d9a59-c597-436d-ab88-9b3ecdf169f5-utilities\") pod \"redhat-marketplace-sq9wv\" (UID: \"e00d9a59-c597-436d-ab88-9b3ecdf169f5\") " 
pod="openshift-marketplace/redhat-marketplace-sq9wv" Mar 13 14:55:28 crc kubenswrapper[4898]: I0313 14:55:28.765073 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ghrw\" (UniqueName: \"kubernetes.io/projected/e00d9a59-c597-436d-ab88-9b3ecdf169f5-kube-api-access-5ghrw\") pod \"redhat-marketplace-sq9wv\" (UID: \"e00d9a59-c597-436d-ab88-9b3ecdf169f5\") " pod="openshift-marketplace/redhat-marketplace-sq9wv" Mar 13 14:55:28 crc kubenswrapper[4898]: I0313 14:55:28.838433 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sq9wv" Mar 13 14:55:29 crc kubenswrapper[4898]: I0313 14:55:29.393652 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sq9wv"] Mar 13 14:55:30 crc kubenswrapper[4898]: I0313 14:55:30.113664 4898 generic.go:334] "Generic (PLEG): container finished" podID="e00d9a59-c597-436d-ab88-9b3ecdf169f5" containerID="49ffd373366e4f8d986751fcd6a355e9af8440adbb1ba7ed2ca3faee6acbbb10" exitCode=0 Mar 13 14:55:30 crc kubenswrapper[4898]: I0313 14:55:30.113931 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sq9wv" event={"ID":"e00d9a59-c597-436d-ab88-9b3ecdf169f5","Type":"ContainerDied","Data":"49ffd373366e4f8d986751fcd6a355e9af8440adbb1ba7ed2ca3faee6acbbb10"} Mar 13 14:55:30 crc kubenswrapper[4898]: I0313 14:55:30.114073 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sq9wv" event={"ID":"e00d9a59-c597-436d-ab88-9b3ecdf169f5","Type":"ContainerStarted","Data":"546d5d9a91a0f1ecdf4faca633fb37afd28045f8e171cb7c207f091fd2e86d03"} Mar 13 14:55:31 crc kubenswrapper[4898]: I0313 14:55:31.139788 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sq9wv" 
event={"ID":"e00d9a59-c597-436d-ab88-9b3ecdf169f5","Type":"ContainerStarted","Data":"d6b2845ad80174809092c4cf53d62a918ba53bbba71ab284849d74f5adfcc781"} Mar 13 14:55:32 crc kubenswrapper[4898]: I0313 14:55:32.158294 4898 generic.go:334] "Generic (PLEG): container finished" podID="e00d9a59-c597-436d-ab88-9b3ecdf169f5" containerID="d6b2845ad80174809092c4cf53d62a918ba53bbba71ab284849d74f5adfcc781" exitCode=0 Mar 13 14:55:32 crc kubenswrapper[4898]: I0313 14:55:32.158475 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sq9wv" event={"ID":"e00d9a59-c597-436d-ab88-9b3ecdf169f5","Type":"ContainerDied","Data":"d6b2845ad80174809092c4cf53d62a918ba53bbba71ab284849d74f5adfcc781"} Mar 13 14:55:33 crc kubenswrapper[4898]: I0313 14:55:33.172818 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sq9wv" event={"ID":"e00d9a59-c597-436d-ab88-9b3ecdf169f5","Type":"ContainerStarted","Data":"21ed19fb8cfdb4df85570465c522a878e49aeb5e6db3b2179ff9095fee8f08b5"} Mar 13 14:55:33 crc kubenswrapper[4898]: I0313 14:55:33.196539 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sq9wv" podStartSLOduration=2.636702494 podStartE2EDuration="5.196519284s" podCreationTimestamp="2026-03-13 14:55:28 +0000 UTC" firstStartedPulling="2026-03-13 14:55:30.117844886 +0000 UTC m=+3565.119433165" lastFinishedPulling="2026-03-13 14:55:32.677661676 +0000 UTC m=+3567.679249955" observedRunningTime="2026-03-13 14:55:33.191301044 +0000 UTC m=+3568.192889343" watchObservedRunningTime="2026-03-13 14:55:33.196519284 +0000 UTC m=+3568.198107533" Mar 13 14:55:35 crc kubenswrapper[4898]: E0313 14:55:35.758741 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice\": RecentStats: unable to find 
data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice/crio-477d5672b30264827c27894fe05766aedc024459c1dfcbc7f3e7ee7ffe289e7d\": RecentStats: unable to find data in memory cache]" Mar 13 14:55:36 crc kubenswrapper[4898]: E0313 14:55:36.533341 4898 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.201:49182->38.102.83.201:43395: write tcp 38.102.83.201:49182->38.102.83.201:43395: write: broken pipe Mar 13 14:55:38 crc kubenswrapper[4898]: I0313 14:55:38.839225 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sq9wv" Mar 13 14:55:38 crc kubenswrapper[4898]: I0313 14:55:38.839563 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sq9wv" Mar 13 14:55:38 crc kubenswrapper[4898]: I0313 14:55:38.930371 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sq9wv" Mar 13 14:55:39 crc kubenswrapper[4898]: I0313 14:55:39.355354 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sq9wv" Mar 13 14:55:39 crc kubenswrapper[4898]: I0313 14:55:39.428000 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sq9wv"] Mar 13 14:55:41 crc kubenswrapper[4898]: I0313 14:55:41.291334 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sq9wv" podUID="e00d9a59-c597-436d-ab88-9b3ecdf169f5" containerName="registry-server" containerID="cri-o://21ed19fb8cfdb4df85570465c522a878e49aeb5e6db3b2179ff9095fee8f08b5" gracePeriod=2 Mar 13 14:55:41 crc kubenswrapper[4898]: I0313 14:55:41.954568 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sq9wv" Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.085164 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ghrw\" (UniqueName: \"kubernetes.io/projected/e00d9a59-c597-436d-ab88-9b3ecdf169f5-kube-api-access-5ghrw\") pod \"e00d9a59-c597-436d-ab88-9b3ecdf169f5\" (UID: \"e00d9a59-c597-436d-ab88-9b3ecdf169f5\") " Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.085834 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e00d9a59-c597-436d-ab88-9b3ecdf169f5-utilities\") pod \"e00d9a59-c597-436d-ab88-9b3ecdf169f5\" (UID: \"e00d9a59-c597-436d-ab88-9b3ecdf169f5\") " Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.085891 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e00d9a59-c597-436d-ab88-9b3ecdf169f5-catalog-content\") pod \"e00d9a59-c597-436d-ab88-9b3ecdf169f5\" (UID: \"e00d9a59-c597-436d-ab88-9b3ecdf169f5\") " Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.087048 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e00d9a59-c597-436d-ab88-9b3ecdf169f5-utilities" (OuterVolumeSpecName: "utilities") pod "e00d9a59-c597-436d-ab88-9b3ecdf169f5" (UID: "e00d9a59-c597-436d-ab88-9b3ecdf169f5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.087516 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e00d9a59-c597-436d-ab88-9b3ecdf169f5-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.091742 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e00d9a59-c597-436d-ab88-9b3ecdf169f5-kube-api-access-5ghrw" (OuterVolumeSpecName: "kube-api-access-5ghrw") pod "e00d9a59-c597-436d-ab88-9b3ecdf169f5" (UID: "e00d9a59-c597-436d-ab88-9b3ecdf169f5"). InnerVolumeSpecName "kube-api-access-5ghrw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.112043 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e00d9a59-c597-436d-ab88-9b3ecdf169f5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e00d9a59-c597-436d-ab88-9b3ecdf169f5" (UID: "e00d9a59-c597-436d-ab88-9b3ecdf169f5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.189875 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ghrw\" (UniqueName: \"kubernetes.io/projected/e00d9a59-c597-436d-ab88-9b3ecdf169f5-kube-api-access-5ghrw\") on node \"crc\" DevicePath \"\"" Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.189937 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e00d9a59-c597-436d-ab88-9b3ecdf169f5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.306737 4898 generic.go:334] "Generic (PLEG): container finished" podID="e00d9a59-c597-436d-ab88-9b3ecdf169f5" containerID="21ed19fb8cfdb4df85570465c522a878e49aeb5e6db3b2179ff9095fee8f08b5" exitCode=0 Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.306812 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sq9wv" event={"ID":"e00d9a59-c597-436d-ab88-9b3ecdf169f5","Type":"ContainerDied","Data":"21ed19fb8cfdb4df85570465c522a878e49aeb5e6db3b2179ff9095fee8f08b5"} Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.306844 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sq9wv" event={"ID":"e00d9a59-c597-436d-ab88-9b3ecdf169f5","Type":"ContainerDied","Data":"546d5d9a91a0f1ecdf4faca633fb37afd28045f8e171cb7c207f091fd2e86d03"} Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.306867 4898 scope.go:117] "RemoveContainer" containerID="21ed19fb8cfdb4df85570465c522a878e49aeb5e6db3b2179ff9095fee8f08b5" Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.309190 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sq9wv" Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.341320 4898 scope.go:117] "RemoveContainer" containerID="d6b2845ad80174809092c4cf53d62a918ba53bbba71ab284849d74f5adfcc781" Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.368306 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sq9wv"] Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.378873 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sq9wv"] Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.391363 4898 scope.go:117] "RemoveContainer" containerID="49ffd373366e4f8d986751fcd6a355e9af8440adbb1ba7ed2ca3faee6acbbb10" Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.467892 4898 scope.go:117] "RemoveContainer" containerID="21ed19fb8cfdb4df85570465c522a878e49aeb5e6db3b2179ff9095fee8f08b5" Mar 13 14:55:42 crc kubenswrapper[4898]: E0313 14:55:42.468637 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21ed19fb8cfdb4df85570465c522a878e49aeb5e6db3b2179ff9095fee8f08b5\": container with ID starting with 21ed19fb8cfdb4df85570465c522a878e49aeb5e6db3b2179ff9095fee8f08b5 not found: ID does not exist" containerID="21ed19fb8cfdb4df85570465c522a878e49aeb5e6db3b2179ff9095fee8f08b5" Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.468841 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21ed19fb8cfdb4df85570465c522a878e49aeb5e6db3b2179ff9095fee8f08b5"} err="failed to get container status \"21ed19fb8cfdb4df85570465c522a878e49aeb5e6db3b2179ff9095fee8f08b5\": rpc error: code = NotFound desc = could not find container \"21ed19fb8cfdb4df85570465c522a878e49aeb5e6db3b2179ff9095fee8f08b5\": container with ID starting with 21ed19fb8cfdb4df85570465c522a878e49aeb5e6db3b2179ff9095fee8f08b5 not found: 
ID does not exist" Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.469107 4898 scope.go:117] "RemoveContainer" containerID="d6b2845ad80174809092c4cf53d62a918ba53bbba71ab284849d74f5adfcc781" Mar 13 14:55:42 crc kubenswrapper[4898]: E0313 14:55:42.469969 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6b2845ad80174809092c4cf53d62a918ba53bbba71ab284849d74f5adfcc781\": container with ID starting with d6b2845ad80174809092c4cf53d62a918ba53bbba71ab284849d74f5adfcc781 not found: ID does not exist" containerID="d6b2845ad80174809092c4cf53d62a918ba53bbba71ab284849d74f5adfcc781" Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.470033 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6b2845ad80174809092c4cf53d62a918ba53bbba71ab284849d74f5adfcc781"} err="failed to get container status \"d6b2845ad80174809092c4cf53d62a918ba53bbba71ab284849d74f5adfcc781\": rpc error: code = NotFound desc = could not find container \"d6b2845ad80174809092c4cf53d62a918ba53bbba71ab284849d74f5adfcc781\": container with ID starting with d6b2845ad80174809092c4cf53d62a918ba53bbba71ab284849d74f5adfcc781 not found: ID does not exist" Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.470075 4898 scope.go:117] "RemoveContainer" containerID="49ffd373366e4f8d986751fcd6a355e9af8440adbb1ba7ed2ca3faee6acbbb10" Mar 13 14:55:42 crc kubenswrapper[4898]: E0313 14:55:42.470627 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49ffd373366e4f8d986751fcd6a355e9af8440adbb1ba7ed2ca3faee6acbbb10\": container with ID starting with 49ffd373366e4f8d986751fcd6a355e9af8440adbb1ba7ed2ca3faee6acbbb10 not found: ID does not exist" containerID="49ffd373366e4f8d986751fcd6a355e9af8440adbb1ba7ed2ca3faee6acbbb10" Mar 13 14:55:42 crc kubenswrapper[4898]: I0313 14:55:42.470811 4898 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49ffd373366e4f8d986751fcd6a355e9af8440adbb1ba7ed2ca3faee6acbbb10"} err="failed to get container status \"49ffd373366e4f8d986751fcd6a355e9af8440adbb1ba7ed2ca3faee6acbbb10\": rpc error: code = NotFound desc = could not find container \"49ffd373366e4f8d986751fcd6a355e9af8440adbb1ba7ed2ca3faee6acbbb10\": container with ID starting with 49ffd373366e4f8d986751fcd6a355e9af8440adbb1ba7ed2ca3faee6acbbb10 not found: ID does not exist" Mar 13 14:55:43 crc kubenswrapper[4898]: I0313 14:55:43.770807 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e00d9a59-c597-436d-ab88-9b3ecdf169f5" path="/var/lib/kubelet/pods/e00d9a59-c597-436d-ab88-9b3ecdf169f5/volumes" Mar 13 14:55:43 crc kubenswrapper[4898]: E0313 14:55:43.814168 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice/crio-477d5672b30264827c27894fe05766aedc024459c1dfcbc7f3e7ee7ffe289e7d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice\": RecentStats: unable to find data in memory cache]" Mar 13 14:55:45 crc kubenswrapper[4898]: E0313 14:55:45.817798 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice/crio-477d5672b30264827c27894fe05766aedc024459c1dfcbc7f3e7ee7ffe289e7d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice\": RecentStats: unable to find data in memory cache]" Mar 13 14:55:48 crc kubenswrapper[4898]: E0313 14:55:48.104731 4898 cadvisor_stats_provider.go:516] "Partial 
failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice/crio-477d5672b30264827c27894fe05766aedc024459c1dfcbc7f3e7ee7ffe289e7d\": RecentStats: unable to find data in memory cache]" Mar 13 14:55:48 crc kubenswrapper[4898]: E0313 14:55:48.105236 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice/crio-477d5672b30264827c27894fe05766aedc024459c1dfcbc7f3e7ee7ffe289e7d\": RecentStats: unable to find data in memory cache]" Mar 13 14:55:49 crc kubenswrapper[4898]: I0313 14:55:49.135115 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:55:49 crc kubenswrapper[4898]: I0313 14:55:49.135566 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:55:56 crc kubenswrapper[4898]: E0313 14:55:56.223927 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice/crio-477d5672b30264827c27894fe05766aedc024459c1dfcbc7f3e7ee7ffe289e7d\": RecentStats: unable to find data in memory cache]" Mar 13 14:55:58 crc kubenswrapper[4898]: E0313 14:55:58.603521 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice/crio-477d5672b30264827c27894fe05766aedc024459c1dfcbc7f3e7ee7ffe289e7d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2447a834_934b_4e95_a373_2f98aa976716.slice\": RecentStats: unable to find data in memory cache]" Mar 13 14:56:00 crc kubenswrapper[4898]: I0313 14:56:00.157296 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556896-tbs8b"] Mar 13 14:56:00 crc kubenswrapper[4898]: E0313 14:56:00.158041 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e00d9a59-c597-436d-ab88-9b3ecdf169f5" containerName="registry-server" Mar 13 14:56:00 crc kubenswrapper[4898]: I0313 14:56:00.158055 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e00d9a59-c597-436d-ab88-9b3ecdf169f5" containerName="registry-server" Mar 13 14:56:00 crc kubenswrapper[4898]: E0313 14:56:00.158073 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e00d9a59-c597-436d-ab88-9b3ecdf169f5" containerName="extract-content" Mar 13 14:56:00 crc kubenswrapper[4898]: I0313 14:56:00.158079 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e00d9a59-c597-436d-ab88-9b3ecdf169f5" containerName="extract-content" Mar 13 14:56:00 crc kubenswrapper[4898]: E0313 14:56:00.158092 4898 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e00d9a59-c597-436d-ab88-9b3ecdf169f5" containerName="extract-utilities" Mar 13 14:56:00 crc kubenswrapper[4898]: I0313 14:56:00.158098 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e00d9a59-c597-436d-ab88-9b3ecdf169f5" containerName="extract-utilities" Mar 13 14:56:00 crc kubenswrapper[4898]: I0313 14:56:00.158315 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e00d9a59-c597-436d-ab88-9b3ecdf169f5" containerName="registry-server" Mar 13 14:56:00 crc kubenswrapper[4898]: I0313 14:56:00.159240 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556896-tbs8b" Mar 13 14:56:00 crc kubenswrapper[4898]: I0313 14:56:00.161143 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:56:00 crc kubenswrapper[4898]: I0313 14:56:00.162153 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:56:00 crc kubenswrapper[4898]: I0313 14:56:00.162353 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:56:00 crc kubenswrapper[4898]: I0313 14:56:00.171313 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556896-tbs8b"] Mar 13 14:56:00 crc kubenswrapper[4898]: I0313 14:56:00.249031 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh682\" (UniqueName: \"kubernetes.io/projected/ce9a8272-18eb-4001-a998-8e24fbe84593-kube-api-access-vh682\") pod \"auto-csr-approver-29556896-tbs8b\" (UID: \"ce9a8272-18eb-4001-a998-8e24fbe84593\") " pod="openshift-infra/auto-csr-approver-29556896-tbs8b" Mar 13 14:56:00 crc kubenswrapper[4898]: I0313 14:56:00.351716 4898 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-vh682\" (UniqueName: \"kubernetes.io/projected/ce9a8272-18eb-4001-a998-8e24fbe84593-kube-api-access-vh682\") pod \"auto-csr-approver-29556896-tbs8b\" (UID: \"ce9a8272-18eb-4001-a998-8e24fbe84593\") " pod="openshift-infra/auto-csr-approver-29556896-tbs8b" Mar 13 14:56:00 crc kubenswrapper[4898]: I0313 14:56:00.374113 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh682\" (UniqueName: \"kubernetes.io/projected/ce9a8272-18eb-4001-a998-8e24fbe84593-kube-api-access-vh682\") pod \"auto-csr-approver-29556896-tbs8b\" (UID: \"ce9a8272-18eb-4001-a998-8e24fbe84593\") " pod="openshift-infra/auto-csr-approver-29556896-tbs8b" Mar 13 14:56:00 crc kubenswrapper[4898]: I0313 14:56:00.508273 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556896-tbs8b" Mar 13 14:56:00 crc kubenswrapper[4898]: W0313 14:56:00.997062 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce9a8272_18eb_4001_a998_8e24fbe84593.slice/crio-e99538b0e68903326ecd6d3b3eec3f5b8f18c544cc397a0d3d212e983bd1d18f WatchSource:0}: Error finding container e99538b0e68903326ecd6d3b3eec3f5b8f18c544cc397a0d3d212e983bd1d18f: Status 404 returned error can't find the container with id e99538b0e68903326ecd6d3b3eec3f5b8f18c544cc397a0d3d212e983bd1d18f Mar 13 14:56:01 crc kubenswrapper[4898]: I0313 14:56:01.011626 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556896-tbs8b"] Mar 13 14:56:01 crc kubenswrapper[4898]: I0313 14:56:01.623066 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556896-tbs8b" event={"ID":"ce9a8272-18eb-4001-a998-8e24fbe84593","Type":"ContainerStarted","Data":"e99538b0e68903326ecd6d3b3eec3f5b8f18c544cc397a0d3d212e983bd1d18f"} Mar 13 14:56:02 crc kubenswrapper[4898]: I0313 14:56:02.655883 
4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556896-tbs8b" event={"ID":"ce9a8272-18eb-4001-a998-8e24fbe84593","Type":"ContainerStarted","Data":"9eb62522f49fe5dd1b7c8d52fa260ebf6b039aa00a9cd1719cb7535e8637b27e"} Mar 13 14:56:02 crc kubenswrapper[4898]: I0313 14:56:02.685660 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556896-tbs8b" podStartSLOduration=1.623458466 podStartE2EDuration="2.685639527s" podCreationTimestamp="2026-03-13 14:56:00 +0000 UTC" firstStartedPulling="2026-03-13 14:56:00.999482122 +0000 UTC m=+3596.001070391" lastFinishedPulling="2026-03-13 14:56:02.061663173 +0000 UTC m=+3597.063251452" observedRunningTime="2026-03-13 14:56:02.673927027 +0000 UTC m=+3597.675515286" watchObservedRunningTime="2026-03-13 14:56:02.685639527 +0000 UTC m=+3597.687227776" Mar 13 14:56:03 crc kubenswrapper[4898]: I0313 14:56:03.674080 4898 generic.go:334] "Generic (PLEG): container finished" podID="ce9a8272-18eb-4001-a998-8e24fbe84593" containerID="9eb62522f49fe5dd1b7c8d52fa260ebf6b039aa00a9cd1719cb7535e8637b27e" exitCode=0 Mar 13 14:56:03 crc kubenswrapper[4898]: I0313 14:56:03.674186 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556896-tbs8b" event={"ID":"ce9a8272-18eb-4001-a998-8e24fbe84593","Type":"ContainerDied","Data":"9eb62522f49fe5dd1b7c8d52fa260ebf6b039aa00a9cd1719cb7535e8637b27e"} Mar 13 14:56:05 crc kubenswrapper[4898]: I0313 14:56:05.166134 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556896-tbs8b" Mar 13 14:56:05 crc kubenswrapper[4898]: I0313 14:56:05.296643 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vh682\" (UniqueName: \"kubernetes.io/projected/ce9a8272-18eb-4001-a998-8e24fbe84593-kube-api-access-vh682\") pod \"ce9a8272-18eb-4001-a998-8e24fbe84593\" (UID: \"ce9a8272-18eb-4001-a998-8e24fbe84593\") " Mar 13 14:56:05 crc kubenswrapper[4898]: I0313 14:56:05.302247 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce9a8272-18eb-4001-a998-8e24fbe84593-kube-api-access-vh682" (OuterVolumeSpecName: "kube-api-access-vh682") pod "ce9a8272-18eb-4001-a998-8e24fbe84593" (UID: "ce9a8272-18eb-4001-a998-8e24fbe84593"). InnerVolumeSpecName "kube-api-access-vh682". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:56:05 crc kubenswrapper[4898]: I0313 14:56:05.401321 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vh682\" (UniqueName: \"kubernetes.io/projected/ce9a8272-18eb-4001-a998-8e24fbe84593-kube-api-access-vh682\") on node \"crc\" DevicePath \"\"" Mar 13 14:56:05 crc kubenswrapper[4898]: I0313 14:56:05.701689 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556896-tbs8b" event={"ID":"ce9a8272-18eb-4001-a998-8e24fbe84593","Type":"ContainerDied","Data":"e99538b0e68903326ecd6d3b3eec3f5b8f18c544cc397a0d3d212e983bd1d18f"} Mar 13 14:56:05 crc kubenswrapper[4898]: I0313 14:56:05.701729 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e99538b0e68903326ecd6d3b3eec3f5b8f18c544cc397a0d3d212e983bd1d18f" Mar 13 14:56:05 crc kubenswrapper[4898]: I0313 14:56:05.702104 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556896-tbs8b" Mar 13 14:56:05 crc kubenswrapper[4898]: I0313 14:56:05.778321 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556890-vl72z"] Mar 13 14:56:05 crc kubenswrapper[4898]: I0313 14:56:05.794508 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556890-vl72z"] Mar 13 14:56:07 crc kubenswrapper[4898]: I0313 14:56:07.778304 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5795e677-fa9a-4235-9a30-a040ac18eebd" path="/var/lib/kubelet/pods/5795e677-fa9a-4235-9a30-a040ac18eebd/volumes" Mar 13 14:56:19 crc kubenswrapper[4898]: I0313 14:56:19.137006 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:56:19 crc kubenswrapper[4898]: I0313 14:56:19.137525 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:56:35 crc kubenswrapper[4898]: E0313 14:56:35.240469 4898 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.201:59526->38.102.83.201:43395: write tcp 38.102.83.201:59526->38.102.83.201:43395: write: broken pipe Mar 13 14:56:42 crc kubenswrapper[4898]: I0313 14:56:42.281938 4898 scope.go:117] "RemoveContainer" containerID="5a392b03ee567f4c66c2416d5b9e2dae2dde45c43076100867e0284651bd3e3b" Mar 13 14:56:45 crc kubenswrapper[4898]: I0313 14:56:45.243927 4898 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-mlqzt"] Mar 13 14:56:45 crc kubenswrapper[4898]: E0313 14:56:45.245284 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce9a8272-18eb-4001-a998-8e24fbe84593" containerName="oc" Mar 13 14:56:45 crc kubenswrapper[4898]: I0313 14:56:45.245306 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce9a8272-18eb-4001-a998-8e24fbe84593" containerName="oc" Mar 13 14:56:45 crc kubenswrapper[4898]: I0313 14:56:45.245771 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce9a8272-18eb-4001-a998-8e24fbe84593" containerName="oc" Mar 13 14:56:45 crc kubenswrapper[4898]: I0313 14:56:45.250492 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mlqzt" Mar 13 14:56:45 crc kubenswrapper[4898]: I0313 14:56:45.260753 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mlqzt"] Mar 13 14:56:45 crc kubenswrapper[4898]: I0313 14:56:45.313019 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/363c1a15-d4ba-4ed1-bb98-74e3998bc48a-catalog-content\") pod \"certified-operators-mlqzt\" (UID: \"363c1a15-d4ba-4ed1-bb98-74e3998bc48a\") " pod="openshift-marketplace/certified-operators-mlqzt" Mar 13 14:56:45 crc kubenswrapper[4898]: I0313 14:56:45.313185 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6ghj\" (UniqueName: \"kubernetes.io/projected/363c1a15-d4ba-4ed1-bb98-74e3998bc48a-kube-api-access-z6ghj\") pod \"certified-operators-mlqzt\" (UID: \"363c1a15-d4ba-4ed1-bb98-74e3998bc48a\") " pod="openshift-marketplace/certified-operators-mlqzt" Mar 13 14:56:45 crc kubenswrapper[4898]: I0313 14:56:45.313328 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/363c1a15-d4ba-4ed1-bb98-74e3998bc48a-utilities\") pod \"certified-operators-mlqzt\" (UID: \"363c1a15-d4ba-4ed1-bb98-74e3998bc48a\") " pod="openshift-marketplace/certified-operators-mlqzt" Mar 13 14:56:45 crc kubenswrapper[4898]: I0313 14:56:45.415204 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/363c1a15-d4ba-4ed1-bb98-74e3998bc48a-utilities\") pod \"certified-operators-mlqzt\" (UID: \"363c1a15-d4ba-4ed1-bb98-74e3998bc48a\") " pod="openshift-marketplace/certified-operators-mlqzt" Mar 13 14:56:45 crc kubenswrapper[4898]: I0313 14:56:45.415276 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/363c1a15-d4ba-4ed1-bb98-74e3998bc48a-catalog-content\") pod \"certified-operators-mlqzt\" (UID: \"363c1a15-d4ba-4ed1-bb98-74e3998bc48a\") " pod="openshift-marketplace/certified-operators-mlqzt" Mar 13 14:56:45 crc kubenswrapper[4898]: I0313 14:56:45.415379 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6ghj\" (UniqueName: \"kubernetes.io/projected/363c1a15-d4ba-4ed1-bb98-74e3998bc48a-kube-api-access-z6ghj\") pod \"certified-operators-mlqzt\" (UID: \"363c1a15-d4ba-4ed1-bb98-74e3998bc48a\") " pod="openshift-marketplace/certified-operators-mlqzt" Mar 13 14:56:45 crc kubenswrapper[4898]: I0313 14:56:45.415747 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/363c1a15-d4ba-4ed1-bb98-74e3998bc48a-utilities\") pod \"certified-operators-mlqzt\" (UID: \"363c1a15-d4ba-4ed1-bb98-74e3998bc48a\") " pod="openshift-marketplace/certified-operators-mlqzt" Mar 13 14:56:45 crc kubenswrapper[4898]: I0313 14:56:45.415889 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/363c1a15-d4ba-4ed1-bb98-74e3998bc48a-catalog-content\") pod \"certified-operators-mlqzt\" (UID: \"363c1a15-d4ba-4ed1-bb98-74e3998bc48a\") " pod="openshift-marketplace/certified-operators-mlqzt" Mar 13 14:56:45 crc kubenswrapper[4898]: I0313 14:56:45.438079 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6ghj\" (UniqueName: \"kubernetes.io/projected/363c1a15-d4ba-4ed1-bb98-74e3998bc48a-kube-api-access-z6ghj\") pod \"certified-operators-mlqzt\" (UID: \"363c1a15-d4ba-4ed1-bb98-74e3998bc48a\") " pod="openshift-marketplace/certified-operators-mlqzt" Mar 13 14:56:45 crc kubenswrapper[4898]: I0313 14:56:45.595011 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mlqzt" Mar 13 14:56:46 crc kubenswrapper[4898]: I0313 14:56:46.115694 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mlqzt"] Mar 13 14:56:46 crc kubenswrapper[4898]: I0313 14:56:46.300912 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mlqzt" event={"ID":"363c1a15-d4ba-4ed1-bb98-74e3998bc48a","Type":"ContainerStarted","Data":"32e5424324d0692c123e1bbe46f2f60411b281878d1cfd5ff36cef1c344778f4"} Mar 13 14:56:47 crc kubenswrapper[4898]: I0313 14:56:47.317251 4898 generic.go:334] "Generic (PLEG): container finished" podID="363c1a15-d4ba-4ed1-bb98-74e3998bc48a" containerID="c895d748e4986f8c69701e3f78590d274b25b62eab1c689d80a555d48d6e360a" exitCode=0 Mar 13 14:56:47 crc kubenswrapper[4898]: I0313 14:56:47.317352 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mlqzt" event={"ID":"363c1a15-d4ba-4ed1-bb98-74e3998bc48a","Type":"ContainerDied","Data":"c895d748e4986f8c69701e3f78590d274b25b62eab1c689d80a555d48d6e360a"} Mar 13 14:56:49 crc kubenswrapper[4898]: I0313 14:56:49.134511 4898 patch_prober.go:28] interesting 
pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 14:56:49 crc kubenswrapper[4898]: I0313 14:56:49.134843 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 14:56:49 crc kubenswrapper[4898]: I0313 14:56:49.134967 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 14:56:49 crc kubenswrapper[4898]: I0313 14:56:49.135991 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da"} pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 14:56:49 crc kubenswrapper[4898]: I0313 14:56:49.136057 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" containerID="cri-o://e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" gracePeriod=600 Mar 13 14:56:49 crc kubenswrapper[4898]: E0313 14:56:49.307420 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:56:49 crc kubenswrapper[4898]: I0313 14:56:49.380623 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mlqzt" event={"ID":"363c1a15-d4ba-4ed1-bb98-74e3998bc48a","Type":"ContainerStarted","Data":"c2222921e4aa6a19737a0e506e57480b9be09544407d113c7cd9b13301b23181"} Mar 13 14:56:49 crc kubenswrapper[4898]: I0313 14:56:49.383794 4898 generic.go:334] "Generic (PLEG): container finished" podID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" exitCode=0 Mar 13 14:56:49 crc kubenswrapper[4898]: I0313 14:56:49.383827 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerDied","Data":"e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da"} Mar 13 14:56:49 crc kubenswrapper[4898]: I0313 14:56:49.383890 4898 scope.go:117] "RemoveContainer" containerID="14bad097122012834e04d7733e2c56b7cb22ecdcbbe4dbb2cc5f1822098d6e46" Mar 13 14:56:49 crc kubenswrapper[4898]: I0313 14:56:49.384852 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 14:56:49 crc kubenswrapper[4898]: E0313 14:56:49.385386 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 
14:56:51 crc kubenswrapper[4898]: I0313 14:56:51.428411 4898 generic.go:334] "Generic (PLEG): container finished" podID="363c1a15-d4ba-4ed1-bb98-74e3998bc48a" containerID="c2222921e4aa6a19737a0e506e57480b9be09544407d113c7cd9b13301b23181" exitCode=0 Mar 13 14:56:51 crc kubenswrapper[4898]: I0313 14:56:51.428583 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mlqzt" event={"ID":"363c1a15-d4ba-4ed1-bb98-74e3998bc48a","Type":"ContainerDied","Data":"c2222921e4aa6a19737a0e506e57480b9be09544407d113c7cd9b13301b23181"} Mar 13 14:56:52 crc kubenswrapper[4898]: I0313 14:56:52.448347 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mlqzt" event={"ID":"363c1a15-d4ba-4ed1-bb98-74e3998bc48a","Type":"ContainerStarted","Data":"48df74a161994e898e755e8bb1dc7caa0dd9618eae1c67bb09d5fd7561e988be"} Mar 13 14:56:52 crc kubenswrapper[4898]: I0313 14:56:52.472772 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mlqzt" podStartSLOduration=2.930406217 podStartE2EDuration="7.472744163s" podCreationTimestamp="2026-03-13 14:56:45 +0000 UTC" firstStartedPulling="2026-03-13 14:56:47.320780239 +0000 UTC m=+3642.322368508" lastFinishedPulling="2026-03-13 14:56:51.863118185 +0000 UTC m=+3646.864706454" observedRunningTime="2026-03-13 14:56:52.471568334 +0000 UTC m=+3647.473156643" watchObservedRunningTime="2026-03-13 14:56:52.472744163 +0000 UTC m=+3647.474332442" Mar 13 14:56:55 crc kubenswrapper[4898]: I0313 14:56:55.596271 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mlqzt" Mar 13 14:56:55 crc kubenswrapper[4898]: I0313 14:56:55.596887 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mlqzt" Mar 13 14:56:55 crc kubenswrapper[4898]: I0313 14:56:55.673113 4898 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mlqzt" Mar 13 14:57:03 crc kubenswrapper[4898]: I0313 14:57:03.740574 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 14:57:03 crc kubenswrapper[4898]: E0313 14:57:03.741623 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:57:05 crc kubenswrapper[4898]: I0313 14:57:05.662464 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mlqzt" Mar 13 14:57:05 crc kubenswrapper[4898]: I0313 14:57:05.721828 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mlqzt"] Mar 13 14:57:06 crc kubenswrapper[4898]: I0313 14:57:06.639197 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mlqzt" podUID="363c1a15-d4ba-4ed1-bb98-74e3998bc48a" containerName="registry-server" containerID="cri-o://48df74a161994e898e755e8bb1dc7caa0dd9618eae1c67bb09d5fd7561e988be" gracePeriod=2 Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.262508 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mlqzt" Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.322224 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6ghj\" (UniqueName: \"kubernetes.io/projected/363c1a15-d4ba-4ed1-bb98-74e3998bc48a-kube-api-access-z6ghj\") pod \"363c1a15-d4ba-4ed1-bb98-74e3998bc48a\" (UID: \"363c1a15-d4ba-4ed1-bb98-74e3998bc48a\") " Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.322384 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/363c1a15-d4ba-4ed1-bb98-74e3998bc48a-catalog-content\") pod \"363c1a15-d4ba-4ed1-bb98-74e3998bc48a\" (UID: \"363c1a15-d4ba-4ed1-bb98-74e3998bc48a\") " Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.322570 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/363c1a15-d4ba-4ed1-bb98-74e3998bc48a-utilities\") pod \"363c1a15-d4ba-4ed1-bb98-74e3998bc48a\" (UID: \"363c1a15-d4ba-4ed1-bb98-74e3998bc48a\") " Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.323724 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/363c1a15-d4ba-4ed1-bb98-74e3998bc48a-utilities" (OuterVolumeSpecName: "utilities") pod "363c1a15-d4ba-4ed1-bb98-74e3998bc48a" (UID: "363c1a15-d4ba-4ed1-bb98-74e3998bc48a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.324658 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/363c1a15-d4ba-4ed1-bb98-74e3998bc48a-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.332079 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/363c1a15-d4ba-4ed1-bb98-74e3998bc48a-kube-api-access-z6ghj" (OuterVolumeSpecName: "kube-api-access-z6ghj") pod "363c1a15-d4ba-4ed1-bb98-74e3998bc48a" (UID: "363c1a15-d4ba-4ed1-bb98-74e3998bc48a"). InnerVolumeSpecName "kube-api-access-z6ghj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.393250 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/363c1a15-d4ba-4ed1-bb98-74e3998bc48a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "363c1a15-d4ba-4ed1-bb98-74e3998bc48a" (UID: "363c1a15-d4ba-4ed1-bb98-74e3998bc48a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.427590 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/363c1a15-d4ba-4ed1-bb98-74e3998bc48a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.427642 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6ghj\" (UniqueName: \"kubernetes.io/projected/363c1a15-d4ba-4ed1-bb98-74e3998bc48a-kube-api-access-z6ghj\") on node \"crc\" DevicePath \"\"" Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.659965 4898 generic.go:334] "Generic (PLEG): container finished" podID="363c1a15-d4ba-4ed1-bb98-74e3998bc48a" containerID="48df74a161994e898e755e8bb1dc7caa0dd9618eae1c67bb09d5fd7561e988be" exitCode=0 Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.660034 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mlqzt" event={"ID":"363c1a15-d4ba-4ed1-bb98-74e3998bc48a","Type":"ContainerDied","Data":"48df74a161994e898e755e8bb1dc7caa0dd9618eae1c67bb09d5fd7561e988be"} Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.660056 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mlqzt" Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.660087 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mlqzt" event={"ID":"363c1a15-d4ba-4ed1-bb98-74e3998bc48a","Type":"ContainerDied","Data":"32e5424324d0692c123e1bbe46f2f60411b281878d1cfd5ff36cef1c344778f4"} Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.660116 4898 scope.go:117] "RemoveContainer" containerID="48df74a161994e898e755e8bb1dc7caa0dd9618eae1c67bb09d5fd7561e988be" Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.713750 4898 scope.go:117] "RemoveContainer" containerID="c2222921e4aa6a19737a0e506e57480b9be09544407d113c7cd9b13301b23181" Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.719295 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mlqzt"] Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.761549 4898 scope.go:117] "RemoveContainer" containerID="c895d748e4986f8c69701e3f78590d274b25b62eab1c689d80a555d48d6e360a" Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.764209 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mlqzt"] Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.806208 4898 scope.go:117] "RemoveContainer" containerID="48df74a161994e898e755e8bb1dc7caa0dd9618eae1c67bb09d5fd7561e988be" Mar 13 14:57:07 crc kubenswrapper[4898]: E0313 14:57:07.806824 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48df74a161994e898e755e8bb1dc7caa0dd9618eae1c67bb09d5fd7561e988be\": container with ID starting with 48df74a161994e898e755e8bb1dc7caa0dd9618eae1c67bb09d5fd7561e988be not found: ID does not exist" containerID="48df74a161994e898e755e8bb1dc7caa0dd9618eae1c67bb09d5fd7561e988be" Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.806874 4898 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48df74a161994e898e755e8bb1dc7caa0dd9618eae1c67bb09d5fd7561e988be"} err="failed to get container status \"48df74a161994e898e755e8bb1dc7caa0dd9618eae1c67bb09d5fd7561e988be\": rpc error: code = NotFound desc = could not find container \"48df74a161994e898e755e8bb1dc7caa0dd9618eae1c67bb09d5fd7561e988be\": container with ID starting with 48df74a161994e898e755e8bb1dc7caa0dd9618eae1c67bb09d5fd7561e988be not found: ID does not exist" Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.806925 4898 scope.go:117] "RemoveContainer" containerID="c2222921e4aa6a19737a0e506e57480b9be09544407d113c7cd9b13301b23181" Mar 13 14:57:07 crc kubenswrapper[4898]: E0313 14:57:07.807465 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2222921e4aa6a19737a0e506e57480b9be09544407d113c7cd9b13301b23181\": container with ID starting with c2222921e4aa6a19737a0e506e57480b9be09544407d113c7cd9b13301b23181 not found: ID does not exist" containerID="c2222921e4aa6a19737a0e506e57480b9be09544407d113c7cd9b13301b23181" Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.807522 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2222921e4aa6a19737a0e506e57480b9be09544407d113c7cd9b13301b23181"} err="failed to get container status \"c2222921e4aa6a19737a0e506e57480b9be09544407d113c7cd9b13301b23181\": rpc error: code = NotFound desc = could not find container \"c2222921e4aa6a19737a0e506e57480b9be09544407d113c7cd9b13301b23181\": container with ID starting with c2222921e4aa6a19737a0e506e57480b9be09544407d113c7cd9b13301b23181 not found: ID does not exist" Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.807540 4898 scope.go:117] "RemoveContainer" containerID="c895d748e4986f8c69701e3f78590d274b25b62eab1c689d80a555d48d6e360a" Mar 13 14:57:07 crc kubenswrapper[4898]: E0313 
14:57:07.807976 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c895d748e4986f8c69701e3f78590d274b25b62eab1c689d80a555d48d6e360a\": container with ID starting with c895d748e4986f8c69701e3f78590d274b25b62eab1c689d80a555d48d6e360a not found: ID does not exist" containerID="c895d748e4986f8c69701e3f78590d274b25b62eab1c689d80a555d48d6e360a" Mar 13 14:57:07 crc kubenswrapper[4898]: I0313 14:57:07.808044 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c895d748e4986f8c69701e3f78590d274b25b62eab1c689d80a555d48d6e360a"} err="failed to get container status \"c895d748e4986f8c69701e3f78590d274b25b62eab1c689d80a555d48d6e360a\": rpc error: code = NotFound desc = could not find container \"c895d748e4986f8c69701e3f78590d274b25b62eab1c689d80a555d48d6e360a\": container with ID starting with c895d748e4986f8c69701e3f78590d274b25b62eab1c689d80a555d48d6e360a not found: ID does not exist" Mar 13 14:57:09 crc kubenswrapper[4898]: I0313 14:57:09.759218 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="363c1a15-d4ba-4ed1-bb98-74e3998bc48a" path="/var/lib/kubelet/pods/363c1a15-d4ba-4ed1-bb98-74e3998bc48a/volumes" Mar 13 14:57:18 crc kubenswrapper[4898]: I0313 14:57:18.739475 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 14:57:18 crc kubenswrapper[4898]: E0313 14:57:18.740119 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:57:29 crc kubenswrapper[4898]: I0313 14:57:29.740736 
4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 14:57:29 crc kubenswrapper[4898]: E0313 14:57:29.741788 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:57:44 crc kubenswrapper[4898]: I0313 14:57:44.740689 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 14:57:44 crc kubenswrapper[4898]: E0313 14:57:44.742063 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:57:55 crc kubenswrapper[4898]: I0313 14:57:55.747157 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 14:57:55 crc kubenswrapper[4898]: E0313 14:57:55.747938 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:58:00 crc kubenswrapper[4898]: I0313 
14:58:00.158616 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556898-zbc7v"] Mar 13 14:58:00 crc kubenswrapper[4898]: E0313 14:58:00.160364 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="363c1a15-d4ba-4ed1-bb98-74e3998bc48a" containerName="extract-content" Mar 13 14:58:00 crc kubenswrapper[4898]: I0313 14:58:00.160395 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="363c1a15-d4ba-4ed1-bb98-74e3998bc48a" containerName="extract-content" Mar 13 14:58:00 crc kubenswrapper[4898]: E0313 14:58:00.160430 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="363c1a15-d4ba-4ed1-bb98-74e3998bc48a" containerName="registry-server" Mar 13 14:58:00 crc kubenswrapper[4898]: I0313 14:58:00.160465 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="363c1a15-d4ba-4ed1-bb98-74e3998bc48a" containerName="registry-server" Mar 13 14:58:00 crc kubenswrapper[4898]: E0313 14:58:00.160576 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="363c1a15-d4ba-4ed1-bb98-74e3998bc48a" containerName="extract-utilities" Mar 13 14:58:00 crc kubenswrapper[4898]: I0313 14:58:00.160595 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="363c1a15-d4ba-4ed1-bb98-74e3998bc48a" containerName="extract-utilities" Mar 13 14:58:00 crc kubenswrapper[4898]: I0313 14:58:00.161298 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="363c1a15-d4ba-4ed1-bb98-74e3998bc48a" containerName="registry-server" Mar 13 14:58:00 crc kubenswrapper[4898]: I0313 14:58:00.163136 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556898-zbc7v" Mar 13 14:58:00 crc kubenswrapper[4898]: I0313 14:58:00.165788 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 14:58:00 crc kubenswrapper[4898]: I0313 14:58:00.165944 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 14:58:00 crc kubenswrapper[4898]: I0313 14:58:00.167192 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 14:58:00 crc kubenswrapper[4898]: I0313 14:58:00.174505 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556898-zbc7v"] Mar 13 14:58:00 crc kubenswrapper[4898]: I0313 14:58:00.295285 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25r7k\" (UniqueName: \"kubernetes.io/projected/2033726f-d64f-4989-8837-cec9738c8491-kube-api-access-25r7k\") pod \"auto-csr-approver-29556898-zbc7v\" (UID: \"2033726f-d64f-4989-8837-cec9738c8491\") " pod="openshift-infra/auto-csr-approver-29556898-zbc7v" Mar 13 14:58:00 crc kubenswrapper[4898]: I0313 14:58:00.398306 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25r7k\" (UniqueName: \"kubernetes.io/projected/2033726f-d64f-4989-8837-cec9738c8491-kube-api-access-25r7k\") pod \"auto-csr-approver-29556898-zbc7v\" (UID: \"2033726f-d64f-4989-8837-cec9738c8491\") " pod="openshift-infra/auto-csr-approver-29556898-zbc7v" Mar 13 14:58:00 crc kubenswrapper[4898]: I0313 14:58:00.423221 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25r7k\" (UniqueName: \"kubernetes.io/projected/2033726f-d64f-4989-8837-cec9738c8491-kube-api-access-25r7k\") pod \"auto-csr-approver-29556898-zbc7v\" (UID: \"2033726f-d64f-4989-8837-cec9738c8491\") " 
pod="openshift-infra/auto-csr-approver-29556898-zbc7v" Mar 13 14:58:00 crc kubenswrapper[4898]: I0313 14:58:00.489288 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556898-zbc7v" Mar 13 14:58:01 crc kubenswrapper[4898]: I0313 14:58:01.004057 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556898-zbc7v"] Mar 13 14:58:01 crc kubenswrapper[4898]: I0313 14:58:01.337194 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556898-zbc7v" event={"ID":"2033726f-d64f-4989-8837-cec9738c8491","Type":"ContainerStarted","Data":"4ea7cf1fedb5b1dbfe5701f105fc54d08b3deae9c2a5bd6e70725abdf840b05d"} Mar 13 14:58:03 crc kubenswrapper[4898]: I0313 14:58:03.360725 4898 generic.go:334] "Generic (PLEG): container finished" podID="2033726f-d64f-4989-8837-cec9738c8491" containerID="e1ed7a0b1ccbf119e01b0fbbab72ef967cfc5a8fef5c4bc80afb9d7eff1e70f1" exitCode=0 Mar 13 14:58:03 crc kubenswrapper[4898]: I0313 14:58:03.360835 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556898-zbc7v" event={"ID":"2033726f-d64f-4989-8837-cec9738c8491","Type":"ContainerDied","Data":"e1ed7a0b1ccbf119e01b0fbbab72ef967cfc5a8fef5c4bc80afb9d7eff1e70f1"} Mar 13 14:58:04 crc kubenswrapper[4898]: I0313 14:58:04.803030 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556898-zbc7v" Mar 13 14:58:04 crc kubenswrapper[4898]: I0313 14:58:04.815125 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25r7k\" (UniqueName: \"kubernetes.io/projected/2033726f-d64f-4989-8837-cec9738c8491-kube-api-access-25r7k\") pod \"2033726f-d64f-4989-8837-cec9738c8491\" (UID: \"2033726f-d64f-4989-8837-cec9738c8491\") " Mar 13 14:58:04 crc kubenswrapper[4898]: I0313 14:58:04.827671 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2033726f-d64f-4989-8837-cec9738c8491-kube-api-access-25r7k" (OuterVolumeSpecName: "kube-api-access-25r7k") pod "2033726f-d64f-4989-8837-cec9738c8491" (UID: "2033726f-d64f-4989-8837-cec9738c8491"). InnerVolumeSpecName "kube-api-access-25r7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 14:58:04 crc kubenswrapper[4898]: I0313 14:58:04.928443 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25r7k\" (UniqueName: \"kubernetes.io/projected/2033726f-d64f-4989-8837-cec9738c8491-kube-api-access-25r7k\") on node \"crc\" DevicePath \"\"" Mar 13 14:58:05 crc kubenswrapper[4898]: I0313 14:58:05.383704 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556898-zbc7v" event={"ID":"2033726f-d64f-4989-8837-cec9738c8491","Type":"ContainerDied","Data":"4ea7cf1fedb5b1dbfe5701f105fc54d08b3deae9c2a5bd6e70725abdf840b05d"} Mar 13 14:58:05 crc kubenswrapper[4898]: I0313 14:58:05.384223 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ea7cf1fedb5b1dbfe5701f105fc54d08b3deae9c2a5bd6e70725abdf840b05d" Mar 13 14:58:05 crc kubenswrapper[4898]: I0313 14:58:05.383761 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556898-zbc7v" Mar 13 14:58:05 crc kubenswrapper[4898]: I0313 14:58:05.916846 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556892-gg8mm"] Mar 13 14:58:05 crc kubenswrapper[4898]: I0313 14:58:05.934888 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556892-gg8mm"] Mar 13 14:58:06 crc kubenswrapper[4898]: I0313 14:58:06.828630 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 14:58:06 crc kubenswrapper[4898]: E0313 14:58:06.829228 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:58:07 crc kubenswrapper[4898]: I0313 14:58:07.756260 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bba89630-e09c-4d6d-b7c3-89aecad3889f" path="/var/lib/kubelet/pods/bba89630-e09c-4d6d-b7c3-89aecad3889f/volumes" Mar 13 14:58:17 crc kubenswrapper[4898]: I0313 14:58:17.740081 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 14:58:17 crc kubenswrapper[4898]: E0313 14:58:17.740821 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" 
podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:58:30 crc kubenswrapper[4898]: I0313 14:58:30.740328 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 14:58:30 crc kubenswrapper[4898]: E0313 14:58:30.741134 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:58:42 crc kubenswrapper[4898]: I0313 14:58:42.456839 4898 scope.go:117] "RemoveContainer" containerID="168730dbf03823de4c3d331997904d8f68ac2eebf4c5ae0fee8108df4bd5aa88" Mar 13 14:58:45 crc kubenswrapper[4898]: I0313 14:58:45.746936 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 14:58:45 crc kubenswrapper[4898]: E0313 14:58:45.747697 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:58:57 crc kubenswrapper[4898]: I0313 14:58:57.740131 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 14:58:57 crc kubenswrapper[4898]: E0313 14:58:57.741064 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:59:11 crc kubenswrapper[4898]: I0313 14:59:11.739784 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 14:59:11 crc kubenswrapper[4898]: E0313 14:59:11.740673 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:59:24 crc kubenswrapper[4898]: I0313 14:59:24.740496 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 14:59:24 crc kubenswrapper[4898]: E0313 14:59:24.741554 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:59:38 crc kubenswrapper[4898]: I0313 14:59:38.740391 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 14:59:38 crc kubenswrapper[4898]: E0313 14:59:38.741314 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 14:59:49 crc kubenswrapper[4898]: I0313 14:59:49.740828 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 14:59:49 crc kubenswrapper[4898]: E0313 14:59:49.742545 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.157921 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556900-qqdxv"] Mar 13 15:00:00 crc kubenswrapper[4898]: E0313 15:00:00.159325 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2033726f-d64f-4989-8837-cec9738c8491" containerName="oc" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.159351 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="2033726f-d64f-4989-8837-cec9738c8491" containerName="oc" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.159812 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="2033726f-d64f-4989-8837-cec9738c8491" containerName="oc" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.161289 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556900-qqdxv" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.167594 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.167642 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.169509 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.169830 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556900-gxlck"] Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.171295 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-gxlck" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.172577 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.173208 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.228324 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556900-qqdxv"] Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.234277 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a2c20e5-035e-42e8-9767-4caf8f6381f3-secret-volume\") pod \"collect-profiles-29556900-gxlck\" (UID: \"8a2c20e5-035e-42e8-9767-4caf8f6381f3\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-gxlck" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.234348 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx8tk\" (UniqueName: \"kubernetes.io/projected/8a2c20e5-035e-42e8-9767-4caf8f6381f3-kube-api-access-vx8tk\") pod \"collect-profiles-29556900-gxlck\" (UID: \"8a2c20e5-035e-42e8-9767-4caf8f6381f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-gxlck" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.234410 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xxdp\" (UniqueName: \"kubernetes.io/projected/b0efa686-df70-493a-92dc-90db2ee67205-kube-api-access-6xxdp\") pod \"auto-csr-approver-29556900-qqdxv\" (UID: \"b0efa686-df70-493a-92dc-90db2ee67205\") " pod="openshift-infra/auto-csr-approver-29556900-qqdxv" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.234429 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a2c20e5-035e-42e8-9767-4caf8f6381f3-config-volume\") pod \"collect-profiles-29556900-gxlck\" (UID: \"8a2c20e5-035e-42e8-9767-4caf8f6381f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-gxlck" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.245371 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556900-gxlck"] Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.336263 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xxdp\" (UniqueName: \"kubernetes.io/projected/b0efa686-df70-493a-92dc-90db2ee67205-kube-api-access-6xxdp\") pod \"auto-csr-approver-29556900-qqdxv\" (UID: \"b0efa686-df70-493a-92dc-90db2ee67205\") " 
pod="openshift-infra/auto-csr-approver-29556900-qqdxv" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.336310 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a2c20e5-035e-42e8-9767-4caf8f6381f3-config-volume\") pod \"collect-profiles-29556900-gxlck\" (UID: \"8a2c20e5-035e-42e8-9767-4caf8f6381f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-gxlck" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.336473 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a2c20e5-035e-42e8-9767-4caf8f6381f3-secret-volume\") pod \"collect-profiles-29556900-gxlck\" (UID: \"8a2c20e5-035e-42e8-9767-4caf8f6381f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-gxlck" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.336535 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx8tk\" (UniqueName: \"kubernetes.io/projected/8a2c20e5-035e-42e8-9767-4caf8f6381f3-kube-api-access-vx8tk\") pod \"collect-profiles-29556900-gxlck\" (UID: \"8a2c20e5-035e-42e8-9767-4caf8f6381f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-gxlck" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.338310 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a2c20e5-035e-42e8-9767-4caf8f6381f3-config-volume\") pod \"collect-profiles-29556900-gxlck\" (UID: \"8a2c20e5-035e-42e8-9767-4caf8f6381f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-gxlck" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.342360 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a2c20e5-035e-42e8-9767-4caf8f6381f3-secret-volume\") pod 
\"collect-profiles-29556900-gxlck\" (UID: \"8a2c20e5-035e-42e8-9767-4caf8f6381f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-gxlck" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.357533 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx8tk\" (UniqueName: \"kubernetes.io/projected/8a2c20e5-035e-42e8-9767-4caf8f6381f3-kube-api-access-vx8tk\") pod \"collect-profiles-29556900-gxlck\" (UID: \"8a2c20e5-035e-42e8-9767-4caf8f6381f3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-gxlck" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.362914 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xxdp\" (UniqueName: \"kubernetes.io/projected/b0efa686-df70-493a-92dc-90db2ee67205-kube-api-access-6xxdp\") pod \"auto-csr-approver-29556900-qqdxv\" (UID: \"b0efa686-df70-493a-92dc-90db2ee67205\") " pod="openshift-infra/auto-csr-approver-29556900-qqdxv" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.498817 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556900-qqdxv" Mar 13 15:00:00 crc kubenswrapper[4898]: I0313 15:00:00.513726 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-gxlck" Mar 13 15:00:01 crc kubenswrapper[4898]: I0313 15:00:01.017416 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 15:00:01 crc kubenswrapper[4898]: I0313 15:00:01.018261 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556900-qqdxv"] Mar 13 15:00:01 crc kubenswrapper[4898]: I0313 15:00:01.168533 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556900-gxlck"] Mar 13 15:00:01 crc kubenswrapper[4898]: I0313 15:00:01.740764 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 15:00:01 crc kubenswrapper[4898]: E0313 15:00:01.741600 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:00:01 crc kubenswrapper[4898]: I0313 15:00:01.880297 4898 generic.go:334] "Generic (PLEG): container finished" podID="8a2c20e5-035e-42e8-9767-4caf8f6381f3" containerID="f85839693cc5404eb8f27038e41ba6dbfc5580a2e1e80c25856c6a29ee2aad6a" exitCode=0 Mar 13 15:00:01 crc kubenswrapper[4898]: I0313 15:00:01.880408 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-gxlck" event={"ID":"8a2c20e5-035e-42e8-9767-4caf8f6381f3","Type":"ContainerDied","Data":"f85839693cc5404eb8f27038e41ba6dbfc5580a2e1e80c25856c6a29ee2aad6a"} Mar 13 15:00:01 crc kubenswrapper[4898]: I0313 15:00:01.880724 4898 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-gxlck" event={"ID":"8a2c20e5-035e-42e8-9767-4caf8f6381f3","Type":"ContainerStarted","Data":"c5820a7dca51c230241d9431be06a9a8a7d219e0fca2c08870b1e7964c87000b"} Mar 13 15:00:01 crc kubenswrapper[4898]: I0313 15:00:01.882405 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556900-qqdxv" event={"ID":"b0efa686-df70-493a-92dc-90db2ee67205","Type":"ContainerStarted","Data":"95ca6919ce9ba093b6a1dac4b4c3eccd147c30c4300aae3001ff16e6ba188e48"} Mar 13 15:00:03 crc kubenswrapper[4898]: I0313 15:00:03.336362 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-gxlck" Mar 13 15:00:03 crc kubenswrapper[4898]: I0313 15:00:03.417309 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a2c20e5-035e-42e8-9767-4caf8f6381f3-config-volume\") pod \"8a2c20e5-035e-42e8-9767-4caf8f6381f3\" (UID: \"8a2c20e5-035e-42e8-9767-4caf8f6381f3\") " Mar 13 15:00:03 crc kubenswrapper[4898]: I0313 15:00:03.417380 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a2c20e5-035e-42e8-9767-4caf8f6381f3-secret-volume\") pod \"8a2c20e5-035e-42e8-9767-4caf8f6381f3\" (UID: \"8a2c20e5-035e-42e8-9767-4caf8f6381f3\") " Mar 13 15:00:03 crc kubenswrapper[4898]: I0313 15:00:03.417503 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vx8tk\" (UniqueName: \"kubernetes.io/projected/8a2c20e5-035e-42e8-9767-4caf8f6381f3-kube-api-access-vx8tk\") pod \"8a2c20e5-035e-42e8-9767-4caf8f6381f3\" (UID: \"8a2c20e5-035e-42e8-9767-4caf8f6381f3\") " Mar 13 15:00:03 crc kubenswrapper[4898]: I0313 15:00:03.418586 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/8a2c20e5-035e-42e8-9767-4caf8f6381f3-config-volume" (OuterVolumeSpecName: "config-volume") pod "8a2c20e5-035e-42e8-9767-4caf8f6381f3" (UID: "8a2c20e5-035e-42e8-9767-4caf8f6381f3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:00:03 crc kubenswrapper[4898]: I0313 15:00:03.424051 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a2c20e5-035e-42e8-9767-4caf8f6381f3-kube-api-access-vx8tk" (OuterVolumeSpecName: "kube-api-access-vx8tk") pod "8a2c20e5-035e-42e8-9767-4caf8f6381f3" (UID: "8a2c20e5-035e-42e8-9767-4caf8f6381f3"). InnerVolumeSpecName "kube-api-access-vx8tk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:00:03 crc kubenswrapper[4898]: I0313 15:00:03.425036 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a2c20e5-035e-42e8-9767-4caf8f6381f3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8a2c20e5-035e-42e8-9767-4caf8f6381f3" (UID: "8a2c20e5-035e-42e8-9767-4caf8f6381f3"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:00:03 crc kubenswrapper[4898]: I0313 15:00:03.520764 4898 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8a2c20e5-035e-42e8-9767-4caf8f6381f3-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 15:00:03 crc kubenswrapper[4898]: I0313 15:00:03.521064 4898 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8a2c20e5-035e-42e8-9767-4caf8f6381f3-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 15:00:03 crc kubenswrapper[4898]: I0313 15:00:03.521077 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vx8tk\" (UniqueName: \"kubernetes.io/projected/8a2c20e5-035e-42e8-9767-4caf8f6381f3-kube-api-access-vx8tk\") on node \"crc\" DevicePath \"\"" Mar 13 15:00:03 crc kubenswrapper[4898]: I0313 15:00:03.909226 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-gxlck" event={"ID":"8a2c20e5-035e-42e8-9767-4caf8f6381f3","Type":"ContainerDied","Data":"c5820a7dca51c230241d9431be06a9a8a7d219e0fca2c08870b1e7964c87000b"} Mar 13 15:00:03 crc kubenswrapper[4898]: I0313 15:00:03.909264 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5820a7dca51c230241d9431be06a9a8a7d219e0fca2c08870b1e7964c87000b" Mar 13 15:00:03 crc kubenswrapper[4898]: I0313 15:00:03.909273 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556900-gxlck" Mar 13 15:00:04 crc kubenswrapper[4898]: I0313 15:00:04.438019 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556855-r5t9l"] Mar 13 15:00:04 crc kubenswrapper[4898]: I0313 15:00:04.452301 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556855-r5t9l"] Mar 13 15:00:04 crc kubenswrapper[4898]: I0313 15:00:04.934372 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556900-qqdxv" event={"ID":"b0efa686-df70-493a-92dc-90db2ee67205","Type":"ContainerStarted","Data":"cc698733d50d55655a41f78a9335173ab8803d13e8dc8d8d6d62ee958bbac18b"} Mar 13 15:00:04 crc kubenswrapper[4898]: I0313 15:00:04.959146 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556900-qqdxv" podStartSLOduration=1.389308172 podStartE2EDuration="4.959126091s" podCreationTimestamp="2026-03-13 15:00:00 +0000 UTC" firstStartedPulling="2026-03-13 15:00:01.017206305 +0000 UTC m=+3836.018794534" lastFinishedPulling="2026-03-13 15:00:04.587024194 +0000 UTC m=+3839.588612453" observedRunningTime="2026-03-13 15:00:04.9457776 +0000 UTC m=+3839.947365849" watchObservedRunningTime="2026-03-13 15:00:04.959126091 +0000 UTC m=+3839.960714340" Mar 13 15:00:05 crc kubenswrapper[4898]: I0313 15:00:05.758816 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9b296c2-5046-40b3-9fca-be350cf5de3e" path="/var/lib/kubelet/pods/d9b296c2-5046-40b3-9fca-be350cf5de3e/volumes" Mar 13 15:00:05 crc kubenswrapper[4898]: I0313 15:00:05.948250 4898 generic.go:334] "Generic (PLEG): container finished" podID="b0efa686-df70-493a-92dc-90db2ee67205" containerID="cc698733d50d55655a41f78a9335173ab8803d13e8dc8d8d6d62ee958bbac18b" exitCode=0 Mar 13 15:00:05 crc kubenswrapper[4898]: I0313 
15:00:05.948342 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556900-qqdxv" event={"ID":"b0efa686-df70-493a-92dc-90db2ee67205","Type":"ContainerDied","Data":"cc698733d50d55655a41f78a9335173ab8803d13e8dc8d8d6d62ee958bbac18b"} Mar 13 15:00:07 crc kubenswrapper[4898]: I0313 15:00:07.378610 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556900-qqdxv" Mar 13 15:00:07 crc kubenswrapper[4898]: I0313 15:00:07.430163 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xxdp\" (UniqueName: \"kubernetes.io/projected/b0efa686-df70-493a-92dc-90db2ee67205-kube-api-access-6xxdp\") pod \"b0efa686-df70-493a-92dc-90db2ee67205\" (UID: \"b0efa686-df70-493a-92dc-90db2ee67205\") " Mar 13 15:00:07 crc kubenswrapper[4898]: I0313 15:00:07.453252 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0efa686-df70-493a-92dc-90db2ee67205-kube-api-access-6xxdp" (OuterVolumeSpecName: "kube-api-access-6xxdp") pod "b0efa686-df70-493a-92dc-90db2ee67205" (UID: "b0efa686-df70-493a-92dc-90db2ee67205"). InnerVolumeSpecName "kube-api-access-6xxdp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:00:07 crc kubenswrapper[4898]: I0313 15:00:07.533093 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xxdp\" (UniqueName: \"kubernetes.io/projected/b0efa686-df70-493a-92dc-90db2ee67205-kube-api-access-6xxdp\") on node \"crc\" DevicePath \"\"" Mar 13 15:00:07 crc kubenswrapper[4898]: I0313 15:00:07.976838 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556900-qqdxv" event={"ID":"b0efa686-df70-493a-92dc-90db2ee67205","Type":"ContainerDied","Data":"95ca6919ce9ba093b6a1dac4b4c3eccd147c30c4300aae3001ff16e6ba188e48"} Mar 13 15:00:07 crc kubenswrapper[4898]: I0313 15:00:07.977144 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95ca6919ce9ba093b6a1dac4b4c3eccd147c30c4300aae3001ff16e6ba188e48" Mar 13 15:00:07 crc kubenswrapper[4898]: I0313 15:00:07.976929 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556900-qqdxv" Mar 13 15:00:08 crc kubenswrapper[4898]: I0313 15:00:08.030698 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556894-4f5n2"] Mar 13 15:00:08 crc kubenswrapper[4898]: I0313 15:00:08.044097 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556894-4f5n2"] Mar 13 15:00:09 crc kubenswrapper[4898]: I0313 15:00:09.756258 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9bfc1e4-be1f-4495-a7de-2b4f94e901d8" path="/var/lib/kubelet/pods/d9bfc1e4-be1f-4495-a7de-2b4f94e901d8/volumes" Mar 13 15:00:14 crc kubenswrapper[4898]: I0313 15:00:14.742182 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 15:00:14 crc kubenswrapper[4898]: E0313 15:00:14.743312 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:00:29 crc kubenswrapper[4898]: I0313 15:00:29.740618 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 15:00:29 crc kubenswrapper[4898]: E0313 15:00:29.742197 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:00:40 crc kubenswrapper[4898]: I0313 15:00:40.739616 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 15:00:40 crc kubenswrapper[4898]: E0313 15:00:40.740928 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:00:42 crc kubenswrapper[4898]: I0313 15:00:42.564179 4898 scope.go:117] "RemoveContainer" containerID="508333f96d89e6fb34c9fb0fe392b0bcdb91535cabc45655a886e6d88f90fef5" Mar 13 15:00:42 crc kubenswrapper[4898]: I0313 15:00:42.636255 4898 scope.go:117] "RemoveContainer" 
containerID="183f0c268935ae6820699911fc0be58b4d0e93db5e614c9661b0f4b96dcc6afd" Mar 13 15:00:51 crc kubenswrapper[4898]: I0313 15:00:51.739843 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 15:00:51 crc kubenswrapper[4898]: E0313 15:00:51.740952 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:01:00 crc kubenswrapper[4898]: I0313 15:01:00.188270 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29556901-6pnrd"] Mar 13 15:01:00 crc kubenswrapper[4898]: E0313 15:01:00.190015 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a2c20e5-035e-42e8-9767-4caf8f6381f3" containerName="collect-profiles" Mar 13 15:01:00 crc kubenswrapper[4898]: I0313 15:01:00.190052 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a2c20e5-035e-42e8-9767-4caf8f6381f3" containerName="collect-profiles" Mar 13 15:01:00 crc kubenswrapper[4898]: E0313 15:01:00.190113 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0efa686-df70-493a-92dc-90db2ee67205" containerName="oc" Mar 13 15:01:00 crc kubenswrapper[4898]: I0313 15:01:00.190132 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0efa686-df70-493a-92dc-90db2ee67205" containerName="oc" Mar 13 15:01:00 crc kubenswrapper[4898]: I0313 15:01:00.190651 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a2c20e5-035e-42e8-9767-4caf8f6381f3" containerName="collect-profiles" Mar 13 15:01:00 crc kubenswrapper[4898]: I0313 15:01:00.190713 4898 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="b0efa686-df70-493a-92dc-90db2ee67205" containerName="oc" Mar 13 15:01:00 crc kubenswrapper[4898]: I0313 15:01:00.192781 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29556901-6pnrd" Mar 13 15:01:00 crc kubenswrapper[4898]: I0313 15:01:00.210189 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29556901-6pnrd"] Mar 13 15:01:00 crc kubenswrapper[4898]: I0313 15:01:00.298405 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-fernet-keys\") pod \"keystone-cron-29556901-6pnrd\" (UID: \"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0\") " pod="openstack/keystone-cron-29556901-6pnrd" Mar 13 15:01:00 crc kubenswrapper[4898]: I0313 15:01:00.298461 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-config-data\") pod \"keystone-cron-29556901-6pnrd\" (UID: \"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0\") " pod="openstack/keystone-cron-29556901-6pnrd" Mar 13 15:01:00 crc kubenswrapper[4898]: I0313 15:01:00.298494 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-combined-ca-bundle\") pod \"keystone-cron-29556901-6pnrd\" (UID: \"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0\") " pod="openstack/keystone-cron-29556901-6pnrd" Mar 13 15:01:00 crc kubenswrapper[4898]: I0313 15:01:00.298939 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6png\" (UniqueName: \"kubernetes.io/projected/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-kube-api-access-r6png\") pod \"keystone-cron-29556901-6pnrd\" (UID: 
\"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0\") " pod="openstack/keystone-cron-29556901-6pnrd" Mar 13 15:01:00 crc kubenswrapper[4898]: I0313 15:01:00.401277 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-fernet-keys\") pod \"keystone-cron-29556901-6pnrd\" (UID: \"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0\") " pod="openstack/keystone-cron-29556901-6pnrd" Mar 13 15:01:00 crc kubenswrapper[4898]: I0313 15:01:00.401327 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-config-data\") pod \"keystone-cron-29556901-6pnrd\" (UID: \"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0\") " pod="openstack/keystone-cron-29556901-6pnrd" Mar 13 15:01:00 crc kubenswrapper[4898]: I0313 15:01:00.401352 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-combined-ca-bundle\") pod \"keystone-cron-29556901-6pnrd\" (UID: \"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0\") " pod="openstack/keystone-cron-29556901-6pnrd" Mar 13 15:01:00 crc kubenswrapper[4898]: I0313 15:01:00.401462 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6png\" (UniqueName: \"kubernetes.io/projected/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-kube-api-access-r6png\") pod \"keystone-cron-29556901-6pnrd\" (UID: \"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0\") " pod="openstack/keystone-cron-29556901-6pnrd" Mar 13 15:01:00 crc kubenswrapper[4898]: I0313 15:01:00.410113 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-combined-ca-bundle\") pod \"keystone-cron-29556901-6pnrd\" (UID: \"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0\") " 
pod="openstack/keystone-cron-29556901-6pnrd" Mar 13 15:01:00 crc kubenswrapper[4898]: I0313 15:01:00.410144 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-config-data\") pod \"keystone-cron-29556901-6pnrd\" (UID: \"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0\") " pod="openstack/keystone-cron-29556901-6pnrd" Mar 13 15:01:00 crc kubenswrapper[4898]: I0313 15:01:00.410814 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-fernet-keys\") pod \"keystone-cron-29556901-6pnrd\" (UID: \"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0\") " pod="openstack/keystone-cron-29556901-6pnrd" Mar 13 15:01:00 crc kubenswrapper[4898]: I0313 15:01:00.424296 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6png\" (UniqueName: \"kubernetes.io/projected/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-kube-api-access-r6png\") pod \"keystone-cron-29556901-6pnrd\" (UID: \"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0\") " pod="openstack/keystone-cron-29556901-6pnrd" Mar 13 15:01:00 crc kubenswrapper[4898]: I0313 15:01:00.528829 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29556901-6pnrd" Mar 13 15:01:01 crc kubenswrapper[4898]: I0313 15:01:01.078614 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29556901-6pnrd"] Mar 13 15:01:01 crc kubenswrapper[4898]: I0313 15:01:01.697440 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29556901-6pnrd" event={"ID":"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0","Type":"ContainerStarted","Data":"79a68b94993c8c73e536b1353fccc0391854becd695923f33f632c92842c2cda"} Mar 13 15:01:01 crc kubenswrapper[4898]: I0313 15:01:01.697749 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29556901-6pnrd" event={"ID":"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0","Type":"ContainerStarted","Data":"940d6aadcfa81d36e29a2fddf2d6c3158db5b07defb4474691f44c67a62308c8"} Mar 13 15:01:01 crc kubenswrapper[4898]: I0313 15:01:01.771071 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29556901-6pnrd" podStartSLOduration=1.771047975 podStartE2EDuration="1.771047975s" podCreationTimestamp="2026-03-13 15:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:01:01.711774885 +0000 UTC m=+3896.713363144" watchObservedRunningTime="2026-03-13 15:01:01.771047975 +0000 UTC m=+3896.772636224" Mar 13 15:01:04 crc kubenswrapper[4898]: I0313 15:01:04.735612 4898 generic.go:334] "Generic (PLEG): container finished" podID="3d71da57-c929-47d7-89bd-8e4e3c7f3ca0" containerID="79a68b94993c8c73e536b1353fccc0391854becd695923f33f632c92842c2cda" exitCode=0 Mar 13 15:01:04 crc kubenswrapper[4898]: I0313 15:01:04.735661 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29556901-6pnrd" 
event={"ID":"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0","Type":"ContainerDied","Data":"79a68b94993c8c73e536b1353fccc0391854becd695923f33f632c92842c2cda"} Mar 13 15:01:06 crc kubenswrapper[4898]: I0313 15:01:06.410418 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29556901-6pnrd" Mar 13 15:01:06 crc kubenswrapper[4898]: I0313 15:01:06.585745 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-fernet-keys\") pod \"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0\" (UID: \"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0\") " Mar 13 15:01:06 crc kubenswrapper[4898]: I0313 15:01:06.585825 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-config-data\") pod \"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0\" (UID: \"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0\") " Mar 13 15:01:06 crc kubenswrapper[4898]: I0313 15:01:06.585917 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6png\" (UniqueName: \"kubernetes.io/projected/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-kube-api-access-r6png\") pod \"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0\" (UID: \"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0\") " Mar 13 15:01:06 crc kubenswrapper[4898]: I0313 15:01:06.585951 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-combined-ca-bundle\") pod \"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0\" (UID: \"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0\") " Mar 13 15:01:06 crc kubenswrapper[4898]: I0313 15:01:06.592171 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-kube-api-access-r6png" 
(OuterVolumeSpecName: "kube-api-access-r6png") pod "3d71da57-c929-47d7-89bd-8e4e3c7f3ca0" (UID: "3d71da57-c929-47d7-89bd-8e4e3c7f3ca0"). InnerVolumeSpecName "kube-api-access-r6png". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:01:06 crc kubenswrapper[4898]: I0313 15:01:06.598037 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3d71da57-c929-47d7-89bd-8e4e3c7f3ca0" (UID: "3d71da57-c929-47d7-89bd-8e4e3c7f3ca0"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:01:06 crc kubenswrapper[4898]: I0313 15:01:06.625458 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d71da57-c929-47d7-89bd-8e4e3c7f3ca0" (UID: "3d71da57-c929-47d7-89bd-8e4e3c7f3ca0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:01:06 crc kubenswrapper[4898]: I0313 15:01:06.674511 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-config-data" (OuterVolumeSpecName: "config-data") pod "3d71da57-c929-47d7-89bd-8e4e3c7f3ca0" (UID: "3d71da57-c929-47d7-89bd-8e4e3c7f3ca0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:01:06 crc kubenswrapper[4898]: I0313 15:01:06.689099 4898 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 13 15:01:06 crc kubenswrapper[4898]: I0313 15:01:06.689135 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 15:01:06 crc kubenswrapper[4898]: I0313 15:01:06.689145 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6png\" (UniqueName: \"kubernetes.io/projected/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-kube-api-access-r6png\") on node \"crc\" DevicePath \"\"" Mar 13 15:01:06 crc kubenswrapper[4898]: I0313 15:01:06.689154 4898 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d71da57-c929-47d7-89bd-8e4e3c7f3ca0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 15:01:06 crc kubenswrapper[4898]: I0313 15:01:06.739675 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 15:01:06 crc kubenswrapper[4898]: E0313 15:01:06.740215 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:01:06 crc kubenswrapper[4898]: I0313 15:01:06.786189 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29556901-6pnrd" 
event={"ID":"3d71da57-c929-47d7-89bd-8e4e3c7f3ca0","Type":"ContainerDied","Data":"940d6aadcfa81d36e29a2fddf2d6c3158db5b07defb4474691f44c67a62308c8"} Mar 13 15:01:06 crc kubenswrapper[4898]: I0313 15:01:06.786290 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="940d6aadcfa81d36e29a2fddf2d6c3158db5b07defb4474691f44c67a62308c8" Mar 13 15:01:06 crc kubenswrapper[4898]: I0313 15:01:06.786348 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29556901-6pnrd" Mar 13 15:01:17 crc kubenswrapper[4898]: I0313 15:01:17.741262 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 15:01:17 crc kubenswrapper[4898]: E0313 15:01:17.742163 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:01:29 crc kubenswrapper[4898]: I0313 15:01:29.741151 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 15:01:29 crc kubenswrapper[4898]: E0313 15:01:29.742492 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:01:43 crc kubenswrapper[4898]: I0313 15:01:43.740153 4898 scope.go:117] 
"RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 15:01:43 crc kubenswrapper[4898]: E0313 15:01:43.741274 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:01:56 crc kubenswrapper[4898]: I0313 15:01:56.739640 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da" Mar 13 15:01:57 crc kubenswrapper[4898]: I0313 15:01:57.479926 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"45a214eae7c7c7859e650d708f47b93bb385cc743f9e3e627bde30ab582e5ff5"} Mar 13 15:02:00 crc kubenswrapper[4898]: I0313 15:02:00.151776 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556902-s4cg2"] Mar 13 15:02:00 crc kubenswrapper[4898]: E0313 15:02:00.153147 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d71da57-c929-47d7-89bd-8e4e3c7f3ca0" containerName="keystone-cron" Mar 13 15:02:00 crc kubenswrapper[4898]: I0313 15:02:00.153171 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d71da57-c929-47d7-89bd-8e4e3c7f3ca0" containerName="keystone-cron" Mar 13 15:02:00 crc kubenswrapper[4898]: I0313 15:02:00.153470 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d71da57-c929-47d7-89bd-8e4e3c7f3ca0" containerName="keystone-cron" Mar 13 15:02:00 crc kubenswrapper[4898]: I0313 15:02:00.154726 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556902-s4cg2" Mar 13 15:02:00 crc kubenswrapper[4898]: I0313 15:02:00.159205 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 15:02:00 crc kubenswrapper[4898]: I0313 15:02:00.159451 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:02:00 crc kubenswrapper[4898]: I0313 15:02:00.160842 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:02:00 crc kubenswrapper[4898]: I0313 15:02:00.171113 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556902-s4cg2"] Mar 13 15:02:00 crc kubenswrapper[4898]: I0313 15:02:00.331238 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zbzm\" (UniqueName: \"kubernetes.io/projected/6d9d3379-4b7b-4263-aec1-10c06dc087e6-kube-api-access-2zbzm\") pod \"auto-csr-approver-29556902-s4cg2\" (UID: \"6d9d3379-4b7b-4263-aec1-10c06dc087e6\") " pod="openshift-infra/auto-csr-approver-29556902-s4cg2" Mar 13 15:02:00 crc kubenswrapper[4898]: I0313 15:02:00.433782 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zbzm\" (UniqueName: \"kubernetes.io/projected/6d9d3379-4b7b-4263-aec1-10c06dc087e6-kube-api-access-2zbzm\") pod \"auto-csr-approver-29556902-s4cg2\" (UID: \"6d9d3379-4b7b-4263-aec1-10c06dc087e6\") " pod="openshift-infra/auto-csr-approver-29556902-s4cg2" Mar 13 15:02:00 crc kubenswrapper[4898]: I0313 15:02:00.451956 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zbzm\" (UniqueName: \"kubernetes.io/projected/6d9d3379-4b7b-4263-aec1-10c06dc087e6-kube-api-access-2zbzm\") pod \"auto-csr-approver-29556902-s4cg2\" (UID: \"6d9d3379-4b7b-4263-aec1-10c06dc087e6\") " 
pod="openshift-infra/auto-csr-approver-29556902-s4cg2" Mar 13 15:02:00 crc kubenswrapper[4898]: I0313 15:02:00.488594 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556902-s4cg2" Mar 13 15:02:01 crc kubenswrapper[4898]: I0313 15:02:01.050129 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556902-s4cg2"] Mar 13 15:02:01 crc kubenswrapper[4898]: W0313 15:02:01.055026 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d9d3379_4b7b_4263_aec1_10c06dc087e6.slice/crio-36f61f81391e8819a4f39904fa835f78cd2cec2cabe581ad8c356235ff92f1e6 WatchSource:0}: Error finding container 36f61f81391e8819a4f39904fa835f78cd2cec2cabe581ad8c356235ff92f1e6: Status 404 returned error can't find the container with id 36f61f81391e8819a4f39904fa835f78cd2cec2cabe581ad8c356235ff92f1e6 Mar 13 15:02:01 crc kubenswrapper[4898]: I0313 15:02:01.520417 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556902-s4cg2" event={"ID":"6d9d3379-4b7b-4263-aec1-10c06dc087e6","Type":"ContainerStarted","Data":"36f61f81391e8819a4f39904fa835f78cd2cec2cabe581ad8c356235ff92f1e6"} Mar 13 15:02:03 crc kubenswrapper[4898]: I0313 15:02:03.547168 4898 generic.go:334] "Generic (PLEG): container finished" podID="6d9d3379-4b7b-4263-aec1-10c06dc087e6" containerID="e906ab6657ba9367ad2d982719bc7c2ae3c8f44a0ea2bd20f34b397b25a0a265" exitCode=0 Mar 13 15:02:03 crc kubenswrapper[4898]: I0313 15:02:03.547245 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556902-s4cg2" event={"ID":"6d9d3379-4b7b-4263-aec1-10c06dc087e6","Type":"ContainerDied","Data":"e906ab6657ba9367ad2d982719bc7c2ae3c8f44a0ea2bd20f34b397b25a0a265"} Mar 13 15:02:05 crc kubenswrapper[4898]: I0313 15:02:05.078260 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556902-s4cg2" Mar 13 15:02:05 crc kubenswrapper[4898]: I0313 15:02:05.170257 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zbzm\" (UniqueName: \"kubernetes.io/projected/6d9d3379-4b7b-4263-aec1-10c06dc087e6-kube-api-access-2zbzm\") pod \"6d9d3379-4b7b-4263-aec1-10c06dc087e6\" (UID: \"6d9d3379-4b7b-4263-aec1-10c06dc087e6\") " Mar 13 15:02:05 crc kubenswrapper[4898]: I0313 15:02:05.196042 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d9d3379-4b7b-4263-aec1-10c06dc087e6-kube-api-access-2zbzm" (OuterVolumeSpecName: "kube-api-access-2zbzm") pod "6d9d3379-4b7b-4263-aec1-10c06dc087e6" (UID: "6d9d3379-4b7b-4263-aec1-10c06dc087e6"). InnerVolumeSpecName "kube-api-access-2zbzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:02:05 crc kubenswrapper[4898]: I0313 15:02:05.273775 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zbzm\" (UniqueName: \"kubernetes.io/projected/6d9d3379-4b7b-4263-aec1-10c06dc087e6-kube-api-access-2zbzm\") on node \"crc\" DevicePath \"\"" Mar 13 15:02:05 crc kubenswrapper[4898]: I0313 15:02:05.576152 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556902-s4cg2" event={"ID":"6d9d3379-4b7b-4263-aec1-10c06dc087e6","Type":"ContainerDied","Data":"36f61f81391e8819a4f39904fa835f78cd2cec2cabe581ad8c356235ff92f1e6"} Mar 13 15:02:05 crc kubenswrapper[4898]: I0313 15:02:05.576195 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36f61f81391e8819a4f39904fa835f78cd2cec2cabe581ad8c356235ff92f1e6" Mar 13 15:02:05 crc kubenswrapper[4898]: I0313 15:02:05.576223 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556902-s4cg2"
Mar 13 15:02:06 crc kubenswrapper[4898]: I0313 15:02:06.163419 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556896-tbs8b"]
Mar 13 15:02:06 crc kubenswrapper[4898]: I0313 15:02:06.174074 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556896-tbs8b"]
Mar 13 15:02:07 crc kubenswrapper[4898]: I0313 15:02:07.766027 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce9a8272-18eb-4001-a998-8e24fbe84593" path="/var/lib/kubelet/pods/ce9a8272-18eb-4001-a998-8e24fbe84593/volumes"
Mar 13 15:02:42 crc kubenswrapper[4898]: I0313 15:02:42.767074 4898 scope.go:117] "RemoveContainer" containerID="9eb62522f49fe5dd1b7c8d52fa260ebf6b039aa00a9cd1719cb7535e8637b27e"
Mar 13 15:04:00 crc kubenswrapper[4898]: I0313 15:04:00.162680 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556904-69dtq"]
Mar 13 15:04:00 crc kubenswrapper[4898]: E0313 15:04:00.164688 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d9d3379-4b7b-4263-aec1-10c06dc087e6" containerName="oc"
Mar 13 15:04:00 crc kubenswrapper[4898]: I0313 15:04:00.164782 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d9d3379-4b7b-4263-aec1-10c06dc087e6" containerName="oc"
Mar 13 15:04:00 crc kubenswrapper[4898]: I0313 15:04:00.165186 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d9d3379-4b7b-4263-aec1-10c06dc087e6" containerName="oc"
Mar 13 15:04:00 crc kubenswrapper[4898]: I0313 15:04:00.166227 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556904-69dtq"
Mar 13 15:04:00 crc kubenswrapper[4898]: I0313 15:04:00.169400 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 15:04:00 crc kubenswrapper[4898]: I0313 15:04:00.169774 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps"
Mar 13 15:04:00 crc kubenswrapper[4898]: I0313 15:04:00.170029 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 15:04:00 crc kubenswrapper[4898]: I0313 15:04:00.181155 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556904-69dtq"]
Mar 13 15:04:00 crc kubenswrapper[4898]: I0313 15:04:00.288083 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsz88\" (UniqueName: \"kubernetes.io/projected/a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0-kube-api-access-qsz88\") pod \"auto-csr-approver-29556904-69dtq\" (UID: \"a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0\") " pod="openshift-infra/auto-csr-approver-29556904-69dtq"
Mar 13 15:04:00 crc kubenswrapper[4898]: I0313 15:04:00.390518 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsz88\" (UniqueName: \"kubernetes.io/projected/a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0-kube-api-access-qsz88\") pod \"auto-csr-approver-29556904-69dtq\" (UID: \"a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0\") " pod="openshift-infra/auto-csr-approver-29556904-69dtq"
Mar 13 15:04:00 crc kubenswrapper[4898]: I0313 15:04:00.411145 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsz88\" (UniqueName: \"kubernetes.io/projected/a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0-kube-api-access-qsz88\") pod \"auto-csr-approver-29556904-69dtq\" (UID: \"a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0\") " pod="openshift-infra/auto-csr-approver-29556904-69dtq"
Mar 13 15:04:00 crc kubenswrapper[4898]: I0313 15:04:00.491676 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556904-69dtq"
Mar 13 15:04:01 crc kubenswrapper[4898]: I0313 15:04:01.045174 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556904-69dtq"]
Mar 13 15:04:01 crc kubenswrapper[4898]: I0313 15:04:01.939877 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556904-69dtq" event={"ID":"a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0","Type":"ContainerStarted","Data":"008b1c64d4d8ea8f8815f261be80b2d3ddebe9bba1a289857558e6a036470c73"}
Mar 13 15:04:02 crc kubenswrapper[4898]: I0313 15:04:02.951450 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556904-69dtq" event={"ID":"a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0","Type":"ContainerStarted","Data":"cd5d0aba2324832ebc654e623ddb6b3b9935920ad75023c9232ebc2ff78ae2d3"}
Mar 13 15:04:02 crc kubenswrapper[4898]: I0313 15:04:02.976905 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556904-69dtq" podStartSLOduration=1.910269677 podStartE2EDuration="2.976879095s" podCreationTimestamp="2026-03-13 15:04:00 +0000 UTC" firstStartedPulling="2026-03-13 15:04:01.046379605 +0000 UTC m=+4076.047967854" lastFinishedPulling="2026-03-13 15:04:02.112989033 +0000 UTC m=+4077.114577272" observedRunningTime="2026-03-13 15:04:02.969954254 +0000 UTC m=+4077.971542493" watchObservedRunningTime="2026-03-13 15:04:02.976879095 +0000 UTC m=+4077.978467334"
Mar 13 15:04:03 crc kubenswrapper[4898]: I0313 15:04:03.965592 4898 generic.go:334] "Generic (PLEG): container finished" podID="a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0" containerID="cd5d0aba2324832ebc654e623ddb6b3b9935920ad75023c9232ebc2ff78ae2d3" exitCode=0
Mar 13 15:04:03 crc kubenswrapper[4898]: I0313 15:04:03.965823 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556904-69dtq" event={"ID":"a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0","Type":"ContainerDied","Data":"cd5d0aba2324832ebc654e623ddb6b3b9935920ad75023c9232ebc2ff78ae2d3"}
Mar 13 15:04:05 crc kubenswrapper[4898]: I0313 15:04:05.476365 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556904-69dtq"
Mar 13 15:04:05 crc kubenswrapper[4898]: I0313 15:04:05.526353 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsz88\" (UniqueName: \"kubernetes.io/projected/a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0-kube-api-access-qsz88\") pod \"a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0\" (UID: \"a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0\") "
Mar 13 15:04:05 crc kubenswrapper[4898]: I0313 15:04:05.539504 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0-kube-api-access-qsz88" (OuterVolumeSpecName: "kube-api-access-qsz88") pod "a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0" (UID: "a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0"). InnerVolumeSpecName "kube-api-access-qsz88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:04:05 crc kubenswrapper[4898]: I0313 15:04:05.630320 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsz88\" (UniqueName: \"kubernetes.io/projected/a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0-kube-api-access-qsz88\") on node \"crc\" DevicePath \"\""
Mar 13 15:04:06 crc kubenswrapper[4898]: I0313 15:04:06.021988 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556904-69dtq" event={"ID":"a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0","Type":"ContainerDied","Data":"008b1c64d4d8ea8f8815f261be80b2d3ddebe9bba1a289857558e6a036470c73"}
Mar 13 15:04:06 crc kubenswrapper[4898]: I0313 15:04:06.022058 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="008b1c64d4d8ea8f8815f261be80b2d3ddebe9bba1a289857558e6a036470c73"
Mar 13 15:04:06 crc kubenswrapper[4898]: I0313 15:04:06.022145 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556904-69dtq"
Mar 13 15:04:06 crc kubenswrapper[4898]: I0313 15:04:06.067008 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556898-zbc7v"]
Mar 13 15:04:06 crc kubenswrapper[4898]: I0313 15:04:06.079733 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556898-zbc7v"]
Mar 13 15:04:07 crc kubenswrapper[4898]: I0313 15:04:07.761355 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2033726f-d64f-4989-8837-cec9738c8491" path="/var/lib/kubelet/pods/2033726f-d64f-4989-8837-cec9738c8491/volumes"
Mar 13 15:04:19 crc kubenswrapper[4898]: I0313 15:04:19.134676 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 15:04:19 crc kubenswrapper[4898]: I0313 15:04:19.135320 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 15:04:42 crc kubenswrapper[4898]: I0313 15:04:42.362490 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8hn92"]
Mar 13 15:04:42 crc kubenswrapper[4898]: E0313 15:04:42.363463 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0" containerName="oc"
Mar 13 15:04:42 crc kubenswrapper[4898]: I0313 15:04:42.363479 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0" containerName="oc"
Mar 13 15:04:42 crc kubenswrapper[4898]: I0313 15:04:42.363777 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0" containerName="oc"
Mar 13 15:04:42 crc kubenswrapper[4898]: I0313 15:04:42.365799 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8hn92"
Mar 13 15:04:42 crc kubenswrapper[4898]: I0313 15:04:42.377630 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8hn92"]
Mar 13 15:04:42 crc kubenswrapper[4898]: I0313 15:04:42.478104 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqlmp\" (UniqueName: \"kubernetes.io/projected/c2015d1c-2da3-472c-b07f-3544037bda7b-kube-api-access-rqlmp\") pod \"redhat-operators-8hn92\" (UID: \"c2015d1c-2da3-472c-b07f-3544037bda7b\") " pod="openshift-marketplace/redhat-operators-8hn92"
Mar 13 15:04:42 crc kubenswrapper[4898]: I0313 15:04:42.478189 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2015d1c-2da3-472c-b07f-3544037bda7b-catalog-content\") pod \"redhat-operators-8hn92\" (UID: \"c2015d1c-2da3-472c-b07f-3544037bda7b\") " pod="openshift-marketplace/redhat-operators-8hn92"
Mar 13 15:04:42 crc kubenswrapper[4898]: I0313 15:04:42.478209 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2015d1c-2da3-472c-b07f-3544037bda7b-utilities\") pod \"redhat-operators-8hn92\" (UID: \"c2015d1c-2da3-472c-b07f-3544037bda7b\") " pod="openshift-marketplace/redhat-operators-8hn92"
Mar 13 15:04:42 crc kubenswrapper[4898]: I0313 15:04:42.579977 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqlmp\" (UniqueName: \"kubernetes.io/projected/c2015d1c-2da3-472c-b07f-3544037bda7b-kube-api-access-rqlmp\") pod \"redhat-operators-8hn92\" (UID: \"c2015d1c-2da3-472c-b07f-3544037bda7b\") " pod="openshift-marketplace/redhat-operators-8hn92"
Mar 13 15:04:42 crc kubenswrapper[4898]: I0313 15:04:42.580064 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2015d1c-2da3-472c-b07f-3544037bda7b-catalog-content\") pod \"redhat-operators-8hn92\" (UID: \"c2015d1c-2da3-472c-b07f-3544037bda7b\") " pod="openshift-marketplace/redhat-operators-8hn92"
Mar 13 15:04:42 crc kubenswrapper[4898]: I0313 15:04:42.580085 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2015d1c-2da3-472c-b07f-3544037bda7b-utilities\") pod \"redhat-operators-8hn92\" (UID: \"c2015d1c-2da3-472c-b07f-3544037bda7b\") " pod="openshift-marketplace/redhat-operators-8hn92"
Mar 13 15:04:42 crc kubenswrapper[4898]: I0313 15:04:42.581849 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2015d1c-2da3-472c-b07f-3544037bda7b-catalog-content\") pod \"redhat-operators-8hn92\" (UID: \"c2015d1c-2da3-472c-b07f-3544037bda7b\") " pod="openshift-marketplace/redhat-operators-8hn92"
Mar 13 15:04:42 crc kubenswrapper[4898]: I0313 15:04:42.581884 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2015d1c-2da3-472c-b07f-3544037bda7b-utilities\") pod \"redhat-operators-8hn92\" (UID: \"c2015d1c-2da3-472c-b07f-3544037bda7b\") " pod="openshift-marketplace/redhat-operators-8hn92"
Mar 13 15:04:42 crc kubenswrapper[4898]: I0313 15:04:42.607087 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqlmp\" (UniqueName: \"kubernetes.io/projected/c2015d1c-2da3-472c-b07f-3544037bda7b-kube-api-access-rqlmp\") pod \"redhat-operators-8hn92\" (UID: \"c2015d1c-2da3-472c-b07f-3544037bda7b\") " pod="openshift-marketplace/redhat-operators-8hn92"
Mar 13 15:04:42 crc kubenswrapper[4898]: I0313 15:04:42.703452 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8hn92"
Mar 13 15:04:42 crc kubenswrapper[4898]: I0313 15:04:42.903940 4898 scope.go:117] "RemoveContainer" containerID="e1ed7a0b1ccbf119e01b0fbbab72ef967cfc5a8fef5c4bc80afb9d7eff1e70f1"
Mar 13 15:04:43 crc kubenswrapper[4898]: I0313 15:04:43.161915 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8hn92"]
Mar 13 15:04:43 crc kubenswrapper[4898]: I0313 15:04:43.497999 4898 generic.go:334] "Generic (PLEG): container finished" podID="c2015d1c-2da3-472c-b07f-3544037bda7b" containerID="db7ae25c88961957319f284a4e8d7d972e2cbf22c41613242dfe639a76a9147c" exitCode=0
Mar 13 15:04:43 crc kubenswrapper[4898]: I0313 15:04:43.498044 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hn92" event={"ID":"c2015d1c-2da3-472c-b07f-3544037bda7b","Type":"ContainerDied","Data":"db7ae25c88961957319f284a4e8d7d972e2cbf22c41613242dfe639a76a9147c"}
Mar 13 15:04:43 crc kubenswrapper[4898]: I0313 15:04:43.498076 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hn92" event={"ID":"c2015d1c-2da3-472c-b07f-3544037bda7b","Type":"ContainerStarted","Data":"6879d80bc6035cd951f8f9969f1be4bdabbd3f8ba652e1cb44bfc180caa2b879"}
Mar 13 15:04:44 crc kubenswrapper[4898]: I0313 15:04:44.509430 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hn92" event={"ID":"c2015d1c-2da3-472c-b07f-3544037bda7b","Type":"ContainerStarted","Data":"7da033f322da5fa1fa86a5370629e3354569e0540a2c73e0b1b083de2cd620cf"}
Mar 13 15:04:49 crc kubenswrapper[4898]: I0313 15:04:49.135399 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 15:04:49 crc kubenswrapper[4898]: I0313 15:04:49.135992 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 15:04:50 crc kubenswrapper[4898]: I0313 15:04:50.574312 4898 generic.go:334] "Generic (PLEG): container finished" podID="c2015d1c-2da3-472c-b07f-3544037bda7b" containerID="7da033f322da5fa1fa86a5370629e3354569e0540a2c73e0b1b083de2cd620cf" exitCode=0
Mar 13 15:04:50 crc kubenswrapper[4898]: I0313 15:04:50.574442 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hn92" event={"ID":"c2015d1c-2da3-472c-b07f-3544037bda7b","Type":"ContainerDied","Data":"7da033f322da5fa1fa86a5370629e3354569e0540a2c73e0b1b083de2cd620cf"}
Mar 13 15:04:51 crc kubenswrapper[4898]: I0313 15:04:51.589615 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hn92" event={"ID":"c2015d1c-2da3-472c-b07f-3544037bda7b","Type":"ContainerStarted","Data":"3cb68a76c0bb798950b1b0e18aeabaf2ca9ce4df1058a9db56fbaa44a5f14d1e"}
Mar 13 15:04:51 crc kubenswrapper[4898]: I0313 15:04:51.627757 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8hn92" podStartSLOduration=2.017243809 podStartE2EDuration="9.627729488s" podCreationTimestamp="2026-03-13 15:04:42 +0000 UTC" firstStartedPulling="2026-03-13 15:04:43.499892184 +0000 UTC m=+4118.501480423" lastFinishedPulling="2026-03-13 15:04:51.110377863 +0000 UTC m=+4126.111966102" observedRunningTime="2026-03-13 15:04:51.609018966 +0000 UTC m=+4126.610607215" watchObservedRunningTime="2026-03-13 15:04:51.627729488 +0000 UTC m=+4126.629317747"
Mar 13 15:04:52 crc kubenswrapper[4898]: I0313 15:04:52.703983 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8hn92"
Mar 13 15:04:52 crc kubenswrapper[4898]: I0313 15:04:52.704250 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8hn92"
Mar 13 15:04:53 crc kubenswrapper[4898]: I0313 15:04:53.761794 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8hn92" podUID="c2015d1c-2da3-472c-b07f-3544037bda7b" containerName="registry-server" probeResult="failure" output=<
Mar 13 15:04:53 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s
Mar 13 15:04:53 crc kubenswrapper[4898]: >
Mar 13 15:05:02 crc kubenswrapper[4898]: I0313 15:05:02.594280 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vqvc5"]
Mar 13 15:05:02 crc kubenswrapper[4898]: I0313 15:05:02.597690 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vqvc5"
Mar 13 15:05:02 crc kubenswrapper[4898]: I0313 15:05:02.619486 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vqvc5"]
Mar 13 15:05:02 crc kubenswrapper[4898]: I0313 15:05:02.677777 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b810f672-a1b5-434f-a031-0044957eebda-utilities\") pod \"community-operators-vqvc5\" (UID: \"b810f672-a1b5-434f-a031-0044957eebda\") " pod="openshift-marketplace/community-operators-vqvc5"
Mar 13 15:05:02 crc kubenswrapper[4898]: I0313 15:05:02.678220 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b810f672-a1b5-434f-a031-0044957eebda-catalog-content\") pod \"community-operators-vqvc5\" (UID: \"b810f672-a1b5-434f-a031-0044957eebda\") " pod="openshift-marketplace/community-operators-vqvc5"
Mar 13 15:05:02 crc kubenswrapper[4898]: I0313 15:05:02.678799 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxv6z\" (UniqueName: \"kubernetes.io/projected/b810f672-a1b5-434f-a031-0044957eebda-kube-api-access-lxv6z\") pod \"community-operators-vqvc5\" (UID: \"b810f672-a1b5-434f-a031-0044957eebda\") " pod="openshift-marketplace/community-operators-vqvc5"
Mar 13 15:05:02 crc kubenswrapper[4898]: I0313 15:05:02.759199 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8hn92"
Mar 13 15:05:02 crc kubenswrapper[4898]: I0313 15:05:02.788985 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b810f672-a1b5-434f-a031-0044957eebda-catalog-content\") pod \"community-operators-vqvc5\" (UID: \"b810f672-a1b5-434f-a031-0044957eebda\") " pod="openshift-marketplace/community-operators-vqvc5"
Mar 13 15:05:02 crc kubenswrapper[4898]: I0313 15:05:02.789474 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxv6z\" (UniqueName: \"kubernetes.io/projected/b810f672-a1b5-434f-a031-0044957eebda-kube-api-access-lxv6z\") pod \"community-operators-vqvc5\" (UID: \"b810f672-a1b5-434f-a031-0044957eebda\") " pod="openshift-marketplace/community-operators-vqvc5"
Mar 13 15:05:02 crc kubenswrapper[4898]: I0313 15:05:02.789867 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b810f672-a1b5-434f-a031-0044957eebda-utilities\") pod \"community-operators-vqvc5\" (UID: \"b810f672-a1b5-434f-a031-0044957eebda\") " pod="openshift-marketplace/community-operators-vqvc5"
Mar 13 15:05:02 crc kubenswrapper[4898]: I0313 15:05:02.794431 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b810f672-a1b5-434f-a031-0044957eebda-catalog-content\") pod \"community-operators-vqvc5\" (UID: \"b810f672-a1b5-434f-a031-0044957eebda\") " pod="openshift-marketplace/community-operators-vqvc5"
Mar 13 15:05:02 crc kubenswrapper[4898]: I0313 15:05:02.794585 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b810f672-a1b5-434f-a031-0044957eebda-utilities\") pod \"community-operators-vqvc5\" (UID: \"b810f672-a1b5-434f-a031-0044957eebda\") " pod="openshift-marketplace/community-operators-vqvc5"
Mar 13 15:05:02 crc kubenswrapper[4898]: I0313 15:05:02.836333 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8hn92"
Mar 13 15:05:02 crc kubenswrapper[4898]: I0313 15:05:02.849016 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxv6z\" (UniqueName: \"kubernetes.io/projected/b810f672-a1b5-434f-a031-0044957eebda-kube-api-access-lxv6z\") pod \"community-operators-vqvc5\" (UID: \"b810f672-a1b5-434f-a031-0044957eebda\") " pod="openshift-marketplace/community-operators-vqvc5"
Mar 13 15:05:02 crc kubenswrapper[4898]: I0313 15:05:02.946210 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vqvc5"
Mar 13 15:05:03 crc kubenswrapper[4898]: I0313 15:05:03.699220 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vqvc5"]
Mar 13 15:05:03 crc kubenswrapper[4898]: I0313 15:05:03.724561 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqvc5" event={"ID":"b810f672-a1b5-434f-a031-0044957eebda","Type":"ContainerStarted","Data":"736696e3e2832f751d0808f40db57e7195351948569adaeaac99ff9a6c9bc2af"}
Mar 13 15:05:04 crc kubenswrapper[4898]: I0313 15:05:04.734568 4898 generic.go:334] "Generic (PLEG): container finished" podID="b810f672-a1b5-434f-a031-0044957eebda" containerID="047cf2dc69ea2109aa3c486de3d364b7f183dce862049d2fa79df29daa715c7f" exitCode=0
Mar 13 15:05:04 crc kubenswrapper[4898]: I0313 15:05:04.734740 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqvc5" event={"ID":"b810f672-a1b5-434f-a031-0044957eebda","Type":"ContainerDied","Data":"047cf2dc69ea2109aa3c486de3d364b7f183dce862049d2fa79df29daa715c7f"}
Mar 13 15:05:04 crc kubenswrapper[4898]: I0313 15:05:04.736822 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 13 15:05:05 crc kubenswrapper[4898]: I0313 15:05:05.154931 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8hn92"]
Mar 13 15:05:05 crc kubenswrapper[4898]: I0313 15:05:05.155420 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8hn92" podUID="c2015d1c-2da3-472c-b07f-3544037bda7b" containerName="registry-server" containerID="cri-o://3cb68a76c0bb798950b1b0e18aeabaf2ca9ce4df1058a9db56fbaa44a5f14d1e" gracePeriod=2
Mar 13 15:05:05 crc kubenswrapper[4898]: I0313 15:05:05.758324 4898 generic.go:334] "Generic (PLEG): container finished" podID="c2015d1c-2da3-472c-b07f-3544037bda7b" containerID="3cb68a76c0bb798950b1b0e18aeabaf2ca9ce4df1058a9db56fbaa44a5f14d1e" exitCode=0
Mar 13 15:05:05 crc kubenswrapper[4898]: I0313 15:05:05.760563 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hn92" event={"ID":"c2015d1c-2da3-472c-b07f-3544037bda7b","Type":"ContainerDied","Data":"3cb68a76c0bb798950b1b0e18aeabaf2ca9ce4df1058a9db56fbaa44a5f14d1e"}
Mar 13 15:05:06 crc kubenswrapper[4898]: I0313 15:05:06.383419 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8hn92"
Mar 13 15:05:06 crc kubenswrapper[4898]: I0313 15:05:06.478956 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqlmp\" (UniqueName: \"kubernetes.io/projected/c2015d1c-2da3-472c-b07f-3544037bda7b-kube-api-access-rqlmp\") pod \"c2015d1c-2da3-472c-b07f-3544037bda7b\" (UID: \"c2015d1c-2da3-472c-b07f-3544037bda7b\") "
Mar 13 15:05:06 crc kubenswrapper[4898]: I0313 15:05:06.479059 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2015d1c-2da3-472c-b07f-3544037bda7b-utilities\") pod \"c2015d1c-2da3-472c-b07f-3544037bda7b\" (UID: \"c2015d1c-2da3-472c-b07f-3544037bda7b\") "
Mar 13 15:05:06 crc kubenswrapper[4898]: I0313 15:05:06.479122 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2015d1c-2da3-472c-b07f-3544037bda7b-catalog-content\") pod \"c2015d1c-2da3-472c-b07f-3544037bda7b\" (UID: \"c2015d1c-2da3-472c-b07f-3544037bda7b\") "
Mar 13 15:05:06 crc kubenswrapper[4898]: I0313 15:05:06.480937 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2015d1c-2da3-472c-b07f-3544037bda7b-utilities" (OuterVolumeSpecName: "utilities") pod "c2015d1c-2da3-472c-b07f-3544037bda7b" (UID: "c2015d1c-2da3-472c-b07f-3544037bda7b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:05:06 crc kubenswrapper[4898]: I0313 15:05:06.485851 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2015d1c-2da3-472c-b07f-3544037bda7b-kube-api-access-rqlmp" (OuterVolumeSpecName: "kube-api-access-rqlmp") pod "c2015d1c-2da3-472c-b07f-3544037bda7b" (UID: "c2015d1c-2da3-472c-b07f-3544037bda7b"). InnerVolumeSpecName "kube-api-access-rqlmp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:05:06 crc kubenswrapper[4898]: I0313 15:05:06.581168 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqlmp\" (UniqueName: \"kubernetes.io/projected/c2015d1c-2da3-472c-b07f-3544037bda7b-kube-api-access-rqlmp\") on node \"crc\" DevicePath \"\""
Mar 13 15:05:06 crc kubenswrapper[4898]: I0313 15:05:06.581421 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2015d1c-2da3-472c-b07f-3544037bda7b-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 15:05:06 crc kubenswrapper[4898]: I0313 15:05:06.620092 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2015d1c-2da3-472c-b07f-3544037bda7b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2015d1c-2da3-472c-b07f-3544037bda7b" (UID: "c2015d1c-2da3-472c-b07f-3544037bda7b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:05:06 crc kubenswrapper[4898]: I0313 15:05:06.685009 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2015d1c-2da3-472c-b07f-3544037bda7b-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 15:05:06 crc kubenswrapper[4898]: I0313 15:05:06.772073 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hn92" event={"ID":"c2015d1c-2da3-472c-b07f-3544037bda7b","Type":"ContainerDied","Data":"6879d80bc6035cd951f8f9969f1be4bdabbd3f8ba652e1cb44bfc180caa2b879"}
Mar 13 15:05:06 crc kubenswrapper[4898]: I0313 15:05:06.772131 4898 scope.go:117] "RemoveContainer" containerID="3cb68a76c0bb798950b1b0e18aeabaf2ca9ce4df1058a9db56fbaa44a5f14d1e"
Mar 13 15:05:06 crc kubenswrapper[4898]: I0313 15:05:06.772290 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8hn92"
Mar 13 15:05:06 crc kubenswrapper[4898]: I0313 15:05:06.830062 4898 scope.go:117] "RemoveContainer" containerID="7da033f322da5fa1fa86a5370629e3354569e0540a2c73e0b1b083de2cd620cf"
Mar 13 15:05:06 crc kubenswrapper[4898]: I0313 15:05:06.843840 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8hn92"]
Mar 13 15:05:06 crc kubenswrapper[4898]: I0313 15:05:06.858876 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8hn92"]
Mar 13 15:05:06 crc kubenswrapper[4898]: I0313 15:05:06.937070 4898 scope.go:117] "RemoveContainer" containerID="db7ae25c88961957319f284a4e8d7d972e2cbf22c41613242dfe639a76a9147c"
Mar 13 15:05:07 crc kubenswrapper[4898]: I0313 15:05:07.763451 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2015d1c-2da3-472c-b07f-3544037bda7b" path="/var/lib/kubelet/pods/c2015d1c-2da3-472c-b07f-3544037bda7b/volumes"
Mar 13 15:05:08 crc kubenswrapper[4898]: I0313 15:05:08.809250 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqvc5" event={"ID":"b810f672-a1b5-434f-a031-0044957eebda","Type":"ContainerStarted","Data":"fc635bce94b6959b5176764a3e3eed24166f3f7903704ae4383b8bd39ab0ccdc"}
Mar 13 15:05:11 crc kubenswrapper[4898]: I0313 15:05:11.842999 4898 generic.go:334] "Generic (PLEG): container finished" podID="b810f672-a1b5-434f-a031-0044957eebda" containerID="fc635bce94b6959b5176764a3e3eed24166f3f7903704ae4383b8bd39ab0ccdc" exitCode=0
Mar 13 15:05:11 crc kubenswrapper[4898]: I0313 15:05:11.843605 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqvc5" event={"ID":"b810f672-a1b5-434f-a031-0044957eebda","Type":"ContainerDied","Data":"fc635bce94b6959b5176764a3e3eed24166f3f7903704ae4383b8bd39ab0ccdc"}
Mar 13 15:05:13 crc kubenswrapper[4898]: I0313 15:05:13.866439 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqvc5" event={"ID":"b810f672-a1b5-434f-a031-0044957eebda","Type":"ContainerStarted","Data":"fa0139e782cd3c4bbf091849cf8ad406e30cf1ffee2cf32b66a02f6f8c182df0"}
Mar 13 15:05:13 crc kubenswrapper[4898]: I0313 15:05:13.890217 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vqvc5" podStartSLOduration=4.073507514 podStartE2EDuration="11.890189844s" podCreationTimestamp="2026-03-13 15:05:02 +0000 UTC" firstStartedPulling="2026-03-13 15:05:04.736502268 +0000 UTC m=+4139.738090517" lastFinishedPulling="2026-03-13 15:05:12.553184598 +0000 UTC m=+4147.554772847" observedRunningTime="2026-03-13 15:05:13.885844817 +0000 UTC m=+4148.887433096" watchObservedRunningTime="2026-03-13 15:05:13.890189844 +0000 UTC m=+4148.891778113"
Mar 13 15:05:19 crc kubenswrapper[4898]: I0313 15:05:19.134060 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 15:05:19 crc kubenswrapper[4898]: I0313 15:05:19.134735 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 15:05:19 crc kubenswrapper[4898]: I0313 15:05:19.134802 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj"
Mar 13 15:05:19 crc kubenswrapper[4898]: I0313 15:05:19.136195 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"45a214eae7c7c7859e650d708f47b93bb385cc743f9e3e627bde30ab582e5ff5"} pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 13 15:05:19 crc kubenswrapper[4898]: I0313 15:05:19.136306 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" containerID="cri-o://45a214eae7c7c7859e650d708f47b93bb385cc743f9e3e627bde30ab582e5ff5" gracePeriod=600
Mar 13 15:05:20 crc kubenswrapper[4898]: I0313 15:05:20.959295 4898 generic.go:334] "Generic (PLEG): container finished" podID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerID="45a214eae7c7c7859e650d708f47b93bb385cc743f9e3e627bde30ab582e5ff5" exitCode=0
Mar 13 15:05:20 crc kubenswrapper[4898]: I0313 15:05:20.959363 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerDied","Data":"45a214eae7c7c7859e650d708f47b93bb385cc743f9e3e627bde30ab582e5ff5"}
Mar 13 15:05:20 crc kubenswrapper[4898]: I0313 15:05:20.959757 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a"}
Mar 13 15:05:20 crc kubenswrapper[4898]: I0313 15:05:20.959781 4898 scope.go:117] "RemoveContainer" containerID="e2755f8bb613c4ca067aa4bb55241b92843fd6eb20b3661f1bc3e09f0a4a33da"
Mar 13 15:05:22 crc kubenswrapper[4898]: I0313 15:05:22.947112 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vqvc5"
Mar 13 15:05:22 crc kubenswrapper[4898]: I0313 15:05:22.947671 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vqvc5"
Mar 13 15:05:23 crc kubenswrapper[4898]: I0313 15:05:23.391372 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vqvc5"
Mar 13 15:05:23 crc kubenswrapper[4898]: I0313 15:05:23.475505 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vqvc5"
Mar 13 15:05:23 crc kubenswrapper[4898]: I0313 15:05:23.657364 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vqvc5"]
Mar 13 15:05:25 crc kubenswrapper[4898]: I0313 15:05:25.008245 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vqvc5" podUID="b810f672-a1b5-434f-a031-0044957eebda" containerName="registry-server" containerID="cri-o://fa0139e782cd3c4bbf091849cf8ad406e30cf1ffee2cf32b66a02f6f8c182df0" gracePeriod=2
Mar 13 15:05:26 crc kubenswrapper[4898]: I0313 15:05:26.031450 4898 generic.go:334] "Generic (PLEG): container finished" podID="b810f672-a1b5-434f-a031-0044957eebda" containerID="fa0139e782cd3c4bbf091849cf8ad406e30cf1ffee2cf32b66a02f6f8c182df0" exitCode=0
Mar 13 15:05:26 crc kubenswrapper[4898]: I0313 15:05:26.031546 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqvc5" event={"ID":"b810f672-a1b5-434f-a031-0044957eebda","Type":"ContainerDied","Data":"fa0139e782cd3c4bbf091849cf8ad406e30cf1ffee2cf32b66a02f6f8c182df0"}
Mar 13 15:05:26 crc kubenswrapper[4898]: I0313 15:05:26.321421 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vqvc5"
Mar 13 15:05:26 crc kubenswrapper[4898]: I0313 15:05:26.439577 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b810f672-a1b5-434f-a031-0044957eebda-utilities\") pod \"b810f672-a1b5-434f-a031-0044957eebda\" (UID: \"b810f672-a1b5-434f-a031-0044957eebda\") "
Mar 13 15:05:26 crc kubenswrapper[4898]: I0313 15:05:26.440061 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b810f672-a1b5-434f-a031-0044957eebda-catalog-content\") pod \"b810f672-a1b5-434f-a031-0044957eebda\" (UID: \"b810f672-a1b5-434f-a031-0044957eebda\") "
Mar 13 15:05:26 crc kubenswrapper[4898]: I0313 15:05:26.440095 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxv6z\" (UniqueName: \"kubernetes.io/projected/b810f672-a1b5-434f-a031-0044957eebda-kube-api-access-lxv6z\") pod \"b810f672-a1b5-434f-a031-0044957eebda\" (UID: \"b810f672-a1b5-434f-a031-0044957eebda\") "
Mar 13 15:05:26 crc kubenswrapper[4898]: I0313 15:05:26.440439 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b810f672-a1b5-434f-a031-0044957eebda-utilities" (OuterVolumeSpecName: "utilities") pod "b810f672-a1b5-434f-a031-0044957eebda" (UID: "b810f672-a1b5-434f-a031-0044957eebda"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:05:26 crc kubenswrapper[4898]: I0313 15:05:26.442201 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b810f672-a1b5-434f-a031-0044957eebda-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 15:05:26 crc kubenswrapper[4898]: I0313 15:05:26.447439 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b810f672-a1b5-434f-a031-0044957eebda-kube-api-access-lxv6z" (OuterVolumeSpecName: "kube-api-access-lxv6z") pod "b810f672-a1b5-434f-a031-0044957eebda" (UID: "b810f672-a1b5-434f-a031-0044957eebda"). InnerVolumeSpecName "kube-api-access-lxv6z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:05:26 crc kubenswrapper[4898]: I0313 15:05:26.504227 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b810f672-a1b5-434f-a031-0044957eebda-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b810f672-a1b5-434f-a031-0044957eebda" (UID: "b810f672-a1b5-434f-a031-0044957eebda"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:05:26 crc kubenswrapper[4898]: I0313 15:05:26.543812 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b810f672-a1b5-434f-a031-0044957eebda-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 15:05:26 crc kubenswrapper[4898]: I0313 15:05:26.543851 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxv6z\" (UniqueName: \"kubernetes.io/projected/b810f672-a1b5-434f-a031-0044957eebda-kube-api-access-lxv6z\") on node \"crc\" DevicePath \"\"" Mar 13 15:05:27 crc kubenswrapper[4898]: I0313 15:05:27.048336 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqvc5" event={"ID":"b810f672-a1b5-434f-a031-0044957eebda","Type":"ContainerDied","Data":"736696e3e2832f751d0808f40db57e7195351948569adaeaac99ff9a6c9bc2af"} Mar 13 15:05:27 crc kubenswrapper[4898]: I0313 15:05:27.048409 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vqvc5" Mar 13 15:05:27 crc kubenswrapper[4898]: I0313 15:05:27.048677 4898 scope.go:117] "RemoveContainer" containerID="fa0139e782cd3c4bbf091849cf8ad406e30cf1ffee2cf32b66a02f6f8c182df0" Mar 13 15:05:27 crc kubenswrapper[4898]: I0313 15:05:27.079054 4898 scope.go:117] "RemoveContainer" containerID="fc635bce94b6959b5176764a3e3eed24166f3f7903704ae4383b8bd39ab0ccdc" Mar 13 15:05:27 crc kubenswrapper[4898]: I0313 15:05:27.112616 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vqvc5"] Mar 13 15:05:27 crc kubenswrapper[4898]: I0313 15:05:27.121295 4898 scope.go:117] "RemoveContainer" containerID="047cf2dc69ea2109aa3c486de3d364b7f183dce862049d2fa79df29daa715c7f" Mar 13 15:05:27 crc kubenswrapper[4898]: I0313 15:05:27.141009 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vqvc5"] Mar 13 15:05:27 crc kubenswrapper[4898]: I0313 15:05:27.760308 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b810f672-a1b5-434f-a031-0044957eebda" path="/var/lib/kubelet/pods/b810f672-a1b5-434f-a031-0044957eebda/volumes" Mar 13 15:06:00 crc kubenswrapper[4898]: I0313 15:06:00.162270 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556906-m4tq8"] Mar 13 15:06:00 crc kubenswrapper[4898]: E0313 15:06:00.163646 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2015d1c-2da3-472c-b07f-3544037bda7b" containerName="extract-utilities" Mar 13 15:06:00 crc kubenswrapper[4898]: I0313 15:06:00.163670 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2015d1c-2da3-472c-b07f-3544037bda7b" containerName="extract-utilities" Mar 13 15:06:00 crc kubenswrapper[4898]: E0313 15:06:00.163720 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b810f672-a1b5-434f-a031-0044957eebda" containerName="extract-utilities" Mar 13 
15:06:00 crc kubenswrapper[4898]: I0313 15:06:00.163733 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b810f672-a1b5-434f-a031-0044957eebda" containerName="extract-utilities" Mar 13 15:06:00 crc kubenswrapper[4898]: E0313 15:06:00.163759 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b810f672-a1b5-434f-a031-0044957eebda" containerName="extract-content" Mar 13 15:06:00 crc kubenswrapper[4898]: I0313 15:06:00.163769 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b810f672-a1b5-434f-a031-0044957eebda" containerName="extract-content" Mar 13 15:06:00 crc kubenswrapper[4898]: E0313 15:06:00.163801 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2015d1c-2da3-472c-b07f-3544037bda7b" containerName="extract-content" Mar 13 15:06:00 crc kubenswrapper[4898]: I0313 15:06:00.163811 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2015d1c-2da3-472c-b07f-3544037bda7b" containerName="extract-content" Mar 13 15:06:00 crc kubenswrapper[4898]: E0313 15:06:00.163834 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2015d1c-2da3-472c-b07f-3544037bda7b" containerName="registry-server" Mar 13 15:06:00 crc kubenswrapper[4898]: I0313 15:06:00.163844 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2015d1c-2da3-472c-b07f-3544037bda7b" containerName="registry-server" Mar 13 15:06:00 crc kubenswrapper[4898]: E0313 15:06:00.163859 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b810f672-a1b5-434f-a031-0044957eebda" containerName="registry-server" Mar 13 15:06:00 crc kubenswrapper[4898]: I0313 15:06:00.163869 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="b810f672-a1b5-434f-a031-0044957eebda" containerName="registry-server" Mar 13 15:06:00 crc kubenswrapper[4898]: I0313 15:06:00.164287 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="b810f672-a1b5-434f-a031-0044957eebda" containerName="registry-server" Mar 13 15:06:00 crc 
kubenswrapper[4898]: I0313 15:06:00.164310 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2015d1c-2da3-472c-b07f-3544037bda7b" containerName="registry-server" Mar 13 15:06:00 crc kubenswrapper[4898]: I0313 15:06:00.165654 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556906-m4tq8" Mar 13 15:06:00 crc kubenswrapper[4898]: I0313 15:06:00.169242 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 15:06:00 crc kubenswrapper[4898]: I0313 15:06:00.172617 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:06:00 crc kubenswrapper[4898]: I0313 15:06:00.172686 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:06:00 crc kubenswrapper[4898]: I0313 15:06:00.181145 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556906-m4tq8"] Mar 13 15:06:00 crc kubenswrapper[4898]: I0313 15:06:00.283524 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wv6zm\" (UniqueName: \"kubernetes.io/projected/3ee94077-8dd9-4144-bab5-2abd9744fa01-kube-api-access-wv6zm\") pod \"auto-csr-approver-29556906-m4tq8\" (UID: \"3ee94077-8dd9-4144-bab5-2abd9744fa01\") " pod="openshift-infra/auto-csr-approver-29556906-m4tq8" Mar 13 15:06:00 crc kubenswrapper[4898]: I0313 15:06:00.385683 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wv6zm\" (UniqueName: \"kubernetes.io/projected/3ee94077-8dd9-4144-bab5-2abd9744fa01-kube-api-access-wv6zm\") pod \"auto-csr-approver-29556906-m4tq8\" (UID: \"3ee94077-8dd9-4144-bab5-2abd9744fa01\") " pod="openshift-infra/auto-csr-approver-29556906-m4tq8" Mar 13 15:06:01 crc kubenswrapper[4898]: I0313 15:06:01.031803 
4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wv6zm\" (UniqueName: \"kubernetes.io/projected/3ee94077-8dd9-4144-bab5-2abd9744fa01-kube-api-access-wv6zm\") pod \"auto-csr-approver-29556906-m4tq8\" (UID: \"3ee94077-8dd9-4144-bab5-2abd9744fa01\") " pod="openshift-infra/auto-csr-approver-29556906-m4tq8" Mar 13 15:06:01 crc kubenswrapper[4898]: I0313 15:06:01.097597 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556906-m4tq8" Mar 13 15:06:01 crc kubenswrapper[4898]: I0313 15:06:01.656517 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556906-m4tq8"] Mar 13 15:06:02 crc kubenswrapper[4898]: I0313 15:06:02.506132 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556906-m4tq8" event={"ID":"3ee94077-8dd9-4144-bab5-2abd9744fa01","Type":"ContainerStarted","Data":"e8a5dc0ee47b3e7543d44305fe662f785fbd35a6c3446f7da2d5bd45d5d5b1a4"} Mar 13 15:06:03 crc kubenswrapper[4898]: I0313 15:06:03.518486 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556906-m4tq8" event={"ID":"3ee94077-8dd9-4144-bab5-2abd9744fa01","Type":"ContainerStarted","Data":"64c72ba59c55becc3958f8ce26ee068bc7e3f6141dbbe5fe4984f9a120884dca"} Mar 13 15:06:03 crc kubenswrapper[4898]: I0313 15:06:03.548128 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556906-m4tq8" podStartSLOduration=2.653404021 podStartE2EDuration="3.548105435s" podCreationTimestamp="2026-03-13 15:06:00 +0000 UTC" firstStartedPulling="2026-03-13 15:06:01.676047617 +0000 UTC m=+4196.677635856" lastFinishedPulling="2026-03-13 15:06:02.570749031 +0000 UTC m=+4197.572337270" observedRunningTime="2026-03-13 15:06:03.535007802 +0000 UTC m=+4198.536596061" watchObservedRunningTime="2026-03-13 15:06:03.548105435 +0000 UTC m=+4198.549693684" 
Mar 13 15:06:04 crc kubenswrapper[4898]: I0313 15:06:04.533262 4898 generic.go:334] "Generic (PLEG): container finished" podID="3ee94077-8dd9-4144-bab5-2abd9744fa01" containerID="64c72ba59c55becc3958f8ce26ee068bc7e3f6141dbbe5fe4984f9a120884dca" exitCode=0 Mar 13 15:06:04 crc kubenswrapper[4898]: I0313 15:06:04.533350 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556906-m4tq8" event={"ID":"3ee94077-8dd9-4144-bab5-2abd9744fa01","Type":"ContainerDied","Data":"64c72ba59c55becc3958f8ce26ee068bc7e3f6141dbbe5fe4984f9a120884dca"} Mar 13 15:06:05 crc kubenswrapper[4898]: I0313 15:06:05.963830 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556906-m4tq8" Mar 13 15:06:06 crc kubenswrapper[4898]: I0313 15:06:06.018050 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wv6zm\" (UniqueName: \"kubernetes.io/projected/3ee94077-8dd9-4144-bab5-2abd9744fa01-kube-api-access-wv6zm\") pod \"3ee94077-8dd9-4144-bab5-2abd9744fa01\" (UID: \"3ee94077-8dd9-4144-bab5-2abd9744fa01\") " Mar 13 15:06:06 crc kubenswrapper[4898]: I0313 15:06:06.023453 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ee94077-8dd9-4144-bab5-2abd9744fa01-kube-api-access-wv6zm" (OuterVolumeSpecName: "kube-api-access-wv6zm") pod "3ee94077-8dd9-4144-bab5-2abd9744fa01" (UID: "3ee94077-8dd9-4144-bab5-2abd9744fa01"). InnerVolumeSpecName "kube-api-access-wv6zm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:06:06 crc kubenswrapper[4898]: I0313 15:06:06.120662 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wv6zm\" (UniqueName: \"kubernetes.io/projected/3ee94077-8dd9-4144-bab5-2abd9744fa01-kube-api-access-wv6zm\") on node \"crc\" DevicePath \"\"" Mar 13 15:06:06 crc kubenswrapper[4898]: I0313 15:06:06.559225 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556906-m4tq8" event={"ID":"3ee94077-8dd9-4144-bab5-2abd9744fa01","Type":"ContainerDied","Data":"e8a5dc0ee47b3e7543d44305fe662f785fbd35a6c3446f7da2d5bd45d5d5b1a4"} Mar 13 15:06:06 crc kubenswrapper[4898]: I0313 15:06:06.559579 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8a5dc0ee47b3e7543d44305fe662f785fbd35a6c3446f7da2d5bd45d5d5b1a4" Mar 13 15:06:06 crc kubenswrapper[4898]: I0313 15:06:06.559284 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556906-m4tq8" Mar 13 15:06:06 crc kubenswrapper[4898]: I0313 15:06:06.617747 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556900-qqdxv"] Mar 13 15:06:06 crc kubenswrapper[4898]: I0313 15:06:06.632215 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556900-qqdxv"] Mar 13 15:06:07 crc kubenswrapper[4898]: I0313 15:06:07.756115 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0efa686-df70-493a-92dc-90db2ee67205" path="/var/lib/kubelet/pods/b0efa686-df70-493a-92dc-90db2ee67205/volumes" Mar 13 15:06:26 crc kubenswrapper[4898]: E0313 15:06:26.480934 4898 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.201:59418->38.102.83.201:43395: write tcp 38.102.83.201:59418->38.102.83.201:43395: write: connection reset by peer Mar 13 15:06:43 crc kubenswrapper[4898]: I0313 
15:06:43.114369 4898 scope.go:117] "RemoveContainer" containerID="cc698733d50d55655a41f78a9335173ab8803d13e8dc8d8d6d62ee958bbac18b" Mar 13 15:07:03 crc kubenswrapper[4898]: E0313 15:07:03.315837 4898 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.201:49748->38.102.83.201:43395: write tcp 38.102.83.201:49748->38.102.83.201:43395: write: broken pipe Mar 13 15:07:49 crc kubenswrapper[4898]: I0313 15:07:49.134304 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 15:07:49 crc kubenswrapper[4898]: I0313 15:07:49.135112 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:08:00 crc kubenswrapper[4898]: I0313 15:08:00.144751 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556908-vplt7"] Mar 13 15:08:00 crc kubenswrapper[4898]: E0313 15:08:00.146235 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ee94077-8dd9-4144-bab5-2abd9744fa01" containerName="oc" Mar 13 15:08:00 crc kubenswrapper[4898]: I0313 15:08:00.146258 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ee94077-8dd9-4144-bab5-2abd9744fa01" containerName="oc" Mar 13 15:08:00 crc kubenswrapper[4898]: I0313 15:08:00.146716 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ee94077-8dd9-4144-bab5-2abd9744fa01" containerName="oc" Mar 13 15:08:00 crc kubenswrapper[4898]: I0313 15:08:00.148068 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556908-vplt7" Mar 13 15:08:00 crc kubenswrapper[4898]: I0313 15:08:00.151073 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:08:00 crc kubenswrapper[4898]: I0313 15:08:00.151076 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:08:00 crc kubenswrapper[4898]: I0313 15:08:00.151085 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 15:08:00 crc kubenswrapper[4898]: I0313 15:08:00.155832 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556908-vplt7"] Mar 13 15:08:00 crc kubenswrapper[4898]: I0313 15:08:00.305386 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p8sw\" (UniqueName: \"kubernetes.io/projected/df9e42bc-c4a2-4ccc-ad85-5ca077abfd88-kube-api-access-4p8sw\") pod \"auto-csr-approver-29556908-vplt7\" (UID: \"df9e42bc-c4a2-4ccc-ad85-5ca077abfd88\") " pod="openshift-infra/auto-csr-approver-29556908-vplt7" Mar 13 15:08:00 crc kubenswrapper[4898]: I0313 15:08:00.407320 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p8sw\" (UniqueName: \"kubernetes.io/projected/df9e42bc-c4a2-4ccc-ad85-5ca077abfd88-kube-api-access-4p8sw\") pod \"auto-csr-approver-29556908-vplt7\" (UID: \"df9e42bc-c4a2-4ccc-ad85-5ca077abfd88\") " pod="openshift-infra/auto-csr-approver-29556908-vplt7" Mar 13 15:08:00 crc kubenswrapper[4898]: I0313 15:08:00.436389 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p8sw\" (UniqueName: \"kubernetes.io/projected/df9e42bc-c4a2-4ccc-ad85-5ca077abfd88-kube-api-access-4p8sw\") pod \"auto-csr-approver-29556908-vplt7\" (UID: \"df9e42bc-c4a2-4ccc-ad85-5ca077abfd88\") " 
pod="openshift-infra/auto-csr-approver-29556908-vplt7" Mar 13 15:08:00 crc kubenswrapper[4898]: I0313 15:08:00.483067 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556908-vplt7" Mar 13 15:08:00 crc kubenswrapper[4898]: I0313 15:08:00.993911 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556908-vplt7"] Mar 13 15:08:01 crc kubenswrapper[4898]: W0313 15:08:01.010510 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf9e42bc_c4a2_4ccc_ad85_5ca077abfd88.slice/crio-b3b0e247f2a8ff2707210a72a28118243a8690048dfda2f2b07df45c3e2a2ff4 WatchSource:0}: Error finding container b3b0e247f2a8ff2707210a72a28118243a8690048dfda2f2b07df45c3e2a2ff4: Status 404 returned error can't find the container with id b3b0e247f2a8ff2707210a72a28118243a8690048dfda2f2b07df45c3e2a2ff4 Mar 13 15:08:01 crc kubenswrapper[4898]: I0313 15:08:01.044213 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556908-vplt7" event={"ID":"df9e42bc-c4a2-4ccc-ad85-5ca077abfd88","Type":"ContainerStarted","Data":"b3b0e247f2a8ff2707210a72a28118243a8690048dfda2f2b07df45c3e2a2ff4"} Mar 13 15:08:02 crc kubenswrapper[4898]: I0313 15:08:02.287660 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-44hx4"] Mar 13 15:08:02 crc kubenswrapper[4898]: I0313 15:08:02.290345 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-44hx4" Mar 13 15:08:02 crc kubenswrapper[4898]: I0313 15:08:02.311024 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-44hx4"] Mar 13 15:08:02 crc kubenswrapper[4898]: I0313 15:08:02.459014 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4efa3f00-c382-4542-b865-48ff26f025ca-catalog-content\") pod \"certified-operators-44hx4\" (UID: \"4efa3f00-c382-4542-b865-48ff26f025ca\") " pod="openshift-marketplace/certified-operators-44hx4" Mar 13 15:08:02 crc kubenswrapper[4898]: I0313 15:08:02.459202 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4efa3f00-c382-4542-b865-48ff26f025ca-utilities\") pod \"certified-operators-44hx4\" (UID: \"4efa3f00-c382-4542-b865-48ff26f025ca\") " pod="openshift-marketplace/certified-operators-44hx4" Mar 13 15:08:02 crc kubenswrapper[4898]: I0313 15:08:02.459258 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjkvv\" (UniqueName: \"kubernetes.io/projected/4efa3f00-c382-4542-b865-48ff26f025ca-kube-api-access-sjkvv\") pod \"certified-operators-44hx4\" (UID: \"4efa3f00-c382-4542-b865-48ff26f025ca\") " pod="openshift-marketplace/certified-operators-44hx4" Mar 13 15:08:02 crc kubenswrapper[4898]: I0313 15:08:02.561264 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4efa3f00-c382-4542-b865-48ff26f025ca-catalog-content\") pod \"certified-operators-44hx4\" (UID: \"4efa3f00-c382-4542-b865-48ff26f025ca\") " pod="openshift-marketplace/certified-operators-44hx4" Mar 13 15:08:02 crc kubenswrapper[4898]: I0313 15:08:02.561348 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4efa3f00-c382-4542-b865-48ff26f025ca-utilities\") pod \"certified-operators-44hx4\" (UID: \"4efa3f00-c382-4542-b865-48ff26f025ca\") " pod="openshift-marketplace/certified-operators-44hx4" Mar 13 15:08:02 crc kubenswrapper[4898]: I0313 15:08:02.561401 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjkvv\" (UniqueName: \"kubernetes.io/projected/4efa3f00-c382-4542-b865-48ff26f025ca-kube-api-access-sjkvv\") pod \"certified-operators-44hx4\" (UID: \"4efa3f00-c382-4542-b865-48ff26f025ca\") " pod="openshift-marketplace/certified-operators-44hx4" Mar 13 15:08:02 crc kubenswrapper[4898]: I0313 15:08:02.564128 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4efa3f00-c382-4542-b865-48ff26f025ca-catalog-content\") pod \"certified-operators-44hx4\" (UID: \"4efa3f00-c382-4542-b865-48ff26f025ca\") " pod="openshift-marketplace/certified-operators-44hx4" Mar 13 15:08:02 crc kubenswrapper[4898]: I0313 15:08:02.564528 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4efa3f00-c382-4542-b865-48ff26f025ca-utilities\") pod \"certified-operators-44hx4\" (UID: \"4efa3f00-c382-4542-b865-48ff26f025ca\") " pod="openshift-marketplace/certified-operators-44hx4" Mar 13 15:08:02 crc kubenswrapper[4898]: I0313 15:08:02.584806 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjkvv\" (UniqueName: \"kubernetes.io/projected/4efa3f00-c382-4542-b865-48ff26f025ca-kube-api-access-sjkvv\") pod \"certified-operators-44hx4\" (UID: \"4efa3f00-c382-4542-b865-48ff26f025ca\") " pod="openshift-marketplace/certified-operators-44hx4" Mar 13 15:08:02 crc kubenswrapper[4898]: I0313 15:08:02.614684 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-44hx4" Mar 13 15:08:03 crc kubenswrapper[4898]: I0313 15:08:03.174566 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-44hx4"] Mar 13 15:08:04 crc kubenswrapper[4898]: I0313 15:08:04.084647 4898 generic.go:334] "Generic (PLEG): container finished" podID="4efa3f00-c382-4542-b865-48ff26f025ca" containerID="03c63c680d08b74688a5886d67a5bd28ade094556ecdf764bb1054af23202d0e" exitCode=0 Mar 13 15:08:04 crc kubenswrapper[4898]: I0313 15:08:04.084709 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-44hx4" event={"ID":"4efa3f00-c382-4542-b865-48ff26f025ca","Type":"ContainerDied","Data":"03c63c680d08b74688a5886d67a5bd28ade094556ecdf764bb1054af23202d0e"} Mar 13 15:08:04 crc kubenswrapper[4898]: I0313 15:08:04.084991 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-44hx4" event={"ID":"4efa3f00-c382-4542-b865-48ff26f025ca","Type":"ContainerStarted","Data":"5640eeafae0e26e70bf1dca59cc0e1213919ac82acbc06ec1ff1facea138314b"} Mar 13 15:08:04 crc kubenswrapper[4898]: I0313 15:08:04.089246 4898 generic.go:334] "Generic (PLEG): container finished" podID="df9e42bc-c4a2-4ccc-ad85-5ca077abfd88" containerID="020c072c01677481578a21e99a6c39f8522847520765e8695da295955dd3e290" exitCode=0 Mar 13 15:08:04 crc kubenswrapper[4898]: I0313 15:08:04.089320 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556908-vplt7" event={"ID":"df9e42bc-c4a2-4ccc-ad85-5ca077abfd88","Type":"ContainerDied","Data":"020c072c01677481578a21e99a6c39f8522847520765e8695da295955dd3e290"} Mar 13 15:08:05 crc kubenswrapper[4898]: I0313 15:08:05.103618 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-44hx4" 
event={"ID":"4efa3f00-c382-4542-b865-48ff26f025ca","Type":"ContainerStarted","Data":"ee44e53cd73fb17cb1edb64dad8d39faf007a6f2df230042e5915e9f66b123b3"} Mar 13 15:08:05 crc kubenswrapper[4898]: I0313 15:08:05.517020 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556908-vplt7" Mar 13 15:08:05 crc kubenswrapper[4898]: I0313 15:08:05.707893 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p8sw\" (UniqueName: \"kubernetes.io/projected/df9e42bc-c4a2-4ccc-ad85-5ca077abfd88-kube-api-access-4p8sw\") pod \"df9e42bc-c4a2-4ccc-ad85-5ca077abfd88\" (UID: \"df9e42bc-c4a2-4ccc-ad85-5ca077abfd88\") " Mar 13 15:08:05 crc kubenswrapper[4898]: I0313 15:08:05.728762 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df9e42bc-c4a2-4ccc-ad85-5ca077abfd88-kube-api-access-4p8sw" (OuterVolumeSpecName: "kube-api-access-4p8sw") pod "df9e42bc-c4a2-4ccc-ad85-5ca077abfd88" (UID: "df9e42bc-c4a2-4ccc-ad85-5ca077abfd88"). InnerVolumeSpecName "kube-api-access-4p8sw". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:08:05 crc kubenswrapper[4898]: I0313 15:08:05.812698 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4p8sw\" (UniqueName: \"kubernetes.io/projected/df9e42bc-c4a2-4ccc-ad85-5ca077abfd88-kube-api-access-4p8sw\") on node \"crc\" DevicePath \"\""
Mar 13 15:08:06 crc kubenswrapper[4898]: I0313 15:08:06.120313 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556908-vplt7" event={"ID":"df9e42bc-c4a2-4ccc-ad85-5ca077abfd88","Type":"ContainerDied","Data":"b3b0e247f2a8ff2707210a72a28118243a8690048dfda2f2b07df45c3e2a2ff4"}
Mar 13 15:08:06 crc kubenswrapper[4898]: I0313 15:08:06.120600 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3b0e247f2a8ff2707210a72a28118243a8690048dfda2f2b07df45c3e2a2ff4"
Mar 13 15:08:06 crc kubenswrapper[4898]: I0313 15:08:06.120365 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556908-vplt7"
Mar 13 15:08:06 crc kubenswrapper[4898]: I0313 15:08:06.628855 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556902-s4cg2"]
Mar 13 15:08:06 crc kubenswrapper[4898]: I0313 15:08:06.638619 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556902-s4cg2"]
Mar 13 15:08:07 crc kubenswrapper[4898]: I0313 15:08:07.136346 4898 generic.go:334] "Generic (PLEG): container finished" podID="4efa3f00-c382-4542-b865-48ff26f025ca" containerID="ee44e53cd73fb17cb1edb64dad8d39faf007a6f2df230042e5915e9f66b123b3" exitCode=0
Mar 13 15:08:07 crc kubenswrapper[4898]: I0313 15:08:07.136653 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-44hx4" event={"ID":"4efa3f00-c382-4542-b865-48ff26f025ca","Type":"ContainerDied","Data":"ee44e53cd73fb17cb1edb64dad8d39faf007a6f2df230042e5915e9f66b123b3"}
Mar 13 15:08:07 crc kubenswrapper[4898]: I0313 15:08:07.766094 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d9d3379-4b7b-4263-aec1-10c06dc087e6" path="/var/lib/kubelet/pods/6d9d3379-4b7b-4263-aec1-10c06dc087e6/volumes"
Mar 13 15:08:08 crc kubenswrapper[4898]: I0313 15:08:08.162100 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-44hx4" event={"ID":"4efa3f00-c382-4542-b865-48ff26f025ca","Type":"ContainerStarted","Data":"9778dc3a48d82eaf089015870cffa7f0eb568e8ab0d5030f6c4a2425e91df1fe"}
Mar 13 15:08:08 crc kubenswrapper[4898]: I0313 15:08:08.199827 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-44hx4" podStartSLOduration=2.6529622330000002 podStartE2EDuration="6.199800517s" podCreationTimestamp="2026-03-13 15:08:02 +0000 UTC" firstStartedPulling="2026-03-13 15:08:04.087385007 +0000 UTC m=+4319.088973246" lastFinishedPulling="2026-03-13 15:08:07.634223261 +0000 UTC m=+4322.635811530" observedRunningTime="2026-03-13 15:08:08.191319628 +0000 UTC m=+4323.192907887" watchObservedRunningTime="2026-03-13 15:08:08.199800517 +0000 UTC m=+4323.201388786"
Mar 13 15:08:12 crc kubenswrapper[4898]: I0313 15:08:12.615154 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-44hx4"
Mar 13 15:08:12 crc kubenswrapper[4898]: I0313 15:08:12.617039 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-44hx4"
Mar 13 15:08:12 crc kubenswrapper[4898]: I0313 15:08:12.706695 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-44hx4"
Mar 13 15:08:13 crc kubenswrapper[4898]: I0313 15:08:13.318575 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-44hx4"
Mar 13 15:08:13 crc kubenswrapper[4898]: I0313 15:08:13.408435 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-44hx4"]
Mar 13 15:08:15 crc kubenswrapper[4898]: I0313 15:08:15.283270 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-44hx4" podUID="4efa3f00-c382-4542-b865-48ff26f025ca" containerName="registry-server" containerID="cri-o://9778dc3a48d82eaf089015870cffa7f0eb568e8ab0d5030f6c4a2425e91df1fe" gracePeriod=2
Mar 13 15:08:15 crc kubenswrapper[4898]: I0313 15:08:15.881769 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-44hx4"
Mar 13 15:08:15 crc kubenswrapper[4898]: I0313 15:08:15.999605 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjkvv\" (UniqueName: \"kubernetes.io/projected/4efa3f00-c382-4542-b865-48ff26f025ca-kube-api-access-sjkvv\") pod \"4efa3f00-c382-4542-b865-48ff26f025ca\" (UID: \"4efa3f00-c382-4542-b865-48ff26f025ca\") "
Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:15.999979 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4efa3f00-c382-4542-b865-48ff26f025ca-catalog-content\") pod \"4efa3f00-c382-4542-b865-48ff26f025ca\" (UID: \"4efa3f00-c382-4542-b865-48ff26f025ca\") "
Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.000159 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4efa3f00-c382-4542-b865-48ff26f025ca-utilities\") pod \"4efa3f00-c382-4542-b865-48ff26f025ca\" (UID: \"4efa3f00-c382-4542-b865-48ff26f025ca\") "
Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.000938 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4efa3f00-c382-4542-b865-48ff26f025ca-utilities" (OuterVolumeSpecName: "utilities") pod "4efa3f00-c382-4542-b865-48ff26f025ca" (UID: "4efa3f00-c382-4542-b865-48ff26f025ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.021737 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4efa3f00-c382-4542-b865-48ff26f025ca-kube-api-access-sjkvv" (OuterVolumeSpecName: "kube-api-access-sjkvv") pod "4efa3f00-c382-4542-b865-48ff26f025ca" (UID: "4efa3f00-c382-4542-b865-48ff26f025ca"). InnerVolumeSpecName "kube-api-access-sjkvv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.082585 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4efa3f00-c382-4542-b865-48ff26f025ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4efa3f00-c382-4542-b865-48ff26f025ca" (UID: "4efa3f00-c382-4542-b865-48ff26f025ca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.108065 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4efa3f00-c382-4542-b865-48ff26f025ca-utilities\") on node \"crc\" DevicePath \"\""
Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.108118 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjkvv\" (UniqueName: \"kubernetes.io/projected/4efa3f00-c382-4542-b865-48ff26f025ca-kube-api-access-sjkvv\") on node \"crc\" DevicePath \"\""
Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.108140 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4efa3f00-c382-4542-b865-48ff26f025ca-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.300070 4898 generic.go:334] "Generic (PLEG): container finished" podID="4efa3f00-c382-4542-b865-48ff26f025ca" containerID="9778dc3a48d82eaf089015870cffa7f0eb568e8ab0d5030f6c4a2425e91df1fe" exitCode=0
Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.300148 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-44hx4" event={"ID":"4efa3f00-c382-4542-b865-48ff26f025ca","Type":"ContainerDied","Data":"9778dc3a48d82eaf089015870cffa7f0eb568e8ab0d5030f6c4a2425e91df1fe"}
Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.300274 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-44hx4"
Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.300318 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-44hx4" event={"ID":"4efa3f00-c382-4542-b865-48ff26f025ca","Type":"ContainerDied","Data":"5640eeafae0e26e70bf1dca59cc0e1213919ac82acbc06ec1ff1facea138314b"}
Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.300340 4898 scope.go:117] "RemoveContainer" containerID="9778dc3a48d82eaf089015870cffa7f0eb568e8ab0d5030f6c4a2425e91df1fe"
Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.337047 4898 scope.go:117] "RemoveContainer" containerID="ee44e53cd73fb17cb1edb64dad8d39faf007a6f2df230042e5915e9f66b123b3"
Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.358621 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-44hx4"]
Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.364549 4898 scope.go:117] "RemoveContainer" containerID="03c63c680d08b74688a5886d67a5bd28ade094556ecdf764bb1054af23202d0e"
Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.374453 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-44hx4"]
Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.443341 4898 scope.go:117] "RemoveContainer" containerID="9778dc3a48d82eaf089015870cffa7f0eb568e8ab0d5030f6c4a2425e91df1fe"
Mar 13 15:08:16 crc kubenswrapper[4898]: E0313 15:08:16.444088 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9778dc3a48d82eaf089015870cffa7f0eb568e8ab0d5030f6c4a2425e91df1fe\": container with ID starting with 9778dc3a48d82eaf089015870cffa7f0eb568e8ab0d5030f6c4a2425e91df1fe not found: ID does not exist" containerID="9778dc3a48d82eaf089015870cffa7f0eb568e8ab0d5030f6c4a2425e91df1fe"
Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.444128 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9778dc3a48d82eaf089015870cffa7f0eb568e8ab0d5030f6c4a2425e91df1fe"} err="failed to get container status \"9778dc3a48d82eaf089015870cffa7f0eb568e8ab0d5030f6c4a2425e91df1fe\": rpc error: code = NotFound desc = could not find container \"9778dc3a48d82eaf089015870cffa7f0eb568e8ab0d5030f6c4a2425e91df1fe\": container with ID starting with 9778dc3a48d82eaf089015870cffa7f0eb568e8ab0d5030f6c4a2425e91df1fe not found: ID does not exist"
Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.444154 4898 scope.go:117] "RemoveContainer" containerID="ee44e53cd73fb17cb1edb64dad8d39faf007a6f2df230042e5915e9f66b123b3"
Mar 13 15:08:16 crc kubenswrapper[4898]: E0313 15:08:16.444583 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee44e53cd73fb17cb1edb64dad8d39faf007a6f2df230042e5915e9f66b123b3\": container with ID starting with ee44e53cd73fb17cb1edb64dad8d39faf007a6f2df230042e5915e9f66b123b3 not found: ID does not exist" containerID="ee44e53cd73fb17cb1edb64dad8d39faf007a6f2df230042e5915e9f66b123b3"
Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.444606 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee44e53cd73fb17cb1edb64dad8d39faf007a6f2df230042e5915e9f66b123b3"} err="failed to get container status \"ee44e53cd73fb17cb1edb64dad8d39faf007a6f2df230042e5915e9f66b123b3\": rpc error: code = NotFound desc = could not find container \"ee44e53cd73fb17cb1edb64dad8d39faf007a6f2df230042e5915e9f66b123b3\": container with ID starting with ee44e53cd73fb17cb1edb64dad8d39faf007a6f2df230042e5915e9f66b123b3 not found: ID does not exist"
Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.444619 4898 scope.go:117] "RemoveContainer" containerID="03c63c680d08b74688a5886d67a5bd28ade094556ecdf764bb1054af23202d0e"
Mar 13 15:08:16 crc kubenswrapper[4898]: E0313 15:08:16.445037 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03c63c680d08b74688a5886d67a5bd28ade094556ecdf764bb1054af23202d0e\": container with ID starting with 03c63c680d08b74688a5886d67a5bd28ade094556ecdf764bb1054af23202d0e not found: ID does not exist" containerID="03c63c680d08b74688a5886d67a5bd28ade094556ecdf764bb1054af23202d0e"
Mar 13 15:08:16 crc kubenswrapper[4898]: I0313 15:08:16.445070 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03c63c680d08b74688a5886d67a5bd28ade094556ecdf764bb1054af23202d0e"} err="failed to get container status \"03c63c680d08b74688a5886d67a5bd28ade094556ecdf764bb1054af23202d0e\": rpc error: code = NotFound desc = could not find container \"03c63c680d08b74688a5886d67a5bd28ade094556ecdf764bb1054af23202d0e\": container with ID starting with 03c63c680d08b74688a5886d67a5bd28ade094556ecdf764bb1054af23202d0e not found: ID does not exist"
Mar 13 15:08:17 crc kubenswrapper[4898]: I0313 15:08:17.762856 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4efa3f00-c382-4542-b865-48ff26f025ca" path="/var/lib/kubelet/pods/4efa3f00-c382-4542-b865-48ff26f025ca/volumes"
Mar 13 15:08:19 crc kubenswrapper[4898]: I0313 15:08:19.134438 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 15:08:19 crc kubenswrapper[4898]: I0313 15:08:19.134866 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 15:08:43 crc kubenswrapper[4898]: I0313 15:08:43.263797 4898 scope.go:117] "RemoveContainer" containerID="e906ab6657ba9367ad2d982719bc7c2ae3c8f44a0ea2bd20f34b397b25a0a265"
Mar 13 15:08:49 crc kubenswrapper[4898]: I0313 15:08:49.134680 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 13 15:08:49 crc kubenswrapper[4898]: I0313 15:08:49.135332 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 13 15:08:49 crc kubenswrapper[4898]: I0313 15:08:49.135395 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj"
Mar 13 15:08:49 crc kubenswrapper[4898]: I0313 15:08:49.136548 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a"} pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 13 15:08:49 crc kubenswrapper[4898]: I0313 15:08:49.136650 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" containerID="cri-o://23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" gracePeriod=600
Mar 13 15:08:49 crc kubenswrapper[4898]: E0313 15:08:49.270185 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d"
Mar 13 15:08:49 crc kubenswrapper[4898]: I0313 15:08:49.769864 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerDied","Data":"23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a"}
Mar 13 15:08:49 crc kubenswrapper[4898]: I0313 15:08:49.771209 4898 scope.go:117] "RemoveContainer" containerID="45a214eae7c7c7859e650d708f47b93bb385cc743f9e3e627bde30ab582e5ff5"
Mar 13 15:08:49 crc kubenswrapper[4898]: I0313 15:08:49.771649 4898 generic.go:334] "Generic (PLEG): container finished" podID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" exitCode=0
Mar 13 15:08:49 crc kubenswrapper[4898]: I0313 15:08:49.772111 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a"
Mar 13 15:08:49 crc kubenswrapper[4898]: E0313 15:08:49.773312 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d"
Mar 13 15:09:03 crc kubenswrapper[4898]: I0313 15:09:03.739823 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a"
Mar 13 15:09:03 crc kubenswrapper[4898]: E0313 15:09:03.740836 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d"
Mar 13 15:09:14 crc kubenswrapper[4898]: I0313 15:09:14.739858 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a"
Mar 13 15:09:14 crc kubenswrapper[4898]: E0313 15:09:14.740881 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d"
Mar 13 15:09:25 crc kubenswrapper[4898]: I0313 15:09:25.751968 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a"
Mar 13 15:09:25 crc kubenswrapper[4898]: E0313 15:09:25.752979 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d"
Mar 13 15:09:38 crc kubenswrapper[4898]: I0313 15:09:38.740005 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a"
Mar 13 15:09:38 crc kubenswrapper[4898]: E0313 15:09:38.741398 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d"
Mar 13 15:09:50 crc kubenswrapper[4898]: I0313 15:09:50.740364 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a"
Mar 13 15:09:50 crc kubenswrapper[4898]: E0313 15:09:50.742672 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d"
Mar 13 15:10:00 crc kubenswrapper[4898]: I0313 15:10:00.149494 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556910-lcc5d"]
Mar 13 15:10:00 crc kubenswrapper[4898]: E0313 15:10:00.150539 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4efa3f00-c382-4542-b865-48ff26f025ca" containerName="registry-server"
Mar 13 15:10:00 crc kubenswrapper[4898]: I0313 15:10:00.150554 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="4efa3f00-c382-4542-b865-48ff26f025ca" containerName="registry-server"
Mar 13 15:10:00 crc kubenswrapper[4898]: E0313 15:10:00.150577 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4efa3f00-c382-4542-b865-48ff26f025ca" containerName="extract-content"
Mar 13 15:10:00 crc kubenswrapper[4898]: I0313 15:10:00.150583 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="4efa3f00-c382-4542-b865-48ff26f025ca" containerName="extract-content"
Mar 13 15:10:00 crc kubenswrapper[4898]: E0313 15:10:00.150625 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4efa3f00-c382-4542-b865-48ff26f025ca" containerName="extract-utilities"
Mar 13 15:10:00 crc kubenswrapper[4898]: I0313 15:10:00.150632 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="4efa3f00-c382-4542-b865-48ff26f025ca" containerName="extract-utilities"
Mar 13 15:10:00 crc kubenswrapper[4898]: E0313 15:10:00.150639 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df9e42bc-c4a2-4ccc-ad85-5ca077abfd88" containerName="oc"
Mar 13 15:10:00 crc kubenswrapper[4898]: I0313 15:10:00.150644 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="df9e42bc-c4a2-4ccc-ad85-5ca077abfd88" containerName="oc"
Mar 13 15:10:00 crc kubenswrapper[4898]: I0313 15:10:00.150849 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="4efa3f00-c382-4542-b865-48ff26f025ca" containerName="registry-server"
Mar 13 15:10:00 crc kubenswrapper[4898]: I0313 15:10:00.150867 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="df9e42bc-c4a2-4ccc-ad85-5ca077abfd88" containerName="oc"
Mar 13 15:10:00 crc kubenswrapper[4898]: I0313 15:10:00.151693 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556910-lcc5d"
Mar 13 15:10:00 crc kubenswrapper[4898]: I0313 15:10:00.154129 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 13 15:10:00 crc kubenswrapper[4898]: I0313 15:10:00.154529 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps"
Mar 13 15:10:00 crc kubenswrapper[4898]: I0313 15:10:00.160066 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 13 15:10:00 crc kubenswrapper[4898]: I0313 15:10:00.162281 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556910-lcc5d"]
Mar 13 15:10:00 crc kubenswrapper[4898]: I0313 15:10:00.258363 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqhd4\" (UniqueName: \"kubernetes.io/projected/0a36f55a-ce22-4339-967f-906f473ddad5-kube-api-access-zqhd4\") pod \"auto-csr-approver-29556910-lcc5d\" (UID: \"0a36f55a-ce22-4339-967f-906f473ddad5\") " pod="openshift-infra/auto-csr-approver-29556910-lcc5d"
Mar 13 15:10:00 crc kubenswrapper[4898]: I0313 15:10:00.360995 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqhd4\" (UniqueName: \"kubernetes.io/projected/0a36f55a-ce22-4339-967f-906f473ddad5-kube-api-access-zqhd4\") pod \"auto-csr-approver-29556910-lcc5d\" (UID: \"0a36f55a-ce22-4339-967f-906f473ddad5\") " pod="openshift-infra/auto-csr-approver-29556910-lcc5d"
Mar 13 15:10:00 crc kubenswrapper[4898]: I0313 15:10:00.381725 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqhd4\" (UniqueName: \"kubernetes.io/projected/0a36f55a-ce22-4339-967f-906f473ddad5-kube-api-access-zqhd4\") pod \"auto-csr-approver-29556910-lcc5d\" (UID: \"0a36f55a-ce22-4339-967f-906f473ddad5\") " pod="openshift-infra/auto-csr-approver-29556910-lcc5d"
Mar 13 15:10:00 crc kubenswrapper[4898]: I0313 15:10:00.470017 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556910-lcc5d"
Mar 13 15:10:00 crc kubenswrapper[4898]: I0313 15:10:00.970628 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556910-lcc5d"]
Mar 13 15:10:01 crc kubenswrapper[4898]: I0313 15:10:01.699577 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556910-lcc5d" event={"ID":"0a36f55a-ce22-4339-967f-906f473ddad5","Type":"ContainerStarted","Data":"64be457f6495c2066ecea207638f74ac02dc99b45865b08dd645709fbe9adcb6"}
Mar 13 15:10:02 crc kubenswrapper[4898]: I0313 15:10:02.740558 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a"
Mar 13 15:10:02 crc kubenswrapper[4898]: E0313 15:10:02.741501 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d"
Mar 13 15:10:03 crc kubenswrapper[4898]: I0313 15:10:03.727608 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556910-lcc5d" event={"ID":"0a36f55a-ce22-4339-967f-906f473ddad5","Type":"ContainerStarted","Data":"ba349dae26dd37d5b178e79f9ed4076346dab25c90569fb520b86b35b588e387"}
Mar 13 15:10:03 crc kubenswrapper[4898]: I0313 15:10:03.748652 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556910-lcc5d" podStartSLOduration=2.169812902 podStartE2EDuration="3.748627989s" podCreationTimestamp="2026-03-13 15:10:00 +0000 UTC" firstStartedPulling="2026-03-13 15:10:00.979025317 +0000 UTC m=+4435.980613556" lastFinishedPulling="2026-03-13 15:10:02.557840404 +0000 UTC m=+4437.559428643" observedRunningTime="2026-03-13 15:10:03.744684561 +0000 UTC m=+4438.746272800" watchObservedRunningTime="2026-03-13 15:10:03.748627989 +0000 UTC m=+4438.750216238"
Mar 13 15:10:04 crc kubenswrapper[4898]: I0313 15:10:04.737244 4898 generic.go:334] "Generic (PLEG): container finished" podID="0a36f55a-ce22-4339-967f-906f473ddad5" containerID="ba349dae26dd37d5b178e79f9ed4076346dab25c90569fb520b86b35b588e387" exitCode=0
Mar 13 15:10:04 crc kubenswrapper[4898]: I0313 15:10:04.737289 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556910-lcc5d" event={"ID":"0a36f55a-ce22-4339-967f-906f473ddad5","Type":"ContainerDied","Data":"ba349dae26dd37d5b178e79f9ed4076346dab25c90569fb520b86b35b588e387"}
Mar 13 15:10:06 crc kubenswrapper[4898]: I0313 15:10:06.170455 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556910-lcc5d"
Mar 13 15:10:06 crc kubenswrapper[4898]: I0313 15:10:06.203979 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqhd4\" (UniqueName: \"kubernetes.io/projected/0a36f55a-ce22-4339-967f-906f473ddad5-kube-api-access-zqhd4\") pod \"0a36f55a-ce22-4339-967f-906f473ddad5\" (UID: \"0a36f55a-ce22-4339-967f-906f473ddad5\") "
Mar 13 15:10:06 crc kubenswrapper[4898]: I0313 15:10:06.210004 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a36f55a-ce22-4339-967f-906f473ddad5-kube-api-access-zqhd4" (OuterVolumeSpecName: "kube-api-access-zqhd4") pod "0a36f55a-ce22-4339-967f-906f473ddad5" (UID: "0a36f55a-ce22-4339-967f-906f473ddad5"). InnerVolumeSpecName "kube-api-access-zqhd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 15:10:06 crc kubenswrapper[4898]: I0313 15:10:06.308293 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqhd4\" (UniqueName: \"kubernetes.io/projected/0a36f55a-ce22-4339-967f-906f473ddad5-kube-api-access-zqhd4\") on node \"crc\" DevicePath \"\""
Mar 13 15:10:06 crc kubenswrapper[4898]: I0313 15:10:06.771017 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556910-lcc5d" event={"ID":"0a36f55a-ce22-4339-967f-906f473ddad5","Type":"ContainerDied","Data":"64be457f6495c2066ecea207638f74ac02dc99b45865b08dd645709fbe9adcb6"}
Mar 13 15:10:06 crc kubenswrapper[4898]: I0313 15:10:06.771058 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64be457f6495c2066ecea207638f74ac02dc99b45865b08dd645709fbe9adcb6"
Mar 13 15:10:06 crc kubenswrapper[4898]: I0313 15:10:06.771127 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556910-lcc5d"
Mar 13 15:10:06 crc kubenswrapper[4898]: I0313 15:10:06.843883 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556904-69dtq"]
Mar 13 15:10:06 crc kubenswrapper[4898]: I0313 15:10:06.854388 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556904-69dtq"]
Mar 13 15:10:07 crc kubenswrapper[4898]: I0313 15:10:07.753465 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0" path="/var/lib/kubelet/pods/a124d846-aa21-4e8a-bb0e-57cc2aa7a3e0/volumes"
Mar 13 15:10:15 crc kubenswrapper[4898]: I0313 15:10:15.748141 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a"
Mar 13 15:10:15 crc kubenswrapper[4898]: E0313 15:10:15.749123 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d"
Mar 13 15:10:23 crc kubenswrapper[4898]: I0313 15:10:23.701961 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dpqz2"]
Mar 13 15:10:23 crc kubenswrapper[4898]: E0313 15:10:23.703402 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a36f55a-ce22-4339-967f-906f473ddad5" containerName="oc"
Mar 13 15:10:23 crc kubenswrapper[4898]: I0313 15:10:23.703426 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a36f55a-ce22-4339-967f-906f473ddad5" containerName="oc"
Mar 13 15:10:23 crc kubenswrapper[4898]: I0313 15:10:23.703811 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a36f55a-ce22-4339-967f-906f473ddad5" containerName="oc"
Mar 13 15:10:23 crc kubenswrapper[4898]: I0313 15:10:23.706619 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dpqz2"
Mar 13 15:10:23 crc kubenswrapper[4898]: I0313 15:10:23.712551 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dpqz2"]
Mar 13 15:10:23 crc kubenswrapper[4898]: I0313 15:10:23.860229 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ae9befa-ba34-402d-9c68-4aad13ad380a-catalog-content\") pod \"redhat-marketplace-dpqz2\" (UID: \"6ae9befa-ba34-402d-9c68-4aad13ad380a\") " pod="openshift-marketplace/redhat-marketplace-dpqz2"
Mar 13 15:10:23 crc kubenswrapper[4898]: I0313 15:10:23.860275 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcvz8\" (UniqueName: \"kubernetes.io/projected/6ae9befa-ba34-402d-9c68-4aad13ad380a-kube-api-access-wcvz8\") pod \"redhat-marketplace-dpqz2\" (UID: \"6ae9befa-ba34-402d-9c68-4aad13ad380a\") " pod="openshift-marketplace/redhat-marketplace-dpqz2"
Mar 13 15:10:23 crc kubenswrapper[4898]: I0313 15:10:23.860355 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ae9befa-ba34-402d-9c68-4aad13ad380a-utilities\") pod \"redhat-marketplace-dpqz2\" (UID: \"6ae9befa-ba34-402d-9c68-4aad13ad380a\") " pod="openshift-marketplace/redhat-marketplace-dpqz2"
Mar 13 15:10:23 crc kubenswrapper[4898]: I0313 15:10:23.962251 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ae9befa-ba34-402d-9c68-4aad13ad380a-utilities\") pod \"redhat-marketplace-dpqz2\" (UID: \"6ae9befa-ba34-402d-9c68-4aad13ad380a\") " pod="openshift-marketplace/redhat-marketplace-dpqz2"
Mar 13 15:10:23 crc kubenswrapper[4898]: I0313 15:10:23.962452 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ae9befa-ba34-402d-9c68-4aad13ad380a-catalog-content\") pod \"redhat-marketplace-dpqz2\" (UID: \"6ae9befa-ba34-402d-9c68-4aad13ad380a\") " pod="openshift-marketplace/redhat-marketplace-dpqz2"
Mar 13 15:10:23 crc kubenswrapper[4898]: I0313 15:10:23.962481 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcvz8\" (UniqueName: \"kubernetes.io/projected/6ae9befa-ba34-402d-9c68-4aad13ad380a-kube-api-access-wcvz8\") pod \"redhat-marketplace-dpqz2\" (UID: \"6ae9befa-ba34-402d-9c68-4aad13ad380a\") " pod="openshift-marketplace/redhat-marketplace-dpqz2"
Mar 13 15:10:23 crc kubenswrapper[4898]: I0313 15:10:23.963266 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ae9befa-ba34-402d-9c68-4aad13ad380a-utilities\") pod \"redhat-marketplace-dpqz2\" (UID: \"6ae9befa-ba34-402d-9c68-4aad13ad380a\") " pod="openshift-marketplace/redhat-marketplace-dpqz2"
Mar 13 15:10:23 crc kubenswrapper[4898]: I0313 15:10:23.963545 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ae9befa-ba34-402d-9c68-4aad13ad380a-catalog-content\") pod \"redhat-marketplace-dpqz2\" (UID: \"6ae9befa-ba34-402d-9c68-4aad13ad380a\") " pod="openshift-marketplace/redhat-marketplace-dpqz2"
Mar 13 15:10:23 crc kubenswrapper[4898]: I0313 15:10:23.987853 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcvz8\" (UniqueName: \"kubernetes.io/projected/6ae9befa-ba34-402d-9c68-4aad13ad380a-kube-api-access-wcvz8\") pod \"redhat-marketplace-dpqz2\" (UID: \"6ae9befa-ba34-402d-9c68-4aad13ad380a\") " pod="openshift-marketplace/redhat-marketplace-dpqz2"
Mar 13 15:10:24 crc kubenswrapper[4898]: I0313 15:10:24.035674 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dpqz2"
Mar 13 15:10:24 crc kubenswrapper[4898]: I0313 15:10:24.552225 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dpqz2"]
Mar 13 15:10:25 crc kubenswrapper[4898]: I0313 15:10:25.002960 4898 generic.go:334] "Generic (PLEG): container finished" podID="6ae9befa-ba34-402d-9c68-4aad13ad380a" containerID="26a0efcf86b49360d3ca0f6db51f8be8241064695e81797fdbbea93b417ae346" exitCode=0
Mar 13 15:10:25 crc kubenswrapper[4898]: I0313 15:10:25.003271 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dpqz2" event={"ID":"6ae9befa-ba34-402d-9c68-4aad13ad380a","Type":"ContainerDied","Data":"26a0efcf86b49360d3ca0f6db51f8be8241064695e81797fdbbea93b417ae346"}
Mar 13 15:10:25 crc kubenswrapper[4898]: I0313 15:10:25.003300 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dpqz2" event={"ID":"6ae9befa-ba34-402d-9c68-4aad13ad380a","Type":"ContainerStarted","Data":"6d58b2a80b4f25b60cebd9b2997bdbd7ee5cfa0f69c0a56fd8d0e8800a868b77"}
Mar 13 15:10:25 crc kubenswrapper[4898]: I0313 15:10:25.005197 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 13 15:10:26 crc kubenswrapper[4898]: I0313 15:10:26.017382 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dpqz2" event={"ID":"6ae9befa-ba34-402d-9c68-4aad13ad380a","Type":"ContainerStarted","Data":"1131b0202620c67ed5c6fc2a5f10369017ac4f0c291aa0a69276d20706cb8627"}
Mar 13 15:10:27 crc kubenswrapper[4898]: I0313 15:10:27.032079 4898 generic.go:334] "Generic (PLEG): container finished" podID="6ae9befa-ba34-402d-9c68-4aad13ad380a" containerID="1131b0202620c67ed5c6fc2a5f10369017ac4f0c291aa0a69276d20706cb8627" exitCode=0
Mar 13 15:10:27 crc kubenswrapper[4898]: I0313 15:10:27.032376 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dpqz2" event={"ID":"6ae9befa-ba34-402d-9c68-4aad13ad380a","Type":"ContainerDied","Data":"1131b0202620c67ed5c6fc2a5f10369017ac4f0c291aa0a69276d20706cb8627"}
Mar 13 15:10:29 crc kubenswrapper[4898]: I0313 15:10:29.055961 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dpqz2" event={"ID":"6ae9befa-ba34-402d-9c68-4aad13ad380a","Type":"ContainerStarted","Data":"ef58acf8d4ae6204788d3fdc59aecb3462e5ceb48474e990c3e92ca73408a0f8"}
Mar 13 15:10:29 crc kubenswrapper[4898]: I0313 15:10:29.080944 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dpqz2" podStartSLOduration=3.608435736 podStartE2EDuration="6.08092118s" podCreationTimestamp="2026-03-13 15:10:23 +0000 UTC" firstStartedPulling="2026-03-13 15:10:25.004951201 +0000 UTC m=+4460.006539440" lastFinishedPulling="2026-03-13 15:10:27.477436615 +0000 UTC m=+4462.479024884" observedRunningTime="2026-03-13 15:10:29.070617386 +0000 UTC m=+4464.072205645" watchObservedRunningTime="2026-03-13 15:10:29.08092118 +0000 UTC m=+4464.082509419"
Mar 13 15:10:30 crc kubenswrapper[4898]: I0313 15:10:30.739834 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a"
Mar 13 15:10:30 crc kubenswrapper[4898]: E0313 15:10:30.740444 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d"
Mar 13 15:10:34 crc kubenswrapper[4898]: I0313 15:10:34.045636 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-dpqz2" Mar 13 15:10:34 crc kubenswrapper[4898]: I0313 15:10:34.046096 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dpqz2" Mar 13 15:10:34 crc kubenswrapper[4898]: I0313 15:10:34.103917 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dpqz2" Mar 13 15:10:34 crc kubenswrapper[4898]: I0313 15:10:34.170429 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dpqz2" Mar 13 15:10:34 crc kubenswrapper[4898]: I0313 15:10:34.675004 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dpqz2"] Mar 13 15:10:36 crc kubenswrapper[4898]: I0313 15:10:36.131435 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dpqz2" podUID="6ae9befa-ba34-402d-9c68-4aad13ad380a" containerName="registry-server" containerID="cri-o://ef58acf8d4ae6204788d3fdc59aecb3462e5ceb48474e990c3e92ca73408a0f8" gracePeriod=2 Mar 13 15:10:37 crc kubenswrapper[4898]: I0313 15:10:37.144642 4898 generic.go:334] "Generic (PLEG): container finished" podID="6ae9befa-ba34-402d-9c68-4aad13ad380a" containerID="ef58acf8d4ae6204788d3fdc59aecb3462e5ceb48474e990c3e92ca73408a0f8" exitCode=0 Mar 13 15:10:37 crc kubenswrapper[4898]: I0313 15:10:37.144870 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dpqz2" event={"ID":"6ae9befa-ba34-402d-9c68-4aad13ad380a","Type":"ContainerDied","Data":"ef58acf8d4ae6204788d3fdc59aecb3462e5ceb48474e990c3e92ca73408a0f8"} Mar 13 15:10:37 crc kubenswrapper[4898]: I0313 15:10:37.144958 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dpqz2" 
event={"ID":"6ae9befa-ba34-402d-9c68-4aad13ad380a","Type":"ContainerDied","Data":"6d58b2a80b4f25b60cebd9b2997bdbd7ee5cfa0f69c0a56fd8d0e8800a868b77"} Mar 13 15:10:37 crc kubenswrapper[4898]: I0313 15:10:37.144979 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d58b2a80b4f25b60cebd9b2997bdbd7ee5cfa0f69c0a56fd8d0e8800a868b77" Mar 13 15:10:37 crc kubenswrapper[4898]: I0313 15:10:37.177661 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dpqz2" Mar 13 15:10:37 crc kubenswrapper[4898]: I0313 15:10:37.287964 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ae9befa-ba34-402d-9c68-4aad13ad380a-catalog-content\") pod \"6ae9befa-ba34-402d-9c68-4aad13ad380a\" (UID: \"6ae9befa-ba34-402d-9c68-4aad13ad380a\") " Mar 13 15:10:37 crc kubenswrapper[4898]: I0313 15:10:37.288284 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ae9befa-ba34-402d-9c68-4aad13ad380a-utilities\") pod \"6ae9befa-ba34-402d-9c68-4aad13ad380a\" (UID: \"6ae9befa-ba34-402d-9c68-4aad13ad380a\") " Mar 13 15:10:37 crc kubenswrapper[4898]: I0313 15:10:37.288428 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcvz8\" (UniqueName: \"kubernetes.io/projected/6ae9befa-ba34-402d-9c68-4aad13ad380a-kube-api-access-wcvz8\") pod \"6ae9befa-ba34-402d-9c68-4aad13ad380a\" (UID: \"6ae9befa-ba34-402d-9c68-4aad13ad380a\") " Mar 13 15:10:37 crc kubenswrapper[4898]: I0313 15:10:37.289667 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ae9befa-ba34-402d-9c68-4aad13ad380a-utilities" (OuterVolumeSpecName: "utilities") pod "6ae9befa-ba34-402d-9c68-4aad13ad380a" (UID: "6ae9befa-ba34-402d-9c68-4aad13ad380a"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:10:37 crc kubenswrapper[4898]: I0313 15:10:37.300520 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ae9befa-ba34-402d-9c68-4aad13ad380a-kube-api-access-wcvz8" (OuterVolumeSpecName: "kube-api-access-wcvz8") pod "6ae9befa-ba34-402d-9c68-4aad13ad380a" (UID: "6ae9befa-ba34-402d-9c68-4aad13ad380a"). InnerVolumeSpecName "kube-api-access-wcvz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:10:37 crc kubenswrapper[4898]: I0313 15:10:37.312522 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ae9befa-ba34-402d-9c68-4aad13ad380a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6ae9befa-ba34-402d-9c68-4aad13ad380a" (UID: "6ae9befa-ba34-402d-9c68-4aad13ad380a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:10:37 crc kubenswrapper[4898]: I0313 15:10:37.391026 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ae9befa-ba34-402d-9c68-4aad13ad380a-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 15:10:37 crc kubenswrapper[4898]: I0313 15:10:37.391073 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcvz8\" (UniqueName: \"kubernetes.io/projected/6ae9befa-ba34-402d-9c68-4aad13ad380a-kube-api-access-wcvz8\") on node \"crc\" DevicePath \"\"" Mar 13 15:10:37 crc kubenswrapper[4898]: I0313 15:10:37.391090 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ae9befa-ba34-402d-9c68-4aad13ad380a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 15:10:38 crc kubenswrapper[4898]: I0313 15:10:38.155036 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dpqz2" Mar 13 15:10:38 crc kubenswrapper[4898]: I0313 15:10:38.184849 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dpqz2"] Mar 13 15:10:38 crc kubenswrapper[4898]: I0313 15:10:38.197365 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dpqz2"] Mar 13 15:10:39 crc kubenswrapper[4898]: I0313 15:10:39.759990 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ae9befa-ba34-402d-9c68-4aad13ad380a" path="/var/lib/kubelet/pods/6ae9befa-ba34-402d-9c68-4aad13ad380a/volumes" Mar 13 15:10:43 crc kubenswrapper[4898]: I0313 15:10:43.422973 4898 scope.go:117] "RemoveContainer" containerID="cd5d0aba2324832ebc654e623ddb6b3b9935920ad75023c9232ebc2ff78ae2d3" Mar 13 15:10:45 crc kubenswrapper[4898]: I0313 15:10:45.761088 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" Mar 13 15:10:45 crc kubenswrapper[4898]: E0313 15:10:45.762415 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:11:00 crc kubenswrapper[4898]: I0313 15:11:00.739251 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" Mar 13 15:11:00 crc kubenswrapper[4898]: E0313 15:11:00.740005 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:11:11 crc kubenswrapper[4898]: I0313 15:11:11.739455 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" Mar 13 15:11:11 crc kubenswrapper[4898]: E0313 15:11:11.740180 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:11:24 crc kubenswrapper[4898]: I0313 15:11:24.741618 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" Mar 13 15:11:24 crc kubenswrapper[4898]: E0313 15:11:24.743103 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:11:36 crc kubenswrapper[4898]: I0313 15:11:36.739419 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" Mar 13 15:11:36 crc kubenswrapper[4898]: E0313 15:11:36.740326 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:11:51 crc kubenswrapper[4898]: I0313 15:11:51.743604 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" Mar 13 15:11:51 crc kubenswrapper[4898]: E0313 15:11:51.744578 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:12:00 crc kubenswrapper[4898]: I0313 15:12:00.154356 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556912-5fmmb"] Mar 13 15:12:00 crc kubenswrapper[4898]: E0313 15:12:00.155576 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ae9befa-ba34-402d-9c68-4aad13ad380a" containerName="extract-utilities" Mar 13 15:12:00 crc kubenswrapper[4898]: I0313 15:12:00.155596 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ae9befa-ba34-402d-9c68-4aad13ad380a" containerName="extract-utilities" Mar 13 15:12:00 crc kubenswrapper[4898]: E0313 15:12:00.155611 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ae9befa-ba34-402d-9c68-4aad13ad380a" containerName="extract-content" Mar 13 15:12:00 crc kubenswrapper[4898]: I0313 15:12:00.155620 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ae9befa-ba34-402d-9c68-4aad13ad380a" containerName="extract-content" Mar 13 15:12:00 crc kubenswrapper[4898]: E0313 15:12:00.155634 4898 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="6ae9befa-ba34-402d-9c68-4aad13ad380a" containerName="registry-server" Mar 13 15:12:00 crc kubenswrapper[4898]: I0313 15:12:00.155641 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ae9befa-ba34-402d-9c68-4aad13ad380a" containerName="registry-server" Mar 13 15:12:00 crc kubenswrapper[4898]: I0313 15:12:00.155868 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ae9befa-ba34-402d-9c68-4aad13ad380a" containerName="registry-server" Mar 13 15:12:00 crc kubenswrapper[4898]: I0313 15:12:00.156650 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556912-5fmmb" Mar 13 15:12:00 crc kubenswrapper[4898]: I0313 15:12:00.167865 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556912-5fmmb"] Mar 13 15:12:00 crc kubenswrapper[4898]: I0313 15:12:00.174130 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:12:00 crc kubenswrapper[4898]: I0313 15:12:00.174659 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 15:12:00 crc kubenswrapper[4898]: I0313 15:12:00.174949 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:12:00 crc kubenswrapper[4898]: I0313 15:12:00.208448 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs292\" (UniqueName: \"kubernetes.io/projected/ada9e0ac-777e-4e64-aade-d729b4481edf-kube-api-access-cs292\") pod \"auto-csr-approver-29556912-5fmmb\" (UID: \"ada9e0ac-777e-4e64-aade-d729b4481edf\") " pod="openshift-infra/auto-csr-approver-29556912-5fmmb" Mar 13 15:12:00 crc kubenswrapper[4898]: I0313 15:12:00.313361 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cs292\" (UniqueName: \"kubernetes.io/projected/ada9e0ac-777e-4e64-aade-d729b4481edf-kube-api-access-cs292\") pod \"auto-csr-approver-29556912-5fmmb\" (UID: \"ada9e0ac-777e-4e64-aade-d729b4481edf\") " pod="openshift-infra/auto-csr-approver-29556912-5fmmb" Mar 13 15:12:00 crc kubenswrapper[4898]: I0313 15:12:00.330522 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cs292\" (UniqueName: \"kubernetes.io/projected/ada9e0ac-777e-4e64-aade-d729b4481edf-kube-api-access-cs292\") pod \"auto-csr-approver-29556912-5fmmb\" (UID: \"ada9e0ac-777e-4e64-aade-d729b4481edf\") " pod="openshift-infra/auto-csr-approver-29556912-5fmmb" Mar 13 15:12:00 crc kubenswrapper[4898]: I0313 15:12:00.494803 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556912-5fmmb" Mar 13 15:12:01 crc kubenswrapper[4898]: I0313 15:12:01.035382 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556912-5fmmb"] Mar 13 15:12:01 crc kubenswrapper[4898]: I0313 15:12:01.124089 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556912-5fmmb" event={"ID":"ada9e0ac-777e-4e64-aade-d729b4481edf","Type":"ContainerStarted","Data":"334b07d5a1c8b89023427a21780b8ae4851543bace4ad6dbdd28114bdfd3e287"} Mar 13 15:12:04 crc kubenswrapper[4898]: I0313 15:12:04.160000 4898 generic.go:334] "Generic (PLEG): container finished" podID="ada9e0ac-777e-4e64-aade-d729b4481edf" containerID="ac390920dd30738022bc9982651d5e7fa6b628c845272ba7744c86bf9f8444e9" exitCode=0 Mar 13 15:12:04 crc kubenswrapper[4898]: I0313 15:12:04.160116 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556912-5fmmb" event={"ID":"ada9e0ac-777e-4e64-aade-d729b4481edf","Type":"ContainerDied","Data":"ac390920dd30738022bc9982651d5e7fa6b628c845272ba7744c86bf9f8444e9"} Mar 13 15:12:05 crc kubenswrapper[4898]: I0313 
15:12:05.722104 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556912-5fmmb" Mar 13 15:12:05 crc kubenswrapper[4898]: I0313 15:12:05.754052 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" Mar 13 15:12:05 crc kubenswrapper[4898]: E0313 15:12:05.754409 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:12:05 crc kubenswrapper[4898]: I0313 15:12:05.798978 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cs292\" (UniqueName: \"kubernetes.io/projected/ada9e0ac-777e-4e64-aade-d729b4481edf-kube-api-access-cs292\") pod \"ada9e0ac-777e-4e64-aade-d729b4481edf\" (UID: \"ada9e0ac-777e-4e64-aade-d729b4481edf\") " Mar 13 15:12:05 crc kubenswrapper[4898]: I0313 15:12:05.806495 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ada9e0ac-777e-4e64-aade-d729b4481edf-kube-api-access-cs292" (OuterVolumeSpecName: "kube-api-access-cs292") pod "ada9e0ac-777e-4e64-aade-d729b4481edf" (UID: "ada9e0ac-777e-4e64-aade-d729b4481edf"). InnerVolumeSpecName "kube-api-access-cs292". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:12:05 crc kubenswrapper[4898]: I0313 15:12:05.902655 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cs292\" (UniqueName: \"kubernetes.io/projected/ada9e0ac-777e-4e64-aade-d729b4481edf-kube-api-access-cs292\") on node \"crc\" DevicePath \"\"" Mar 13 15:12:06 crc kubenswrapper[4898]: I0313 15:12:06.189312 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556912-5fmmb" event={"ID":"ada9e0ac-777e-4e64-aade-d729b4481edf","Type":"ContainerDied","Data":"334b07d5a1c8b89023427a21780b8ae4851543bace4ad6dbdd28114bdfd3e287"} Mar 13 15:12:06 crc kubenswrapper[4898]: I0313 15:12:06.189348 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="334b07d5a1c8b89023427a21780b8ae4851543bace4ad6dbdd28114bdfd3e287" Mar 13 15:12:06 crc kubenswrapper[4898]: I0313 15:12:06.189408 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556912-5fmmb" Mar 13 15:12:06 crc kubenswrapper[4898]: I0313 15:12:06.826492 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556906-m4tq8"] Mar 13 15:12:06 crc kubenswrapper[4898]: I0313 15:12:06.849070 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556906-m4tq8"] Mar 13 15:12:07 crc kubenswrapper[4898]: I0313 15:12:07.764884 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ee94077-8dd9-4144-bab5-2abd9744fa01" path="/var/lib/kubelet/pods/3ee94077-8dd9-4144-bab5-2abd9744fa01/volumes" Mar 13 15:12:16 crc kubenswrapper[4898]: I0313 15:12:16.739889 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" Mar 13 15:12:16 crc kubenswrapper[4898]: E0313 15:12:16.741153 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.787454 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 13 15:12:22 crc kubenswrapper[4898]: E0313 15:12:22.788887 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ada9e0ac-777e-4e64-aade-d729b4481edf" containerName="oc" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.788932 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ada9e0ac-777e-4e64-aade-d729b4481edf" containerName="oc" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.789202 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="ada9e0ac-777e-4e64-aade-d729b4481edf" containerName="oc" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.790220 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.792782 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-8v5gs" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.793011 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.794126 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.794159 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.823040 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.836646 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.836737 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d19e8770-f0c1-491e-96c9-f737386ab3b0-config-data\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.836770 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d19e8770-f0c1-491e-96c9-f737386ab3b0-ssh-key\") pod \"tempest-tests-tempest\" (UID: 
\"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.836876 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d19e8770-f0c1-491e-96c9-f737386ab3b0-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.837889 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d19e8770-f0c1-491e-96c9-f737386ab3b0-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.838123 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d19e8770-f0c1-491e-96c9-f737386ab3b0-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.838213 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d19e8770-f0c1-491e-96c9-f737386ab3b0-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.838263 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/secret/d19e8770-f0c1-491e-96c9-f737386ab3b0-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.838349 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqc45\" (UniqueName: \"kubernetes.io/projected/d19e8770-f0c1-491e-96c9-f737386ab3b0-kube-api-access-mqc45\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.940370 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.940431 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d19e8770-f0c1-491e-96c9-f737386ab3b0-config-data\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.940449 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d19e8770-f0c1-491e-96c9-f737386ab3b0-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.940513 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d19e8770-f0c1-491e-96c9-f737386ab3b0-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: 
\"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.940591 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d19e8770-f0c1-491e-96c9-f737386ab3b0-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.940647 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d19e8770-f0c1-491e-96c9-f737386ab3b0-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.940679 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d19e8770-f0c1-491e-96c9-f737386ab3b0-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.940712 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d19e8770-f0c1-491e-96c9-f737386ab3b0-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.940743 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqc45\" (UniqueName: \"kubernetes.io/projected/d19e8770-f0c1-491e-96c9-f737386ab3b0-kube-api-access-mqc45\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " 
pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.941757 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d19e8770-f0c1-491e-96c9-f737386ab3b0-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.942662 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d19e8770-f0c1-491e-96c9-f737386ab3b0-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.942773 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d19e8770-f0c1-491e-96c9-f737386ab3b0-config-data\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.943278 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d19e8770-f0c1-491e-96c9-f737386ab3b0-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.945612 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc 
kubenswrapper[4898]: I0313 15:12:22.947800 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d19e8770-f0c1-491e-96c9-f737386ab3b0-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.947801 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d19e8770-f0c1-491e-96c9-f737386ab3b0-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.948630 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d19e8770-f0c1-491e-96c9-f737386ab3b0-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.978088 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqc45\" (UniqueName: \"kubernetes.io/projected/d19e8770-f0c1-491e-96c9-f737386ab3b0-kube-api-access-mqc45\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:22 crc kubenswrapper[4898]: I0313 15:12:22.996863 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"tempest-tests-tempest\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " pod="openstack/tempest-tests-tempest" Mar 13 15:12:23 crc kubenswrapper[4898]: I0313 15:12:23.156961 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 13 15:12:23 crc kubenswrapper[4898]: I0313 15:12:23.677312 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 13 15:12:24 crc kubenswrapper[4898]: I0313 15:12:24.439725 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d19e8770-f0c1-491e-96c9-f737386ab3b0","Type":"ContainerStarted","Data":"afe41cfa21ca0ff15752a7557715bbecbe54855edbe2061b3b072582d6fab3b3"} Mar 13 15:12:25 crc kubenswrapper[4898]: E0313 15:12:25.203869 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = parsing image configuration: Get \"https://cdn01.quay.io/quayio-production-s3/sha256/8b/8b47586b9dc9859845a0009766c3842adb98f7625670e094cb04f6c938ff2e60?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20260313%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20260313T151224Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=b388b6763fa77f376478a9daff15dddc4e86f9a95b88eb81ecf62c8aceedcbcd&region=us-east-1&namespace=podified-antelope-centos9&username=openshift-release-dev+ocm_access_1b89217552bc42d1be3fb06a1aed001a&repo_name=openstack-tempest-all&akamai_signature=exp=1773415644~hmac=0abc8e338520ac5b8bf28d10fb829d398c4466399fe27974a04e18a24f6702d9\": remote error: tls: internal error" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 13 15:12:25 crc kubenswrapper[4898]: E0313 15:12:25.204454 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mqc45,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(d19e8770-f0c1-491e-96c9-f737386ab3b0): ErrImagePull: parsing image configuration: Get \"https://cdn01.quay.io/quayio-production-s3/sha256/8b/8b47586b9dc9859845a0009766c3842adb98f7625670e094cb04f6c938ff2e60?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20260313%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20260313T151224Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=b388b6763fa77f376478a9daff15dddc4e86f9a95b88eb81ecf62c8aceedcbcd&region=us-east-1&namespace=podified-antelope-centos9&username=openshift-release-dev+ocm_access_1b89217552bc42d1be3fb06a1aed001a&repo_name=openstack-tempest-all&akamai_signature=exp=1773415644~hmac=0abc8e338520ac5b8bf28d10fb829d398c4466399fe27974a04e18a24f6702d9\": remote error: tls: internal error" logger="UnhandledError" Mar 13 15:12:25 crc kubenswrapper[4898]: E0313 15:12:25.205836 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"parsing image configuration: Get \\\"https://cdn01.quay.io/quayio-production-s3/sha256/8b/8b47586b9dc9859845a0009766c3842adb98f7625670e094cb04f6c938ff2e60?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20260313%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20260313T151224Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=b388b6763fa77f376478a9daff15dddc4e86f9a95b88eb81ecf62c8aceedcbcd®ion=us-east-1&namespace=podified-antelope-centos9&username=openshift-release-dev+ocm_access_1b89217552bc42d1be3fb06a1aed001a&repo_name=openstack-tempest-all&akamai_signature=exp=1773415644~hmac=0abc8e338520ac5b8bf28d10fb829d398c4466399fe27974a04e18a24f6702d9\\\": remote error: tls: internal error\"" pod="openstack/tempest-tests-tempest" podUID="d19e8770-f0c1-491e-96c9-f737386ab3b0" Mar 13 15:12:25 crc kubenswrapper[4898]: E0313 15:12:25.460151 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="d19e8770-f0c1-491e-96c9-f737386ab3b0" Mar 13 15:12:31 crc kubenswrapper[4898]: I0313 15:12:31.739882 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" Mar 13 15:12:31 crc kubenswrapper[4898]: E0313 15:12:31.740967 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:12:43 crc kubenswrapper[4898]: I0313 
15:12:43.556166 4898 scope.go:117] "RemoveContainer" containerID="64c72ba59c55becc3958f8ce26ee068bc7e3f6141dbbe5fe4984f9a120884dca" Mar 13 15:12:43 crc kubenswrapper[4898]: I0313 15:12:43.740363 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" Mar 13 15:12:43 crc kubenswrapper[4898]: E0313 15:12:43.741239 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:12:56 crc kubenswrapper[4898]: I0313 15:12:56.739869 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" Mar 13 15:12:56 crc kubenswrapper[4898]: E0313 15:12:56.740760 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:13:10 crc kubenswrapper[4898]: I0313 15:13:10.740534 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" Mar 13 15:13:10 crc kubenswrapper[4898]: E0313 15:13:10.742698 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:13:16 crc kubenswrapper[4898]: E0313 15:13:16.053589 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 13 15:13:16 crc kubenswrapper[4898]: E0313 15:13:16.054431 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,Su
bPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mqc45,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(d19e8770-f0c1-491e-96c9-f737386ab3b0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 15:13:16 crc kubenswrapper[4898]: E0313 15:13:16.055719 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="d19e8770-f0c1-491e-96c9-f737386ab3b0" Mar 13 15:13:23 crc kubenswrapper[4898]: I0313 15:13:23.742169 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" Mar 13 15:13:23 crc kubenswrapper[4898]: E0313 15:13:23.744341 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:13:30 crc kubenswrapper[4898]: E0313 15:13:30.742513 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="d19e8770-f0c1-491e-96c9-f737386ab3b0" Mar 13 15:13:37 crc kubenswrapper[4898]: I0313 15:13:37.740957 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" Mar 13 15:13:37 crc kubenswrapper[4898]: E0313 15:13:37.742434 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:13:43 crc 
kubenswrapper[4898]: I0313 15:13:43.207620 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 13 15:13:45 crc kubenswrapper[4898]: I0313 15:13:45.460317 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d19e8770-f0c1-491e-96c9-f737386ab3b0","Type":"ContainerStarted","Data":"f6562a9a91d72757d77dbf107969b6ab33c4a3a7219f9b57d0df0ee10184af60"} Mar 13 15:13:45 crc kubenswrapper[4898]: I0313 15:13:45.485391 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.965285636 podStartE2EDuration="1m24.485371067s" podCreationTimestamp="2026-03-13 15:12:21 +0000 UTC" firstStartedPulling="2026-03-13 15:12:23.684464892 +0000 UTC m=+4578.686053131" lastFinishedPulling="2026-03-13 15:13:43.204550313 +0000 UTC m=+4658.206138562" observedRunningTime="2026-03-13 15:13:45.48025744 +0000 UTC m=+4660.481845689" watchObservedRunningTime="2026-03-13 15:13:45.485371067 +0000 UTC m=+4660.486959306" Mar 13 15:13:50 crc kubenswrapper[4898]: I0313 15:13:50.739602 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" Mar 13 15:13:51 crc kubenswrapper[4898]: I0313 15:13:51.548352 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"bf9fa5bb76f8bd5a010d026caf62189a87b342669ddb0345c62f785750fd30c1"} Mar 13 15:14:00 crc kubenswrapper[4898]: I0313 15:14:00.158409 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556914-52q7k"] Mar 13 15:14:00 crc kubenswrapper[4898]: I0313 15:14:00.162887 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556914-52q7k" Mar 13 15:14:00 crc kubenswrapper[4898]: I0313 15:14:00.166614 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 15:14:00 crc kubenswrapper[4898]: I0313 15:14:00.166708 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:14:00 crc kubenswrapper[4898]: I0313 15:14:00.167207 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:14:00 crc kubenswrapper[4898]: I0313 15:14:00.176552 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556914-52q7k"] Mar 13 15:14:00 crc kubenswrapper[4898]: I0313 15:14:00.308225 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5pcz\" (UniqueName: \"kubernetes.io/projected/ad898ac1-9e95-4eb8-a88b-927e3d6364f6-kube-api-access-c5pcz\") pod \"auto-csr-approver-29556914-52q7k\" (UID: \"ad898ac1-9e95-4eb8-a88b-927e3d6364f6\") " pod="openshift-infra/auto-csr-approver-29556914-52q7k" Mar 13 15:14:00 crc kubenswrapper[4898]: I0313 15:14:00.411861 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5pcz\" (UniqueName: \"kubernetes.io/projected/ad898ac1-9e95-4eb8-a88b-927e3d6364f6-kube-api-access-c5pcz\") pod \"auto-csr-approver-29556914-52q7k\" (UID: \"ad898ac1-9e95-4eb8-a88b-927e3d6364f6\") " pod="openshift-infra/auto-csr-approver-29556914-52q7k" Mar 13 15:14:00 crc kubenswrapper[4898]: I0313 15:14:00.433682 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5pcz\" (UniqueName: \"kubernetes.io/projected/ad898ac1-9e95-4eb8-a88b-927e3d6364f6-kube-api-access-c5pcz\") pod \"auto-csr-approver-29556914-52q7k\" (UID: \"ad898ac1-9e95-4eb8-a88b-927e3d6364f6\") " 
pod="openshift-infra/auto-csr-approver-29556914-52q7k" Mar 13 15:14:00 crc kubenswrapper[4898]: I0313 15:14:00.500362 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556914-52q7k" Mar 13 15:14:01 crc kubenswrapper[4898]: I0313 15:14:01.057075 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556914-52q7k"] Mar 13 15:14:01 crc kubenswrapper[4898]: I0313 15:14:01.687056 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556914-52q7k" event={"ID":"ad898ac1-9e95-4eb8-a88b-927e3d6364f6","Type":"ContainerStarted","Data":"b6828d627192e989a1b2a59091f7130b2e5b82359ed7df4b9b0ac989f18cc295"} Mar 13 15:14:04 crc kubenswrapper[4898]: I0313 15:14:04.723434 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556914-52q7k" event={"ID":"ad898ac1-9e95-4eb8-a88b-927e3d6364f6","Type":"ContainerStarted","Data":"7e1660e6d2126df6f52c14a1146a22533f711db09e61120239f48e0f06547dd2"} Mar 13 15:14:04 crc kubenswrapper[4898]: I0313 15:14:04.743831 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556914-52q7k" podStartSLOduration=3.502775568 podStartE2EDuration="4.743812719s" podCreationTimestamp="2026-03-13 15:14:00 +0000 UTC" firstStartedPulling="2026-03-13 15:14:01.061820743 +0000 UTC m=+4676.063408982" lastFinishedPulling="2026-03-13 15:14:02.302857894 +0000 UTC m=+4677.304446133" observedRunningTime="2026-03-13 15:14:04.742123111 +0000 UTC m=+4679.743711370" watchObservedRunningTime="2026-03-13 15:14:04.743812719 +0000 UTC m=+4679.745400958" Mar 13 15:14:05 crc kubenswrapper[4898]: I0313 15:14:05.738251 4898 generic.go:334] "Generic (PLEG): container finished" podID="ad898ac1-9e95-4eb8-a88b-927e3d6364f6" containerID="7e1660e6d2126df6f52c14a1146a22533f711db09e61120239f48e0f06547dd2" exitCode=0 Mar 13 15:14:05 crc 
kubenswrapper[4898]: I0313 15:14:05.738303 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556914-52q7k" event={"ID":"ad898ac1-9e95-4eb8-a88b-927e3d6364f6","Type":"ContainerDied","Data":"7e1660e6d2126df6f52c14a1146a22533f711db09e61120239f48e0f06547dd2"} Mar 13 15:14:07 crc kubenswrapper[4898]: I0313 15:14:07.196472 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556914-52q7k" Mar 13 15:14:07 crc kubenswrapper[4898]: I0313 15:14:07.288292 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5pcz\" (UniqueName: \"kubernetes.io/projected/ad898ac1-9e95-4eb8-a88b-927e3d6364f6-kube-api-access-c5pcz\") pod \"ad898ac1-9e95-4eb8-a88b-927e3d6364f6\" (UID: \"ad898ac1-9e95-4eb8-a88b-927e3d6364f6\") " Mar 13 15:14:07 crc kubenswrapper[4898]: I0313 15:14:07.298952 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad898ac1-9e95-4eb8-a88b-927e3d6364f6-kube-api-access-c5pcz" (OuterVolumeSpecName: "kube-api-access-c5pcz") pod "ad898ac1-9e95-4eb8-a88b-927e3d6364f6" (UID: "ad898ac1-9e95-4eb8-a88b-927e3d6364f6"). InnerVolumeSpecName "kube-api-access-c5pcz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:14:07 crc kubenswrapper[4898]: I0313 15:14:07.391686 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5pcz\" (UniqueName: \"kubernetes.io/projected/ad898ac1-9e95-4eb8-a88b-927e3d6364f6-kube-api-access-c5pcz\") on node \"crc\" DevicePath \"\"" Mar 13 15:14:07 crc kubenswrapper[4898]: I0313 15:14:07.762423 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556914-52q7k" event={"ID":"ad898ac1-9e95-4eb8-a88b-927e3d6364f6","Type":"ContainerDied","Data":"b6828d627192e989a1b2a59091f7130b2e5b82359ed7df4b9b0ac989f18cc295"} Mar 13 15:14:07 crc kubenswrapper[4898]: I0313 15:14:07.762736 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6828d627192e989a1b2a59091f7130b2e5b82359ed7df4b9b0ac989f18cc295" Mar 13 15:14:07 crc kubenswrapper[4898]: I0313 15:14:07.762510 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556914-52q7k" Mar 13 15:14:07 crc kubenswrapper[4898]: I0313 15:14:07.818617 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556908-vplt7"] Mar 13 15:14:07 crc kubenswrapper[4898]: I0313 15:14:07.828873 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556908-vplt7"] Mar 13 15:14:09 crc kubenswrapper[4898]: I0313 15:14:09.778027 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df9e42bc-c4a2-4ccc-ad85-5ca077abfd88" path="/var/lib/kubelet/pods/df9e42bc-c4a2-4ccc-ad85-5ca077abfd88/volumes" Mar 13 15:14:46 crc kubenswrapper[4898]: I0313 15:14:46.064668 4898 scope.go:117] "RemoveContainer" containerID="020c072c01677481578a21e99a6c39f8522847520765e8695da295955dd3e290" Mar 13 15:15:00 crc kubenswrapper[4898]: I0313 15:15:00.487638 4898 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29556915-mwp4c"] Mar 13 15:15:00 crc kubenswrapper[4898]: E0313 15:15:00.492204 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad898ac1-9e95-4eb8-a88b-927e3d6364f6" containerName="oc" Mar 13 15:15:00 crc kubenswrapper[4898]: I0313 15:15:00.492233 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad898ac1-9e95-4eb8-a88b-927e3d6364f6" containerName="oc" Mar 13 15:15:00 crc kubenswrapper[4898]: I0313 15:15:00.494383 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad898ac1-9e95-4eb8-a88b-927e3d6364f6" containerName="oc" Mar 13 15:15:00 crc kubenswrapper[4898]: I0313 15:15:00.501144 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-mwp4c" Mar 13 15:15:00 crc kubenswrapper[4898]: I0313 15:15:00.517197 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 15:15:00 crc kubenswrapper[4898]: I0313 15:15:00.517203 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 15:15:00 crc kubenswrapper[4898]: I0313 15:15:00.603747 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556915-mwp4c"] Mar 13 15:15:00 crc kubenswrapper[4898]: I0313 15:15:00.678919 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34-secret-volume\") pod \"collect-profiles-29556915-mwp4c\" (UID: \"9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-mwp4c" Mar 13 15:15:00 crc kubenswrapper[4898]: I0313 15:15:00.679050 4898 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc9xx\" (UniqueName: \"kubernetes.io/projected/9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34-kube-api-access-mc9xx\") pod \"collect-profiles-29556915-mwp4c\" (UID: \"9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-mwp4c" Mar 13 15:15:00 crc kubenswrapper[4898]: I0313 15:15:00.679083 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34-config-volume\") pod \"collect-profiles-29556915-mwp4c\" (UID: \"9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-mwp4c" Mar 13 15:15:00 crc kubenswrapper[4898]: I0313 15:15:00.804990 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34-secret-volume\") pod \"collect-profiles-29556915-mwp4c\" (UID: \"9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-mwp4c" Mar 13 15:15:00 crc kubenswrapper[4898]: I0313 15:15:00.805234 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc9xx\" (UniqueName: \"kubernetes.io/projected/9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34-kube-api-access-mc9xx\") pod \"collect-profiles-29556915-mwp4c\" (UID: \"9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-mwp4c" Mar 13 15:15:00 crc kubenswrapper[4898]: I0313 15:15:00.805284 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34-config-volume\") pod \"collect-profiles-29556915-mwp4c\" (UID: \"9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-mwp4c" Mar 13 15:15:00 crc kubenswrapper[4898]: I0313 15:15:00.825140 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34-config-volume\") pod \"collect-profiles-29556915-mwp4c\" (UID: \"9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-mwp4c" Mar 13 15:15:00 crc kubenswrapper[4898]: I0313 15:15:00.845588 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34-secret-volume\") pod \"collect-profiles-29556915-mwp4c\" (UID: \"9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-mwp4c" Mar 13 15:15:00 crc kubenswrapper[4898]: I0313 15:15:00.846954 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc9xx\" (UniqueName: \"kubernetes.io/projected/9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34-kube-api-access-mc9xx\") pod \"collect-profiles-29556915-mwp4c\" (UID: \"9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-mwp4c" Mar 13 15:15:00 crc kubenswrapper[4898]: I0313 15:15:00.870870 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-mwp4c" Mar 13 15:15:02 crc kubenswrapper[4898]: I0313 15:15:02.355942 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556915-mwp4c"] Mar 13 15:15:02 crc kubenswrapper[4898]: W0313 15:15:02.401929 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ed7bf25_9e0b_4f13_9ff6_797cd1e6eb34.slice/crio-52306737f5569e6320dc7aa267aaca043c1d2a750fcb4919faad76c0c1ac15fe WatchSource:0}: Error finding container 52306737f5569e6320dc7aa267aaca043c1d2a750fcb4919faad76c0c1ac15fe: Status 404 returned error can't find the container with id 52306737f5569e6320dc7aa267aaca043c1d2a750fcb4919faad76c0c1ac15fe Mar 13 15:15:03 crc kubenswrapper[4898]: I0313 15:15:03.397944 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-mwp4c" event={"ID":"9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34","Type":"ContainerStarted","Data":"ba0567c7d801e11f8ad2230e5cbed9394ee6b7a10fbe11e35e978ddc27d03fd5"} Mar 13 15:15:03 crc kubenswrapper[4898]: I0313 15:15:03.398304 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-mwp4c" event={"ID":"9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34","Type":"ContainerStarted","Data":"52306737f5569e6320dc7aa267aaca043c1d2a750fcb4919faad76c0c1ac15fe"} Mar 13 15:15:03 crc kubenswrapper[4898]: I0313 15:15:03.433397 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-mwp4c" podStartSLOduration=3.431864052 podStartE2EDuration="3.431864052s" podCreationTimestamp="2026-03-13 15:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 
15:15:03.414885976 +0000 UTC m=+4738.416474235" watchObservedRunningTime="2026-03-13 15:15:03.431864052 +0000 UTC m=+4738.433452291" Mar 13 15:15:04 crc kubenswrapper[4898]: I0313 15:15:04.410991 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-mwp4c" event={"ID":"9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34","Type":"ContainerDied","Data":"ba0567c7d801e11f8ad2230e5cbed9394ee6b7a10fbe11e35e978ddc27d03fd5"} Mar 13 15:15:04 crc kubenswrapper[4898]: I0313 15:15:04.410852 4898 generic.go:334] "Generic (PLEG): container finished" podID="9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34" containerID="ba0567c7d801e11f8ad2230e5cbed9394ee6b7a10fbe11e35e978ddc27d03fd5" exitCode=0 Mar 13 15:15:07 crc kubenswrapper[4898]: I0313 15:15:07.005026 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-mwp4c" Mar 13 15:15:07 crc kubenswrapper[4898]: I0313 15:15:07.111849 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34-secret-volume\") pod \"9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34\" (UID: \"9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34\") " Mar 13 15:15:07 crc kubenswrapper[4898]: I0313 15:15:07.111958 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34-config-volume\") pod \"9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34\" (UID: \"9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34\") " Mar 13 15:15:07 crc kubenswrapper[4898]: I0313 15:15:07.112230 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mc9xx\" (UniqueName: \"kubernetes.io/projected/9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34-kube-api-access-mc9xx\") pod \"9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34\" (UID: 
\"9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34\") " Mar 13 15:15:07 crc kubenswrapper[4898]: I0313 15:15:07.119025 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34-config-volume" (OuterVolumeSpecName: "config-volume") pod "9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34" (UID: "9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:15:07 crc kubenswrapper[4898]: I0313 15:15:07.140968 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34" (UID: "9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:15:07 crc kubenswrapper[4898]: I0313 15:15:07.141132 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34-kube-api-access-mc9xx" (OuterVolumeSpecName: "kube-api-access-mc9xx") pod "9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34" (UID: "9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34"). InnerVolumeSpecName "kube-api-access-mc9xx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:15:07 crc kubenswrapper[4898]: I0313 15:15:07.215831 4898 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 15:15:07 crc kubenswrapper[4898]: I0313 15:15:07.215874 4898 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 15:15:07 crc kubenswrapper[4898]: I0313 15:15:07.215887 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mc9xx\" (UniqueName: \"kubernetes.io/projected/9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34-kube-api-access-mc9xx\") on node \"crc\" DevicePath \"\"" Mar 13 15:15:07 crc kubenswrapper[4898]: I0313 15:15:07.449216 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-mwp4c" event={"ID":"9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34","Type":"ContainerDied","Data":"52306737f5569e6320dc7aa267aaca043c1d2a750fcb4919faad76c0c1ac15fe"} Mar 13 15:15:07 crc kubenswrapper[4898]: I0313 15:15:07.449272 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52306737f5569e6320dc7aa267aaca043c1d2a750fcb4919faad76c0c1ac15fe" Mar 13 15:15:07 crc kubenswrapper[4898]: I0313 15:15:07.449333 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556915-mwp4c" Mar 13 15:15:08 crc kubenswrapper[4898]: I0313 15:15:08.145876 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p"] Mar 13 15:15:08 crc kubenswrapper[4898]: I0313 15:15:08.158211 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556870-v7r2p"] Mar 13 15:15:09 crc kubenswrapper[4898]: I0313 15:15:09.763629 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19" path="/var/lib/kubelet/pods/924ed9fd-c9e5-4462-9b97-6d6cd1e8ea19/volumes" Mar 13 15:15:23 crc kubenswrapper[4898]: I0313 15:15:23.210958 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dnh48"] Mar 13 15:15:23 crc kubenswrapper[4898]: E0313 15:15:23.216351 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34" containerName="collect-profiles" Mar 13 15:15:23 crc kubenswrapper[4898]: I0313 15:15:23.216961 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34" containerName="collect-profiles" Mar 13 15:15:23 crc kubenswrapper[4898]: I0313 15:15:23.221928 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ed7bf25-9e0b-4f13-9ff6-797cd1e6eb34" containerName="collect-profiles" Mar 13 15:15:23 crc kubenswrapper[4898]: I0313 15:15:23.230997 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dnh48" Mar 13 15:15:23 crc kubenswrapper[4898]: I0313 15:15:23.362522 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dnh48"] Mar 13 15:15:23 crc kubenswrapper[4898]: I0313 15:15:23.416782 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb46c8b0-a6a9-4b6d-86a1-8408793887e5-catalog-content\") pod \"redhat-operators-dnh48\" (UID: \"cb46c8b0-a6a9-4b6d-86a1-8408793887e5\") " pod="openshift-marketplace/redhat-operators-dnh48" Mar 13 15:15:23 crc kubenswrapper[4898]: I0313 15:15:23.417386 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnffg\" (UniqueName: \"kubernetes.io/projected/cb46c8b0-a6a9-4b6d-86a1-8408793887e5-kube-api-access-mnffg\") pod \"redhat-operators-dnh48\" (UID: \"cb46c8b0-a6a9-4b6d-86a1-8408793887e5\") " pod="openshift-marketplace/redhat-operators-dnh48" Mar 13 15:15:23 crc kubenswrapper[4898]: I0313 15:15:23.417606 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb46c8b0-a6a9-4b6d-86a1-8408793887e5-utilities\") pod \"redhat-operators-dnh48\" (UID: \"cb46c8b0-a6a9-4b6d-86a1-8408793887e5\") " pod="openshift-marketplace/redhat-operators-dnh48" Mar 13 15:15:23 crc kubenswrapper[4898]: I0313 15:15:23.519955 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb46c8b0-a6a9-4b6d-86a1-8408793887e5-catalog-content\") pod \"redhat-operators-dnh48\" (UID: \"cb46c8b0-a6a9-4b6d-86a1-8408793887e5\") " pod="openshift-marketplace/redhat-operators-dnh48" Mar 13 15:15:23 crc kubenswrapper[4898]: I0313 15:15:23.520095 4898 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-mnffg\" (UniqueName: \"kubernetes.io/projected/cb46c8b0-a6a9-4b6d-86a1-8408793887e5-kube-api-access-mnffg\") pod \"redhat-operators-dnh48\" (UID: \"cb46c8b0-a6a9-4b6d-86a1-8408793887e5\") " pod="openshift-marketplace/redhat-operators-dnh48" Mar 13 15:15:23 crc kubenswrapper[4898]: I0313 15:15:23.520163 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb46c8b0-a6a9-4b6d-86a1-8408793887e5-utilities\") pod \"redhat-operators-dnh48\" (UID: \"cb46c8b0-a6a9-4b6d-86a1-8408793887e5\") " pod="openshift-marketplace/redhat-operators-dnh48" Mar 13 15:15:23 crc kubenswrapper[4898]: I0313 15:15:23.532096 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb46c8b0-a6a9-4b6d-86a1-8408793887e5-utilities\") pod \"redhat-operators-dnh48\" (UID: \"cb46c8b0-a6a9-4b6d-86a1-8408793887e5\") " pod="openshift-marketplace/redhat-operators-dnh48" Mar 13 15:15:23 crc kubenswrapper[4898]: I0313 15:15:23.533257 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb46c8b0-a6a9-4b6d-86a1-8408793887e5-catalog-content\") pod \"redhat-operators-dnh48\" (UID: \"cb46c8b0-a6a9-4b6d-86a1-8408793887e5\") " pod="openshift-marketplace/redhat-operators-dnh48" Mar 13 15:15:23 crc kubenswrapper[4898]: I0313 15:15:23.588350 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnffg\" (UniqueName: \"kubernetes.io/projected/cb46c8b0-a6a9-4b6d-86a1-8408793887e5-kube-api-access-mnffg\") pod \"redhat-operators-dnh48\" (UID: \"cb46c8b0-a6a9-4b6d-86a1-8408793887e5\") " pod="openshift-marketplace/redhat-operators-dnh48" Mar 13 15:15:23 crc kubenswrapper[4898]: I0313 15:15:23.882074 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dnh48" Mar 13 15:15:25 crc kubenswrapper[4898]: I0313 15:15:25.441732 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dnh48"] Mar 13 15:15:25 crc kubenswrapper[4898]: I0313 15:15:25.650282 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnh48" event={"ID":"cb46c8b0-a6a9-4b6d-86a1-8408793887e5","Type":"ContainerStarted","Data":"ef80ef6757d050f773b7a3c8ba863e9ba495da2f0964e3cb0f243834ede62c6d"} Mar 13 15:15:26 crc kubenswrapper[4898]: I0313 15:15:26.665647 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnh48" event={"ID":"cb46c8b0-a6a9-4b6d-86a1-8408793887e5","Type":"ContainerDied","Data":"044e1dbede2644956e1b9f4f9606342f16b9eba6e865673613710b9380c20e93"} Mar 13 15:15:26 crc kubenswrapper[4898]: I0313 15:15:26.667522 4898 generic.go:334] "Generic (PLEG): container finished" podID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" containerID="044e1dbede2644956e1b9f4f9606342f16b9eba6e865673613710b9380c20e93" exitCode=0 Mar 13 15:15:26 crc kubenswrapper[4898]: I0313 15:15:26.675025 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 15:15:28 crc kubenswrapper[4898]: E0313 15:15:28.145339 4898 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: parsing image configuration: Get 
\"https://cdn01.quay.io/quayio-production-s3/sha256/10/108b2a0c594729551b6547de5641f59a94e5b0352f4a4d63f0b44f8449d70766?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20260313%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20260313T151527Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=b09ad762917e163a6364894c5fc8dcbe38c3902f85afae09b73a1acab6f6d5c9&region=us-east-1&namespace=redhat-prod&username=redhat-prod+registry_proxy&repo_name=redhat----redhat-operator-index&akamai_signature=exp=1773415827~hmac=a37d420feaa2bf0534a876f9537b59caa67bd4af10f90723d14594be2e64147a\": remote error: tls: internal error" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 13 15:15:28 crc kubenswrapper[4898]: E0313 15:15:28.147766 4898 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mnffg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-dnh48_openshift-marketplace(cb46c8b0-a6a9-4b6d-86a1-8408793887e5): ErrImagePull: copying system image from manifest list: parsing image configuration: Get 
\"https://cdn01.quay.io/quayio-production-s3/sha256/10/108b2a0c594729551b6547de5641f59a94e5b0352f4a4d63f0b44f8449d70766?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20260313%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20260313T151527Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=b09ad762917e163a6364894c5fc8dcbe38c3902f85afae09b73a1acab6f6d5c9&region=us-east-1&namespace=redhat-prod&username=redhat-prod+registry_proxy&repo_name=redhat----redhat-operator-index&akamai_signature=exp=1773415827~hmac=a37d420feaa2bf0534a876f9537b59caa67bd4af10f90723d14594be2e64147a\": remote error: tls: internal error" logger="UnhandledError" Mar 13 15:15:28 crc kubenswrapper[4898]: E0313 15:15:28.149552 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"copying system image from manifest list: parsing image configuration: Get \\\"https://cdn01.quay.io/quayio-production-s3/sha256/10/108b2a0c594729551b6547de5641f59a94e5b0352f4a4d63f0b44f8449d70766?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIATAAF2YHTGR23ZTE6%2F20260313%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20260313T151527Z&X-Amz-Expires=600&X-Amz-SignedHeaders=host&X-Amz-Signature=b09ad762917e163a6364894c5fc8dcbe38c3902f85afae09b73a1acab6f6d5c9&region=us-east-1&namespace=redhat-prod&username=redhat-prod+registry_proxy&repo_name=redhat----redhat-operator-index&akamai_signature=exp=1773415827~hmac=a37d420feaa2bf0534a876f9537b59caa67bd4af10f90723d14594be2e64147a\\\": remote error: tls: internal error\"" pod="openshift-marketplace/redhat-operators-dnh48" podUID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" Mar 13 15:15:28 crc kubenswrapper[4898]: E0313 15:15:28.687456 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" 
pod="openshift-marketplace/redhat-operators-dnh48" podUID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" Mar 13 15:15:29 crc kubenswrapper[4898]: I0313 15:15:29.724824 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jndb5"] Mar 13 15:15:29 crc kubenswrapper[4898]: I0313 15:15:29.732768 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jndb5" Mar 13 15:15:29 crc kubenswrapper[4898]: I0313 15:15:29.793049 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2157e8bf-88a5-4e48-b621-1744dcf0fcdb-catalog-content\") pod \"community-operators-jndb5\" (UID: \"2157e8bf-88a5-4e48-b621-1744dcf0fcdb\") " pod="openshift-marketplace/community-operators-jndb5" Mar 13 15:15:29 crc kubenswrapper[4898]: I0313 15:15:29.793202 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jdgf\" (UniqueName: \"kubernetes.io/projected/2157e8bf-88a5-4e48-b621-1744dcf0fcdb-kube-api-access-9jdgf\") pod \"community-operators-jndb5\" (UID: \"2157e8bf-88a5-4e48-b621-1744dcf0fcdb\") " pod="openshift-marketplace/community-operators-jndb5" Mar 13 15:15:29 crc kubenswrapper[4898]: I0313 15:15:29.793387 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2157e8bf-88a5-4e48-b621-1744dcf0fcdb-utilities\") pod \"community-operators-jndb5\" (UID: \"2157e8bf-88a5-4e48-b621-1744dcf0fcdb\") " pod="openshift-marketplace/community-operators-jndb5" Mar 13 15:15:29 crc kubenswrapper[4898]: I0313 15:15:29.826200 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jndb5"] Mar 13 15:15:29 crc kubenswrapper[4898]: I0313 15:15:29.895239 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9jdgf\" (UniqueName: \"kubernetes.io/projected/2157e8bf-88a5-4e48-b621-1744dcf0fcdb-kube-api-access-9jdgf\") pod \"community-operators-jndb5\" (UID: \"2157e8bf-88a5-4e48-b621-1744dcf0fcdb\") " pod="openshift-marketplace/community-operators-jndb5" Mar 13 15:15:29 crc kubenswrapper[4898]: I0313 15:15:29.895428 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2157e8bf-88a5-4e48-b621-1744dcf0fcdb-utilities\") pod \"community-operators-jndb5\" (UID: \"2157e8bf-88a5-4e48-b621-1744dcf0fcdb\") " pod="openshift-marketplace/community-operators-jndb5" Mar 13 15:15:29 crc kubenswrapper[4898]: I0313 15:15:29.895552 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2157e8bf-88a5-4e48-b621-1744dcf0fcdb-catalog-content\") pod \"community-operators-jndb5\" (UID: \"2157e8bf-88a5-4e48-b621-1744dcf0fcdb\") " pod="openshift-marketplace/community-operators-jndb5" Mar 13 15:15:29 crc kubenswrapper[4898]: I0313 15:15:29.905019 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2157e8bf-88a5-4e48-b621-1744dcf0fcdb-catalog-content\") pod \"community-operators-jndb5\" (UID: \"2157e8bf-88a5-4e48-b621-1744dcf0fcdb\") " pod="openshift-marketplace/community-operators-jndb5" Mar 13 15:15:29 crc kubenswrapper[4898]: I0313 15:15:29.907758 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2157e8bf-88a5-4e48-b621-1744dcf0fcdb-utilities\") pod \"community-operators-jndb5\" (UID: \"2157e8bf-88a5-4e48-b621-1744dcf0fcdb\") " pod="openshift-marketplace/community-operators-jndb5" Mar 13 15:15:29 crc kubenswrapper[4898]: I0313 15:15:29.942742 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9jdgf\" (UniqueName: \"kubernetes.io/projected/2157e8bf-88a5-4e48-b621-1744dcf0fcdb-kube-api-access-9jdgf\") pod \"community-operators-jndb5\" (UID: \"2157e8bf-88a5-4e48-b621-1744dcf0fcdb\") " pod="openshift-marketplace/community-operators-jndb5" Mar 13 15:15:30 crc kubenswrapper[4898]: I0313 15:15:30.069401 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jndb5" Mar 13 15:15:30 crc kubenswrapper[4898]: I0313 15:15:30.818176 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jndb5"] Mar 13 15:15:31 crc kubenswrapper[4898]: I0313 15:15:31.720557 4898 generic.go:334] "Generic (PLEG): container finished" podID="2157e8bf-88a5-4e48-b621-1744dcf0fcdb" containerID="8fdeec86e0943d0ef27683a7197e100bc92b99eb865ac5c8b1d099f233220e22" exitCode=0 Mar 13 15:15:31 crc kubenswrapper[4898]: I0313 15:15:31.720778 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jndb5" event={"ID":"2157e8bf-88a5-4e48-b621-1744dcf0fcdb","Type":"ContainerDied","Data":"8fdeec86e0943d0ef27683a7197e100bc92b99eb865ac5c8b1d099f233220e22"} Mar 13 15:15:31 crc kubenswrapper[4898]: I0313 15:15:31.720979 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jndb5" event={"ID":"2157e8bf-88a5-4e48-b621-1744dcf0fcdb","Type":"ContainerStarted","Data":"ad1f1863b9e188c98c1dfb8a88d0df8d02f680dd682644703708970a9e0dc172"} Mar 13 15:15:33 crc kubenswrapper[4898]: I0313 15:15:33.758999 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jndb5" event={"ID":"2157e8bf-88a5-4e48-b621-1744dcf0fcdb","Type":"ContainerStarted","Data":"3efffca3639dac775d206dcb096dfcdfa29405be3f4a7da8f99d544be88ffc43"} Mar 13 15:15:38 crc kubenswrapper[4898]: I0313 15:15:38.536883 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-c6d797ccf-9qh4r 
container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:15:38 crc kubenswrapper[4898]: I0313 15:15:38.541980 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" podUID="077fcbe8-c497-44b4-82f9-ff8e317cbe83" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:15:38 crc kubenswrapper[4898]: I0313 15:15:38.622178 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-c6d797ccf-8ng9x container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:15:38 crc kubenswrapper[4898]: I0313 15:15:38.622219 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" podUID="13ee53e6-2549-4dd8-91ac-80e4ef2c9d99" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:15:38 crc kubenswrapper[4898]: I0313 15:15:38.726078 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/speaker-g5gqr" podUID="edfd91ee-1246-43b2-84a0-95ea069de402" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:15:38 crc kubenswrapper[4898]: I0313 15:15:38.726087 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/speaker-g5gqr" podUID="edfd91ee-1246-43b2-84a0-95ea069de402" containerName="speaker" probeResult="failure" output="Get 
\"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:15:41 crc kubenswrapper[4898]: I0313 15:15:41.885970 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jndb5" event={"ID":"2157e8bf-88a5-4e48-b621-1744dcf0fcdb","Type":"ContainerDied","Data":"3efffca3639dac775d206dcb096dfcdfa29405be3f4a7da8f99d544be88ffc43"} Mar 13 15:15:41 crc kubenswrapper[4898]: I0313 15:15:41.883355 4898 generic.go:334] "Generic (PLEG): container finished" podID="2157e8bf-88a5-4e48-b621-1744dcf0fcdb" containerID="3efffca3639dac775d206dcb096dfcdfa29405be3f4a7da8f99d544be88ffc43" exitCode=0 Mar 13 15:15:43 crc kubenswrapper[4898]: I0313 15:15:43.938375 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jndb5" event={"ID":"2157e8bf-88a5-4e48-b621-1744dcf0fcdb","Type":"ContainerStarted","Data":"d653e06a922f0f9a98f01d37cd3c761f9af6e7d070c7637908c781d22c20c1cd"} Mar 13 15:15:43 crc kubenswrapper[4898]: I0313 15:15:43.992828 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jndb5" podStartSLOduration=3.705539478 podStartE2EDuration="14.979633626s" podCreationTimestamp="2026-03-13 15:15:29 +0000 UTC" firstStartedPulling="2026-03-13 15:15:31.722142087 +0000 UTC m=+4766.723730326" lastFinishedPulling="2026-03-13 15:15:42.996236235 +0000 UTC m=+4777.997824474" observedRunningTime="2026-03-13 15:15:43.975147514 +0000 UTC m=+4778.976735763" watchObservedRunningTime="2026-03-13 15:15:43.979633626 +0000 UTC m=+4778.981221865" Mar 13 15:15:46 crc kubenswrapper[4898]: I0313 15:15:46.038986 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnh48" event={"ID":"cb46c8b0-a6a9-4b6d-86a1-8408793887e5","Type":"ContainerStarted","Data":"aebe307d7887ba7796f4fad329402cc88b0d4332f82fab0c321325f23bf1adea"} Mar 13 15:15:46 crc 
kubenswrapper[4898]: I0313 15:15:46.351740 4898 scope.go:117] "RemoveContainer" containerID="64e5f9a40c7865406038ca1466eaa2acb4b0149c1cbab8a03ef694cc78da5ccc"
Mar 13 15:15:47 crc kubenswrapper[4898]: I0313 15:15:47.101833 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-bqmxg" podUID="1a7fcb96-7168-4049-8c28-d3f740599e48" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:15:50 crc kubenswrapper[4898]: I0313 15:15:50.087835 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jndb5"
Mar 13 15:15:50 crc kubenswrapper[4898]: I0313 15:15:50.088430 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jndb5"
Mar 13 15:15:50 crc kubenswrapper[4898]: I0313 15:15:50.624574 4898 patch_prober.go:28] interesting pod/monitoring-plugin-595dc77696-pft4c container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.90:9443/health\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:15:50 crc kubenswrapper[4898]: I0313 15:15:50.628820 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-595dc77696-pft4c" podUID="10c7ab08-2341-4e85-ad67-8495e038afa2" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.90:9443/health\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:15:51 crc kubenswrapper[4898]: I0313 15:15:51.434131 4898 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-x85vd container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:5443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:15:51 crc kubenswrapper[4898]: I0313 15:15:51.434486 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" podUID="7667c5a1-aecb-4ccd-b8fd-e20c2c049472" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.29:5443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:15:51 crc kubenswrapper[4898]: I0313 15:15:51.434158 4898 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-x85vd container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.29:5443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:15:51 crc kubenswrapper[4898]: I0313 15:15:51.434559 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" podUID="7667c5a1-aecb-4ccd-b8fd-e20c2c049472" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.29:5443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:15:52 crc kubenswrapper[4898]: I0313 15:15:52.169838 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-jndb5" podUID="2157e8bf-88a5-4e48-b621-1744dcf0fcdb" containerName="registry-server" probeResult="failure" output=<
Mar 13 15:15:52 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s
Mar 13 15:15:52 crc kubenswrapper[4898]: >
Mar 13 15:15:56 crc kubenswrapper[4898]: I0313 15:15:56.486105 4898 patch_prober.go:28] interesting pod/route-controller-manager-7b756f97f-wjsf2 container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.74:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:15:56 crc kubenswrapper[4898]: I0313 15:15:56.486619 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" podUID="0376a3d3-f3a2-4674-a7f9-b06a9e62836e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.74:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:15:56 crc kubenswrapper[4898]: I0313 15:15:56.486174 4898 patch_prober.go:28] interesting pod/route-controller-manager-7b756f97f-wjsf2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.74:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:15:56 crc kubenswrapper[4898]: I0313 15:15:56.486767 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" podUID="0376a3d3-f3a2-4674-a7f9-b06a9e62836e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.74:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:15:57 crc kubenswrapper[4898]: I0313 15:15:57.098656 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-bqmxg" podUID="1a7fcb96-7168-4049-8c28-d3f740599e48" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:15:59 crc kubenswrapper[4898]: I0313 15:15:59.831288 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="e5d53cf3-113e-4391-b3a9-4e1f81836e26" containerName="galera" probeResult="failure" output="command timed out"
Mar 13 15:15:59 crc kubenswrapper[4898]: I0313 15:15:59.832001 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="e5d53cf3-113e-4391-b3a9-4e1f81836e26" containerName="galera" probeResult="failure" output="command timed out"
Mar 13 15:16:00 crc kubenswrapper[4898]: I0313 15:16:00.123133 4898 patch_prober.go:28] interesting pod/metrics-server-7b77fdd7dd-vwwfr container/metrics-server namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.89:10250/livez\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:00 crc kubenswrapper[4898]: I0313 15:16:00.123220 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" podUID="823ccfb8-89eb-409e-9c6c-579bacb35ea1" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.89:10250/livez\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:00 crc kubenswrapper[4898]: I0313 15:16:00.252086 4898 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-ljrtz container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.16:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:00 crc kubenswrapper[4898]: I0313 15:16:00.252123 4898 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-ljrtz container/operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.16:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:00 crc kubenswrapper[4898]: I0313 15:16:00.252226 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" podUID="3bfc0332-bb59-42bf-bb70-462efa225c81" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.16:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:00 crc kubenswrapper[4898]: I0313 15:16:00.252247 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" podUID="3bfc0332-bb59-42bf-bb70-462efa225c81" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.16:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:00 crc kubenswrapper[4898]: I0313 15:16:00.353881 4898 patch_prober.go:28] interesting pod/perses-operator-5bf474d74f-nkt76 container/perses-operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.28:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:00 crc kubenswrapper[4898]: I0313 15:16:00.354250 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/perses-operator-5bf474d74f-nkt76" podUID="79ead8ee-67ba-4831-b5d4-a1f128e94334" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.28:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:00 crc kubenswrapper[4898]: I0313 15:16:00.353924 4898 patch_prober.go:28] interesting pod/perses-operator-5bf474d74f-nkt76 container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.28:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:00 crc kubenswrapper[4898]: I0313 15:16:00.354385 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-5bf474d74f-nkt76" podUID="79ead8ee-67ba-4831-b5d4-a1f128e94334" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.28:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:00 crc kubenswrapper[4898]: I0313 15:16:00.498135 4898 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-v9lxv container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:00 crc kubenswrapper[4898]: I0313 15:16:00.498197 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" podUID="6f12557e-02f5-4445-988f-b19f16672e3b" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:00 crc kubenswrapper[4898]: I0313 15:16:00.692188 4898 patch_prober.go:28] interesting pod/monitoring-plugin-595dc77696-pft4c container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.90:9443/health\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:00 crc kubenswrapper[4898]: I0313 15:16:00.692191 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:00 crc kubenswrapper[4898]: I0313 15:16:00.692258 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-595dc77696-pft4c" podUID="10c7ab08-2341-4e85-ad67-8495e038afa2" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.90:9443/health\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:00 crc kubenswrapper[4898]: I0313 15:16:00.692322 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:00 crc kubenswrapper[4898]: I0313 15:16:00.692228 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:00 crc kubenswrapper[4898]: I0313 15:16:00.692441 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:00 crc kubenswrapper[4898]: I0313 15:16:00.831572 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f" containerName="galera" probeResult="failure" output="command timed out"
Mar 13 15:16:00 crc kubenswrapper[4898]: I0313 15:16:00.831689 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f" containerName="galera" probeResult="failure" output="command timed out"
Mar 13 15:16:01 crc kubenswrapper[4898]: I0313 15:16:01.404646 4898 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-qtdtw container/catalog-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:01 crc kubenswrapper[4898]: I0313 15:16:01.404731 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" podUID="7d27543e-df10-41f7-be85-dfe319aaec8a" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:01 crc kubenswrapper[4898]: I0313 15:16:01.404646 4898 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-qtdtw container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:01 crc kubenswrapper[4898]: I0313 15:16:01.404816 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" podUID="7d27543e-df10-41f7-be85-dfe319aaec8a" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:01 crc kubenswrapper[4898]: I0313 15:16:01.433994 4898 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-x85vd container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.29:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:01 crc kubenswrapper[4898]: I0313 15:16:01.434074 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" podUID="7667c5a1-aecb-4ccd-b8fd-e20c2c049472" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.29:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:01 crc kubenswrapper[4898]: I0313 15:16:01.434009 4898 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-k6lrz container/olm-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:01 crc kubenswrapper[4898]: I0313 15:16:01.434210 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" podUID="c8b0b1cf-022c-4181-a957-2f7e172a3294" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:01 crc kubenswrapper[4898]: I0313 15:16:01.434230 4898 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-x85vd container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:01 crc kubenswrapper[4898]: I0313 15:16:01.434284 4898 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-k6lrz container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:01 crc kubenswrapper[4898]: I0313 15:16:01.434321 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" podUID="c8b0b1cf-022c-4181-a957-2f7e172a3294" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:01 crc kubenswrapper[4898]: I0313 15:16:01.434277 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" podUID="7667c5a1-aecb-4ccd-b8fd-e20c2c049472" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.29:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:01 crc kubenswrapper[4898]: I0313 15:16:01.834479 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="02f7d483-aecb-4a39-babc-6d9598090c4b" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out"
Mar 13 15:16:01 crc kubenswrapper[4898]: I0313 15:16:01.859014 4898 patch_prober.go:28] interesting pod/thanos-querier-7467c7fcf7-hsxhp container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.87:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:01 crc kubenswrapper[4898]: I0313 15:16:01.859072 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" podUID="05b901e7-b9fc-4403-bcc2-8eeb2731c66f" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.87:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:02 crc kubenswrapper[4898]: I0313 15:16:02.255462 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-687f57d79b-cx9pw" podUID="7c1fa9c0-bb2e-4806-95fd-07fba426bdc8" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.45:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:02 crc kubenswrapper[4898]: I0313 15:16:02.255876 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="cert-manager/cert-manager-webhook-687f57d79b-cx9pw" podUID="7c1fa9c0-bb2e-4806-95fd-07fba426bdc8" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.45:6080/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:02 crc kubenswrapper[4898]: I0313 15:16:02.339721 4898 patch_prober.go:28] interesting pod/nmstate-webhook-5f558f5558-m8j8d container/nmstate-webhook namespace/openshift-nmstate: Readiness probe status=failure output="Get \"https://10.217.0.70:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:02 crc kubenswrapper[4898]: I0313 15:16:02.339792 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-webhook-5f558f5558-m8j8d" podUID="a9193e72-6911-4df4-8b26-04b2537f68a9" containerName="nmstate-webhook" probeResult="failure" output="Get \"https://10.217.0.70:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:02 crc kubenswrapper[4898]: I0313 15:16:02.740956 4898 patch_prober.go:28] interesting pod/oauth-openshift-6757584b5b-nct75 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.68:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:02 crc kubenswrapper[4898]: I0313 15:16:02.741394 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" podUID="3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.68:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:02 crc kubenswrapper[4898]: I0313 15:16:02.741075 4898 patch_prober.go:28] interesting pod/oauth-openshift-6757584b5b-nct75 container/oauth-openshift namespace/openshift-authentication: Liveness probe status=failure output="Get \"https://10.217.0.68:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:02 crc kubenswrapper[4898]: I0313 15:16:02.741501 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" podUID="3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.68:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:03 crc kubenswrapper[4898]: I0313 15:16:03.532501 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-c6d797ccf-9qh4r container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:03 crc kubenswrapper[4898]: I0313 15:16:03.533635 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" podUID="077fcbe8-c497-44b4-82f9-ff8e317cbe83" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:03 crc kubenswrapper[4898]: I0313 15:16:03.602295 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/certified-operators-hkbng" podUID="89abe4ad-dd62-4a70-a1d1-fdf97448ada5" containerName="registry-server" probeResult="failure" output=<
Mar 13 15:16:03 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s
Mar 13 15:16:03 crc kubenswrapper[4898]: >
Mar 13 15:16:03 crc kubenswrapper[4898]: I0313 15:16:03.602344 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-marketplace-zs42q" podUID="0182307e-bc7f-415e-a0f9-0eff9902384c" containerName="registry-server" probeResult="failure" output=<
Mar 13 15:16:03 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s
Mar 13 15:16:03 crc kubenswrapper[4898]: >
Mar 13 15:16:03 crc kubenswrapper[4898]: I0313 15:16:03.602478 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-jndb5" podUID="2157e8bf-88a5-4e48-b621-1744dcf0fcdb" containerName="registry-server" probeResult="failure" output=<
Mar 13 15:16:03 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s
Mar 13 15:16:03 crc kubenswrapper[4898]: >
Mar 13 15:16:03 crc kubenswrapper[4898]: I0313 15:16:03.602481 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/community-operators-fcfmz" podUID="2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e" containerName="registry-server" probeResult="failure" output=<
Mar 13 15:16:03 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s
Mar 13 15:16:03 crc kubenswrapper[4898]: >
Mar 13 15:16:03 crc kubenswrapper[4898]: I0313 15:16:03.602518 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/certified-operators-hkbng" podUID="89abe4ad-dd62-4a70-a1d1-fdf97448ada5" containerName="registry-server" probeResult="failure" output=<
Mar 13 15:16:03 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s
Mar 13 15:16:03 crc kubenswrapper[4898]: >
Mar 13 15:16:03 crc kubenswrapper[4898]: I0313 15:16:03.602976 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-zs42q" podUID="0182307e-bc7f-415e-a0f9-0eff9902384c" containerName="registry-server" probeResult="failure" output=<
Mar 13 15:16:03 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s
Mar 13 15:16:03 crc kubenswrapper[4898]: >
Mar 13 15:16:03 crc kubenswrapper[4898]: I0313 15:16:03.605294 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/community-operators-fcfmz" podUID="2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e" containerName="registry-server" probeResult="failure" output=<
Mar 13 15:16:03 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s
Mar 13 15:16:03 crc kubenswrapper[4898]: >
Mar 13 15:16:03 crc kubenswrapper[4898]: I0313 15:16:03.622186 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-c6d797ccf-8ng9x container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:03 crc kubenswrapper[4898]: I0313 15:16:03.622265 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" podUID="13ee53e6-2549-4dd8-91ac-80e4ef2c9d99" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:03 crc kubenswrapper[4898]: I0313 15:16:03.830687 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="1b691eb6-70f2-4fce-b18a-1d7712fddcac" containerName="prometheus" probeResult="failure" output="command timed out"
Mar 13 15:16:03 crc kubenswrapper[4898]: I0313 15:16:03.830712 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="1b691eb6-70f2-4fce-b18a-1d7712fddcac" containerName="prometheus" probeResult="failure" output="command timed out"
Mar 13 15:16:04 crc kubenswrapper[4898]: I0313 15:16:04.002742 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="d555bd54-f4d5-4b06-9517-32b4fe687f4b" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.177:9090/-/healthy\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:04 crc kubenswrapper[4898]: I0313 15:16:04.004040 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="d555bd54-f4d5-4b06-9517-32b4fe687f4b" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.177:9090/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:04 crc kubenswrapper[4898]: I0313 15:16:04.285069 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/barbican-operator-controller-manager-d47688694-gtlps" podUID="45efd8ce-26db-4511-bd88-2e7467d02bbb" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.105:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:04 crc kubenswrapper[4898]: I0313 15:16:04.285195 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-4n5rx" podUID="3c955ebc-98fd-4921-9923-6151a50e8eec" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.107:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:04 crc kubenswrapper[4898]: I0313 15:16:04.361927 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-operators-zgjzn" podUID="cbb51f06-0778-4b18-82b5-c5ce91e0a613" containerName="registry-server" probeResult="failure" output=<
Mar 13 15:16:04 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s
Mar 13 15:16:04 crc kubenswrapper[4898]: >
Mar 13 15:16:04 crc kubenswrapper[4898]: I0313 15:16:04.367130 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-p9d5v" podUID="0d88a5d2-a852-409e-b4bd-939d1c2b9090" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.106:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:04 crc kubenswrapper[4898]: I0313 15:16:04.367189 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mf8h6" podUID="fb7b2f97-fca8-41d2-9be7-d40fac94c171" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.109:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:04 crc kubenswrapper[4898]: I0313 15:16:04.368061 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-operators-zgjzn" podUID="cbb51f06-0778-4b18-82b5-c5ce91e0a613" containerName="registry-server" probeResult="failure" output=<
Mar 13 15:16:04 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s
Mar 13 15:16:04 crc kubenswrapper[4898]: >
Mar 13 15:16:04 crc kubenswrapper[4898]: I0313 15:16:04.449091 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-p9d5v" podUID="0d88a5d2-a852-409e-b4bd-939d1c2b9090" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.106:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:04 crc kubenswrapper[4898]: I0313 15:16:04.449108 4898 patch_prober.go:28] interesting pod/loki-operator-controller-manager-5fb555ff84-j52b8 container/manager namespace/openshift-operators-redhat: Liveness probe status=failure output="Get \"http://10.217.0.50:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:04 crc kubenswrapper[4898]: I0313 15:16:04.449195 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8" podUID="2cd05b5b-32da-4560-a761-72221b99e2c6" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.50:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:04 crc kubenswrapper[4898]: I0313 15:16:04.534093 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/barbican-operator-controller-manager-d47688694-gtlps" podUID="45efd8ce-26db-4511-bd88-2e7467d02bbb" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.105:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:04 crc kubenswrapper[4898]: I0313 15:16:04.534180 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-tqp4b" podUID="ea0ad033-9a48-4e42-a237-f27cacf03adc" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.108:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:04 crc kubenswrapper[4898]: I0313 15:16:04.617117 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl" podUID="a80d01d5-0201-4b2e-974c-ac5b42ac8df4" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:04 crc kubenswrapper[4898]: I0313 15:16:04.617335 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mf8h6" podUID="fb7b2f97-fca8-41d2-9be7-d40fac94c171" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.109:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:04 crc kubenswrapper[4898]: I0313 15:16:04.617374 4898 patch_prober.go:28] interesting pod/loki-operator-controller-manager-5fb555ff84-j52b8 container/manager namespace/openshift-operators-redhat: Readiness probe status=failure output="Get \"http://10.217.0.50:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:04 crc kubenswrapper[4898]: I0313 15:16:04.617400 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8" podUID="2cd05b5b-32da-4560-a761-72221b99e2c6" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.50:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:04 crc kubenswrapper[4898]: I0313 15:16:04.617438 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-4n5rx" podUID="3c955ebc-98fd-4921-9923-6151a50e8eec" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.107:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:04 crc kubenswrapper[4898]: I0313 15:16:04.700211 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-tqp4b" podUID="ea0ad033-9a48-4e42-a237-f27cacf03adc" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.108:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:04 crc kubenswrapper[4898]: I0313 15:16:04.831361 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" podUID="4a0b9ad6-156f-418b-8eae-1d762f8161dd" containerName="ovnkube-controller" probeResult="failure" output="command timed out"
Mar 13 15:16:04 crc kubenswrapper[4898]: I0313 15:16:04.907597 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-z2gd2" podUID="1df4a7d6-b0c2-4b00-b591-1a612bd319b6" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.114:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.031163 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl" podUID="a80d01d5-0201-4b2e-974c-ac5b42ac8df4" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.031171 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-ntlw6" podUID="d71982c0-a3d0-4da8-84cd-7494301f589f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.117:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.031384 4898 patch_prober.go:28] interesting pod/console-699d95d586-ds75f container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.142:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.031530 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc" podUID="ba56f415-73d5-4301-a25d-0e5d1ba4e3b1" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.115:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.031955 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-699d95d586-ds75f" podUID="ab8664f8-1960-4442-9fdd-9711ec963e1f" containerName="console" probeResult="failure" output="Get \"https://10.217.0.142:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.031492 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc" podUID="ba56f415-73d5-4301-a25d-0e5d1ba4e3b1" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.115:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.031574 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-v99bm" podUID="32b5ebfd-38d9-456e-bb21-7332323239d1" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.031653 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-ntlw6" podUID="d71982c0-a3d0-4da8-84cd-7494301f589f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.117:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.031683 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-z2gd2" podUID="1df4a7d6-b0c2-4b00-b591-1a612bd319b6" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.114:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.031550 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-v99bm" podUID="32b5ebfd-38d9-456e-bb21-7332323239d1" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.195154 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-s5zh6" podUID="d24bb749-0b71-456b-80e4-fdf6dd23ba30" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.113:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.195243 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wdmrh" podUID="da3795a7-363f-4637-afe2-77cb77248f9a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/readyz\":
context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.360230 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-s5zh6" podUID="d24bb749-0b71-456b-80e4-fdf6dd23ba30" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.113:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.360240 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-f2t6t" podUID="66a86c31-9ff3-439a-a0f8-96c981014b6f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.122:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.401142 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-s2k96" podUID="9ff6f89a-7110-42fb-96b9-8611f280bebe" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.123:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.524131 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt" podUID="19a0f4de-5258-4f2b-9587-71293459378e" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.124:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.524313 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" podUID="d29ce3ee-3d5a-4801-abf9-dfef5b641a74" 
containerName="manager" probeResult="failure" output="Get \"http://10.217.0.118:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.524366 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-rd22p" podUID="41000ce4-1a84-44de-b283-1fe0350b1c17" containerName="hostpath-provisioner" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.607171 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-jwrd2" podUID="919747b8-a031-4654-999f-3c3928f981b4" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.125:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.607181 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" podUID="d29ce3ee-3d5a-4801-abf9-dfef5b641a74" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.118:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.607211 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wdmrh" podUID="da3795a7-363f-4637-afe2-77cb77248f9a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.607432 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-njsvh" podUID="0d7c657b-a701-41fe-9b23-d5bba3302c4f" 
containerName="manager" probeResult="failure" output="Get \"http://10.217.0.121:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.607632 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-njsvh" podUID="0d7c657b-a701-41fe-9b23-d5bba3302c4f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.121:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.607751 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-f2t6t" podUID="66a86c31-9ff3-439a-a0f8-96c981014b6f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.122:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.607867 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-s2k96" podUID="9ff6f89a-7110-42fb-96b9-8611f280bebe" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.123:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.607943 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt" podUID="19a0f4de-5258-4f2b-9587-71293459378e" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.124:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.607991 4898 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler 
namespace/openshift-kube-scheduler: Liveness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.608602 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:05 crc kubenswrapper[4898]: I0313 15:16:05.608332 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-jwrd2" podUID="919747b8-a031-4654-999f-3c3928f981b4" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.125:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:05.692089 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-init-6b8c6b5df9-kk2gn" podUID="7bae49ab-1146-43a2-b436-69838c923f1a" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.104:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:05.692336 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-controller-init-6b8c6b5df9-kk2gn" podUID="7bae49ab-1146-43a2-b436-69838c923f1a" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.104:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:05.843204 4898 
prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx" podUID="e000d86e-e7a8-49ed-9184-fdd67dfe797d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.95:8080/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.138098 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw" podUID="c35de09d-7f21-47d3-aac5-a26b15b0a496" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.111:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.245178 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw" podUID="34b4f98c-a87c-4a97-9ac4-286afeb9e4bc" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.96:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.327157 4898 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-7zpw2 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.327202 4898 patch_prober.go:28] interesting pod/controller-manager-797d9c85c-m5jdj container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.72:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:06 crc 
kubenswrapper[4898]: I0313 15:16:06.327251 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj" podUID="2ec8c09e-475e-4c4b-86ec-38388754240f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.72:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.327253 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" podUID="9c7e70de-de85-421c-aaeb-476450d8e0ee" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.327297 4898 patch_prober.go:28] interesting pod/controller-manager-797d9c85c-m5jdj container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.72:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.327152 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw" podUID="34b4f98c-a87c-4a97-9ac4-286afeb9e4bc" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.96:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.327315 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj" podUID="2ec8c09e-475e-4c4b-86ec-38388754240f" containerName="controller-manager" probeResult="failure" output="Get 
\"https://10.217.0.72:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.327358 4898 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-7zpw2 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.327374 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" podUID="9c7e70de-de85-421c-aaeb-476450d8e0ee" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.327472 4898 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-4p7wt container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.82:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.327560 4898 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-4p7wt container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.82:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.327556 4898 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4p7wt" podUID="802396a8-633d-4f86-b77b-c25e9c76cc7a" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.82:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.327625 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4p7wt" podUID="802396a8-633d-4f86-b77b-c25e9c76cc7a" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.82:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.403671 4898 patch_prober.go:28] interesting pod/route-controller-manager-7b756f97f-wjsf2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.74:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.403722 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" podUID="0376a3d3-f3a2-4674-a7f9-b06a9e62836e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.74:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.403776 4898 patch_prober.go:28] interesting pod/route-controller-manager-7b756f97f-wjsf2 container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get 
\"https://10.217.0.74:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.403789 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" podUID="0376a3d3-f3a2-4674-a7f9-b06a9e62836e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.74:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.609409 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-rd22p" podUID="41000ce4-1a84-44de-b283-1fe0350b1c17" containerName="hostpath-provisioner" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.834338 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="02f7d483-aecb-4a39-babc-6d9598090c4b" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.850220 4898 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:06 crc kubenswrapper[4898]: I0313 15:16:06.850308 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while 
waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:06.858609 4898 patch_prober.go:28] interesting pod/thanos-querier-7467c7fcf7-hsxhp container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.87:9091/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:06.858655 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" podUID="05b901e7-b9fc-4403-bcc2-8eeb2731c66f" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.87:9091/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:06.858713 4898 patch_prober.go:28] interesting pod/thanos-querier-7467c7fcf7-hsxhp container/kube-rbac-proxy-web namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.87:9091/-/healthy\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:06.858727 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" podUID="05b901e7-b9fc-4403-bcc2-8eeb2731c66f" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.87:9091/-/healthy\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:07.031119 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" podUID="3a26728d-85c2-465c-bce4-c74045ea9e0d" 
containerName="manager" probeResult="failure" output="Get \"http://10.217.0.126:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:07.263048 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-bqmxg" podUID="1a7fcb96-7168-4049-8c28-d3f740599e48" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:07.263132 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p4w5" podUID="604b9c21-3e85-4c2e-9faf-962f44236911" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.98:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:07.263164 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-bqmxg" podUID="1a7fcb96-7168-4049-8c28-d3f740599e48" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:07.387080 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-687f57d79b-cx9pw" podUID="7c1fa9c0-bb2e-4806-95fd-07fba426bdc8" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.45:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:07.387114 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/controller-7bb4cc7c98-cx422" podUID="b231c7db-5056-4ec6-a64c-0aa8bdff336b" containerName="controller" probeResult="failure" 
output="Get \"http://10.217.0.99:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:07.387156 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-bqmxg" podUID="1a7fcb96-7168-4049-8c28-d3f740599e48" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:07.387124 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/controller-7bb4cc7c98-cx422" podUID="b231c7db-5056-4ec6-a64c-0aa8bdff336b" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.99:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:07.387095 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p4w5" podUID="604b9c21-3e85-4c2e-9faf-962f44236911" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.98:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:07.455825 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="metallb-system/frr-k8s-bqmxg" Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:07.520781 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="frr" containerStatusID={"Type":"cri-o","ID":"bea18a0b4bb5bbad650ae6e8e67ba0820f4376de6521e05324b889bd9cd86809"} pod="metallb-system/frr-k8s-bqmxg" containerMessage="Container frr failed liveness probe, will be restarted" Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:07.522214 4898 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="metallb-system/frr-k8s-bqmxg" podUID="1a7fcb96-7168-4049-8c28-d3f740599e48" containerName="frr" containerID="cri-o://bea18a0b4bb5bbad650ae6e8e67ba0820f4376de6521e05324b889bd9cd86809" gracePeriod=2 Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:07.607049 4898 patch_prober.go:28] interesting pod/logging-loki-distributor-9c6b6d984-vvj56 container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.52:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:07.607098 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" podUID="510657b4-32e2-4fa5-9c09-17869a295736" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.52:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:07.611975 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-rd22p" podUID="41000ce4-1a84-44de-b283-1fe0350b1c17" containerName="hostpath-provisioner" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:07.738309 4898 patch_prober.go:28] interesting pod/logging-loki-querier-6dcbdf8bb8-qr6bw container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:07.738420 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" podUID="5e81d88f-c63b-4f0c-ba17-f1171350c28d" 
containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:07.808842 4898 patch_prober.go:28] interesting pod/logging-loki-query-frontend-ff66c4dc9-mwqzz container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:07.808912 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" podUID="e519fed6-a687-4a01-a979-598e81122ad1" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:07 crc kubenswrapper[4898]: I0313 15:16:07.830745 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-handler-fpgr7" podUID="e4761153-ed4e-4264-8f21-b4de31a4bbb8" containerName="nmstate-handler" probeResult="failure" output="command timed out" Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.531677 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-c6d797ccf-9qh4r container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.532062 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" 
podUID="077fcbe8-c497-44b4-82f9-ff8e317cbe83" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.531742 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-c6d797ccf-9qh4r container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.532427 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" podUID="077fcbe8-c497-44b4-82f9-ff8e317cbe83" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.55:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.607574 4898 patch_prober.go:28] interesting pod/logging-loki-distributor-9c6b6d984-vvj56 container/loki-distributor namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.52:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.607724 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" podUID="510657b4-32e2-4fa5-9c09-17869a295736" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.52:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.621864 4898 patch_prober.go:28] interesting 
pod/logging-loki-gateway-c6d797ccf-8ng9x container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8081/ready\": context deadline exceeded" start-of-body= Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.621951 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" podUID="13ee53e6-2549-4dd8-91ac-80e4ef2c9d99" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.56:8081/ready\": context deadline exceeded" Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.622227 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-c6d797ccf-8ng9x container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.622313 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" podUID="13ee53e6-2549-4dd8-91ac-80e4ef2c9d99" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.726030 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/speaker-g5gqr" podUID="edfd91ee-1246-43b2-84a0-95ea069de402" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.726043 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/speaker-g5gqr" podUID="edfd91ee-1246-43b2-84a0-95ea069de402" containerName="speaker" probeResult="failure" output="Get 
\"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.738515 4898 patch_prober.go:28] interesting pod/logging-loki-querier-6dcbdf8bb8-qr6bw container/loki-querier namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.53:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.738570 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" podUID="5e81d88f-c63b-4f0c-ba17-f1171350c28d" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.53:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.808257 4898 patch_prober.go:28] interesting pod/logging-loki-query-frontend-ff66c4dc9-mwqzz container/loki-query-frontend namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.808351 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" podUID="e519fed6-a687-4a01-a979-598e81122ad1" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.835110 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="1b691eb6-70f2-4fce-b18a-1d7712fddcac" 
containerName="prometheus" probeResult="failure" output="command timed out" Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.835390 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" podUID="4a0b9ad6-156f-418b-8eae-1d762f8161dd" containerName="sbdb" probeResult="failure" output="command timed out" Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.835443 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="1b691eb6-70f2-4fce-b18a-1d7712fddcac" containerName="prometheus" probeResult="failure" output="command timed out" Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.835479 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-l82wk" podUID="4a0b9ad6-156f-418b-8eae-1d762f8161dd" containerName="nbdb" probeResult="failure" output="command timed out" Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.942966 4898 patch_prober.go:28] interesting pod/logging-loki-index-gateway-0 container/loki-index-gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.60:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.943046 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-index-gateway-0" podUID="6a1df267-1145-4fe1-9455-57df3d043e3a" containerName="loki-index-gateway" probeResult="failure" output="Get \"https://10.217.0.60:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.992858 4898 trace.go:236] Trace[1350076615]: "Calculate volume metrics of catalog-content for pod openshift-marketplace/redhat-operators-dnh48" (13-Mar-2026 15:16:07.273) (total time: 1713ms): Mar 13 15:16:08 crc kubenswrapper[4898]: 
Trace[1350076615]: [1.71358368s] [1.71358368s] END Mar 13 15:16:08 crc kubenswrapper[4898]: I0313 15:16:08.993819 4898 trace.go:236] Trace[1179237134]: "Calculate volume metrics of catalog-content for pod openshift-marketplace/community-operators-jndb5" (13-Mar-2026 15:16:07.266) (total time: 1711ms): Mar 13 15:16:08 crc kubenswrapper[4898]: Trace[1179237134]: [1.711972973s] [1.711972973s] END Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 15:16:09.003122 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="d555bd54-f4d5-4b06-9517-32b4fe687f4b" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.177:9090/-/healthy\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 15:16:09.003141 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="d555bd54-f4d5-4b06-9517-32b4fe687f4b" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.177:9090/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 15:16:09.033294 4898 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 15:16:09.033365 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="2194d847-4858-4f46-ab8b-c2d78cf5677e" containerName="loki-ingester" probeResult="failure" output="Get \"https://10.217.0.57:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:09 crc 
kubenswrapper[4898]: I0313 15:16:09.089419 4898 patch_prober.go:28] interesting pod/logging-loki-compactor-0 container/loki-compactor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.58:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 15:16:09.089503 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-compactor-0" podUID="9c5fee8d-2246-4e34-8ddd-ce710e155d73" containerName="loki-compactor" probeResult="failure" output="Get \"https://10.217.0.58:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 15:16:09.197034 4898 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-7zpw2 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 15:16:09.197203 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" podUID="9c7e70de-de85-421c-aaeb-476450d8e0ee" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 15:16:09.197084 4898 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-7zpw2 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 
15:16:09.197512 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" podUID="9c7e70de-de85-421c-aaeb-476450d8e0ee" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 15:16:09.345346 4898 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-z7ng7 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.75:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 15:16:09.345440 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-z7ng7" podUID="b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.75:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 15:16:09.404118 4898 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-z7ng7 container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.75:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 15:16:09.404374 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-z7ng7" podUID="b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.75:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:09 crc 
kubenswrapper[4898]: I0313 15:16:09.532338 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-c6d797ccf-9qh4r container/opa namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.55:8083/live\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 15:16:09.532402 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" podUID="077fcbe8-c497-44b4-82f9-ff8e317cbe83" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.55:8083/live\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 15:16:09.532486 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-c6d797ccf-9qh4r container/gateway namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.55:8081/live\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 15:16:09.532507 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" podUID="077fcbe8-c497-44b4-82f9-ff8e317cbe83" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.55:8081/live\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 15:16:09.622347 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-c6d797ccf-8ng9x container/gateway namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.56:8081/live\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 15:16:09.622450 4898 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" podUID="13ee53e6-2549-4dd8-91ac-80e4ef2c9d99" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.56:8081/live\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 15:16:09.622358 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-c6d797ccf-8ng9x container/opa namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.56:8083/live\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 15:16:09.622554 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" podUID="13ee53e6-2549-4dd8-91ac-80e4ef2c9d99" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/live\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 15:16:09.830994 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="e5d53cf3-113e-4391-b3a9-4e1f81836e26" containerName="galera" probeResult="failure" output="command timed out" Mar 13 15:16:09 crc kubenswrapper[4898]: I0313 15:16:09.831108 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="e5d53cf3-113e-4391-b3a9-4e1f81836e26" containerName="galera" probeResult="failure" output="command timed out" Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:09.986697 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bqmxg" event={"ID":"1a7fcb96-7168-4049-8c28-d3f740599e48","Type":"ContainerDied","Data":"bea18a0b4bb5bbad650ae6e8e67ba0820f4376de6521e05324b889bd9cd86809"} Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:09.989514 4898 generic.go:334] "Generic (PLEG): container finished" 
podID="1a7fcb96-7168-4049-8c28-d3f740599e48" containerID="bea18a0b4bb5bbad650ae6e8e67ba0820f4376de6521e05324b889bd9cd86809" exitCode=143 Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.082663 4898 patch_prober.go:28] interesting pod/metrics-server-7b77fdd7dd-vwwfr container/metrics-server namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.89:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.082728 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" podUID="823ccfb8-89eb-409e-9c6c-579bacb35ea1" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.89:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.083845 4898 patch_prober.go:28] interesting pod/metrics-server-7b77fdd7dd-vwwfr container/metrics-server namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.89:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.083907 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" podUID="823ccfb8-89eb-409e-9c6c-579bacb35ea1" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.89:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.152062 4898 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-g559r container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe 
status=failure output="Get \"https://10.217.0.22:8443/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.152138 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" podUID="1a01ab05-7178-48c7-892b-b91cf60432f8" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.22:8443/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.152614 4898 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-g559r container/oauth-apiserver namespace/openshift-oauth-apiserver: Liveness probe status=failure output="Get \"https://10.217.0.22:8443/livez?exclude=etcd\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.152671 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g559r" podUID="1a01ab05-7178-48c7-892b-b91cf60432f8" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.22:8443/livez?exclude=etcd\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.243149 4898 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-ljrtz container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.16:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.243179 4898 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-ljrtz container/operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.16:8081/healthz\": context 
deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.243217 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" podUID="3bfc0332-bb59-42bf-bb70-462efa225c81" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.16:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.243241 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" podUID="3bfc0332-bb59-42bf-bb70-462efa225c81" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.16:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.382101 4898 patch_prober.go:28] interesting pod/perses-operator-5bf474d74f-nkt76 container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.28:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.382496 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-5bf474d74f-nkt76" podUID="79ead8ee-67ba-4831-b5d4-a1f128e94334" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.28:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.453703 4898 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-v9lxv container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled while waiting 
for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.453781 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" podUID="6f12557e-02f5-4445-988f-b19f16672e3b" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.570126 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" podUID="0ab852e1-fd26-4f76-b758-77896f8e236b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.693501 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.693547 4898 patch_prober.go:28] interesting pod/monitoring-plugin-595dc77696-pft4c container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.90:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.693575 4898 patch_prober.go:28] interesting pod/console-operator-58897d9998-kgnxj container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get 
\"https://10.217.0.13:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.693590 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-595dc77696-pft4c" podUID="10c7ab08-2341-4e85-ad67-8495e038afa2" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.90:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.693631 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-kgnxj" podUID="eedd2260-f339-4e2f-83e8-13a56cee2ce6" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.693565 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.693553 4898 patch_prober.go:28] interesting pod/console-operator-58897d9998-kgnxj container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.693693 4898 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-console-operator/console-operator-58897d9998-kgnxj" podUID="eedd2260-f339-4e2f-83e8-13a56cee2ce6" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.693743 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.693758 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.726102 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-595dc77696-pft4c" Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.846576 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f" containerName="galera" probeResult="failure" output="command timed out" Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.846762 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-index-9k7p6" podUID="478795f5-c2f6-4e9b-9ed6-e2c743c3f3b8" containerName="registry-server" probeResult="failure" output="command timed out" Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.847268 4898 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openstack/openstack-cell1-galera-0" podUID="6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f" containerName="galera" probeResult="failure" output="command timed out" Mar 13 15:16:10 crc kubenswrapper[4898]: I0313 15:16:10.850343 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-index-9k7p6" podUID="478795f5-c2f6-4e9b-9ed6-e2c743c3f3b8" containerName="registry-server" probeResult="failure" output="command timed out" Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.182135 4898 patch_prober.go:28] interesting pod/downloads-7954f5f757-cx59b container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.10:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.182147 4898 patch_prober.go:28] interesting pod/downloads-7954f5f757-cx59b container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.182473 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-cx59b" podUID="f4f26c0f-992a-4eb4-86d2-58e42a5b2b68" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.182521 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-cx59b" podUID="f4f26c0f-992a-4eb4-86d2-58e42a5b2b68" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 
15:16:11.403829 4898 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-qtdtw container/catalog-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.403888 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" podUID="7d27543e-df10-41f7-be85-dfe319aaec8a" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.403958 4898 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-qtdtw container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.403972 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" podUID="7d27543e-df10-41f7-be85-dfe319aaec8a" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.433430 4898 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-x85vd container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:5443/healthz\": net/http: request 
canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.433627 4898 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-k6lrz container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.433685 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" podUID="c8b0b1cf-022c-4181-a957-2f7e172a3294" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.433627 4898 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-k6lrz container/olm-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.433731 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" podUID="c8b0b1cf-022c-4181-a957-2f7e172a3294" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.433634 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" podUID="7667c5a1-aecb-4ccd-b8fd-e20c2c049472" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.29:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.433538 4898 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-x85vd container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.29:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.433948 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" podUID="7667c5a1-aecb-4ccd-b8fd-e20c2c049472" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.29:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.459727 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd"
Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.459797 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd"
Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.477214 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="packageserver" containerStatusID={"Type":"cri-o","ID":"6ce16b962cdf6095fe075410dd0e3e4b0df623a44b72871189f0b1d5d8146085"} pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" containerMessage="Container packageserver failed liveness probe, will be restarted"
Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.489820 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" podUID="7667c5a1-aecb-4ccd-b8fd-e20c2c049472" containerName="packageserver" containerID="cri-o://6ce16b962cdf6095fe075410dd0e3e4b0df623a44b72871189f0b1d5d8146085" gracePeriod=30
Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.768096 4898 patch_prober.go:28] interesting pod/monitoring-plugin-595dc77696-pft4c container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.90:9443/health\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.768458 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-595dc77696-pft4c" podUID="10c7ab08-2341-4e85-ad67-8495e038afa2" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.90:9443/health\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.809066 4898 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-x4nxn container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.809131 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x4nxn" podUID="8675f0f0-7d3b-41d9-959e-e73f78f32c5c" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.809181 4898 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-x4nxn container/package-server-manager namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.809250 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x4nxn" podUID="8675f0f0-7d3b-41d9-959e-e73f78f32c5c" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.825808 4898 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.825406021s: [/var/lib/containers/storage/overlay/a1b358f0312b11061f478e8cafc04d37350c0ce47e98958a390f7017efbcafab/diff /var/log/pods/openstack_openstackclient_124bd4ee-d9f0-408f-a46e-4d143e8ab02a/openstackclient/0.log]; will not log again for this container unless duration exceeds 2s
Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.835485 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="02f7d483-aecb-4a39-babc-6d9598090c4b" containerName="ceilometer-notification-agent" probeResult="failure" output="command timed out"
Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.858227 4898 patch_prober.go:28] interesting pod/thanos-querier-7467c7fcf7-hsxhp container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.87:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:11 crc kubenswrapper[4898]: I0313 15:16:11.858299 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/thanos-querier-7467c7fcf7-hsxhp" podUID="05b901e7-b9fc-4403-bcc2-8eeb2731c66f" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.87:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:12 crc kubenswrapper[4898]: I0313 15:16:12.212043 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="cert-manager/cert-manager-webhook-687f57d79b-cx9pw" podUID="7c1fa9c0-bb2e-4806-95fd-07fba426bdc8" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.45:6080/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:12 crc kubenswrapper[4898]: I0313 15:16:12.253156 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-687f57d79b-cx9pw" podUID="7c1fa9c0-bb2e-4806-95fd-07fba426bdc8" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.45:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:12 crc kubenswrapper[4898]: I0313 15:16:12.253288 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-cx9pw"
Mar 13 15:16:12 crc kubenswrapper[4898]: I0313 15:16:12.340464 4898 patch_prober.go:28] interesting pod/nmstate-webhook-5f558f5558-m8j8d container/nmstate-webhook namespace/openshift-nmstate: Readiness probe status=failure output="Get \"https://10.217.0.70:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:12 crc kubenswrapper[4898]: I0313 15:16:12.340754 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-webhook-5f558f5558-m8j8d" podUID="a9193e72-6911-4df4-8b26-04b2537f68a9" containerName="nmstate-webhook" probeResult="failure" output="Get \"https://10.217.0.70:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:12 crc kubenswrapper[4898]: I0313 15:16:12.461245 4898 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-x85vd container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:12 crc kubenswrapper[4898]: I0313 15:16:12.461306 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" podUID="7667c5a1-aecb-4ccd-b8fd-e20c2c049472" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.29:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:12 crc kubenswrapper[4898]: I0313 15:16:12.668067 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-cx9pw"
Mar 13 15:16:12 crc kubenswrapper[4898]: I0313 15:16:12.743397 4898 patch_prober.go:28] interesting pod/oauth-openshift-6757584b5b-nct75 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.68:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:12 crc kubenswrapper[4898]: I0313 15:16:12.743558 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" podUID="3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.68:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:12 crc kubenswrapper[4898]: I0313 15:16:12.743409 4898 patch_prober.go:28] interesting pod/oauth-openshift-6757584b5b-nct75 container/oauth-openshift namespace/openshift-authentication: Liveness probe status=failure output="Get \"https://10.217.0.68:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:12 crc kubenswrapper[4898]: I0313 15:16:12.743683 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication/oauth-openshift-6757584b5b-nct75" podUID="3e855c35-d5e2-4a5a-9e1a-ccae6aebd7d3" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.68:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:12 crc kubenswrapper[4898]: I0313 15:16:12.888950 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-handler-fpgr7" podUID="e4761153-ed4e-4264-8f21-b4de31a4bbb8" containerName="nmstate-handler" probeResult="failure" output="command timed out"
Mar 13 15:16:12 crc kubenswrapper[4898]: I0313 15:16:12.889250 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="02f7d483-aecb-4a39-babc-6d9598090c4b" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out"
Mar 13 15:16:12 crc kubenswrapper[4898]: I0313 15:16:12.889324 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ceilometer-0"
Mar 13 15:16:12 crc kubenswrapper[4898]: I0313 15:16:12.890859 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="ceilometer-central-agent" containerStatusID={"Type":"cri-o","ID":"7ce3438fd9d4e3db5da0a65bc2744dcbc537f4c6c98b27b1433e8ed964fd1ed3"} pod="openstack/ceilometer-0" containerMessage="Container ceilometer-central-agent failed liveness probe, will be restarted"
Mar 13 15:16:12 crc kubenswrapper[4898]: I0313 15:16:12.890958 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="02f7d483-aecb-4a39-babc-6d9598090c4b" containerName="ceilometer-central-agent" containerID="cri-o://7ce3438fd9d4e3db5da0a65bc2744dcbc537f4c6c98b27b1433e8ed964fd1ed3" gracePeriod=30
Mar 13 15:16:13 crc kubenswrapper[4898]: I0313 15:16:13.531756 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-c6d797ccf-9qh4r container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:13 crc kubenswrapper[4898]: I0313 15:16:13.531831 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" podUID="077fcbe8-c497-44b4-82f9-ff8e317cbe83" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:13 crc kubenswrapper[4898]: I0313 15:16:13.621307 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-c6d797ccf-8ng9x container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:13 crc kubenswrapper[4898]: I0313 15:16:13.621571 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" podUID="13ee53e6-2549-4dd8-91ac-80e4ef2c9d99" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:13 crc kubenswrapper[4898]: I0313 15:16:13.832020 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="1b691eb6-70f2-4fce-b18a-1d7712fddcac" containerName="prometheus" probeResult="failure" output="command timed out"
Mar 13 15:16:13 crc kubenswrapper[4898]: I0313 15:16:13.832170 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="1b691eb6-70f2-4fce-b18a-1d7712fddcac" containerName="prometheus" probeResult="failure" output="command timed out"
Mar 13 15:16:13 crc kubenswrapper[4898]: I0313 15:16:13.832716 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 15:16:13 crc kubenswrapper[4898]: E0313 15:16:13.978770 4898 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:16:03Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:16:03Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:16:03Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T15:16:03Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.021531 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="d555bd54-f4d5-4b06-9517-32b4fe687f4b" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.177:9090/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.021656 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0"
Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.021535 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="d555bd54-f4d5-4b06-9517-32b4fe687f4b" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.177:9090/-/healthy\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.099796 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-bqmxg" event={"ID":"1a7fcb96-7168-4049-8c28-d3f740599e48","Type":"ContainerStarted","Data":"dff248ea143392c1ca27792bdf6dcdbd888bce77e921d4a7a8ca6f0d73c34304"}
Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.158075 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-4n5rx" podUID="3c955ebc-98fd-4921-9923-6151a50e8eec" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.107:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.158075 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-p9d5v" podUID="0d88a5d2-a852-409e-b4bd-939d1c2b9090" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.106:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.200096 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/barbican-operator-controller-manager-d47688694-gtlps" podUID="45efd8ce-26db-4511-bd88-2e7467d02bbb" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.105:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.200154 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-mf8h6" podUID="fb7b2f97-fca8-41d2-9be7-d40fac94c171" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.109:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.241142 4898 patch_prober.go:28] interesting pod/loki-operator-controller-manager-5fb555ff84-j52b8 container/manager namespace/openshift-operators-redhat: Readiness probe status=failure output="Get \"http://10.217.0.50:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.241402 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators-redhat/loki-operator-controller-manager-5fb555ff84-j52b8" podUID="2cd05b5b-32da-4560-a761-72221b99e2c6" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.50:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.283363 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-tqp4b" podUID="ea0ad033-9a48-4e42-a237-f27cacf03adc" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.108:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.338116 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl" podUID="a80d01d5-0201-4b2e-974c-ac5b42ac8df4" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.385869 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0"
Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.502206 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc" podUID="ba56f415-73d5-4301-a25d-0e5d1ba4e3b1" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.115:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.543217 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-v99bm" podUID="32b5ebfd-38d9-456e-bb21-7332323239d1" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.584115 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-ntlw6" podUID="d71982c0-a3d0-4da8-84cd-7494301f589f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.117:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.709100 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/nova-operator-controller-manager-7f84474648-mr4wv" podUID="52959483-daae-423a-a3bf-8e3fa7810074" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.709449 4898 patch_prober.go:28] interesting pod/console-699d95d586-ds75f container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.142:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.709484 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-s5zh6" podUID="d24bb749-0b71-456b-80e4-fdf6dd23ba30" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.113:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.709522 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-z2gd2" podUID="1df4a7d6-b0c2-4b00-b591-1a612bd319b6" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.114:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.709518 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-699d95d586-ds75f" podUID="ab8664f8-1960-4442-9fdd-9711ec963e1f" containerName="console" probeResult="failure" output="Get \"https://10.217.0.142:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.770455 4898 trace.go:236] Trace[1749974672]: "Calculate volume metrics of persistence for pod openstack/rabbitmq-server-0" (13-Mar-2026 15:16:08.509) (total time: 6252ms):
Mar 13 15:16:14 crc kubenswrapper[4898]: Trace[1749974672]: [6.252274294s] [6.252274294s] END
Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.770489 4898 trace.go:236] Trace[718665416]: "Calculate volume metrics of mysql-db for pod openstack/openstack-galera-0" (13-Mar-2026 15:16:11.572) (total time: 3192ms):
Mar 13 15:16:14 crc kubenswrapper[4898]: Trace[718665416]: [3.192789928s] [3.192789928s] END
Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.773271 4898 trace.go:236] Trace[1920003612]: "Calculate volume metrics of swift for pod openstack/swift-storage-0" (13-Mar-2026 15:16:11.841) (total time: 2924ms):
Mar 13 15:16:14 crc kubenswrapper[4898]: Trace[1920003612]: [2.924072815s] [2.924072815s] END
Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.773920 4898 trace.go:236] Trace[575714101]: "Calculate volume metrics of glance for pod openstack/glance-default-internal-api-0" (13-Mar-2026 15:16:10.652) (total time: 4108ms):
Mar 13 15:16:14 crc kubenswrapper[4898]: Trace[575714101]: [4.108760465s] [4.108760465s] END
Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.912947 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="b7452a36-0169-4cfe-9ede-ef4d0ef072d9" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.1.23:8081/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:14 crc kubenswrapper[4898]: I0313 15:16:14.913275 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/kube-state-metrics-0" podUID="b7452a36-0169-4cfe-9ede-ef4d0ef072d9" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.1.23:8080/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.059201 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-njsvh" podUID="0d7c657b-a701-41fe-9b23-d5bba3302c4f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.121:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.100169 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" podUID="d29ce3ee-3d5a-4801-abf9-dfef5b641a74" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.118:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.100181 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-f2t6t" podUID="66a86c31-9ff3-439a-a0f8-96c981014b6f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.122:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.142126 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-wdmrh" podUID="da3795a7-363f-4637-afe2-77cb77248f9a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.142146 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/telemetry-operator-controller-manager-5b9fbd87f-s2k96" podUID="9ff6f89a-7110-42fb-96b9-8611f280bebe" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.123:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.224175 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-jwrd2" podUID="919747b8-a031-4654-999f-3c3928f981b4" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.125:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.224660 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt" podUID="19a0f4de-5258-4f2b-9587-71293459378e" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.124:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.308088 4898 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-7zpw2 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.308166 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" podUID="9c7e70de-de85-421c-aaeb-476450d8e0ee" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.309230 4898 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-7zpw2 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.309276 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7zpw2" podUID="9c7e70de-de85-421c-aaeb-476450d8e0ee" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.461385 4898 patch_prober.go:28] interesting pod/image-registry-66df7c8f76-7zcdz container/registry namespace/openshift-image-registry: Readiness probe status=failure output="Get \"https://10.217.0.73:5000/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.461450 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz" podUID="ba480ebb-f079-4888-857b-d917e4a9b13b" containerName="registry" probeResult="failure" output="Get \"https://10.217.0.73:5000/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.462609 4898 patch_prober.go:28] interesting pod/image-registry-66df7c8f76-7zcdz container/registry namespace/openshift-image-registry: Liveness probe status=failure output="Get \"https://10.217.0.73:5000/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.462641 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-66df7c8f76-7zcdz" podUID="ba480ebb-f079-4888-857b-d917e4a9b13b" containerName="registry" probeResult="failure" output="Get \"https://10.217.0.73:5000/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.489969 4898 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Liveness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.490030 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.647152 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-init-6b8c6b5df9-kk2gn" podUID="7bae49ab-1146-43a2-b436-69838c923f1a" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.104:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.836173 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-marketplace-zs42q" podUID="0182307e-bc7f-415e-a0f9-0eff9902384c" containerName="registry-server" probeResult="failure" output="command timed out"
Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.836201 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/certified-operators-hkbng" podUID="89abe4ad-dd62-4a70-a1d1-fdf97448ada5" containerName="registry-server" probeResult="failure" output="command timed out"
Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.836206 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/community-operators-fcfmz" podUID="2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e" containerName="registry-server" probeResult="failure" output="command timed out"
Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.836235 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/community-operators-fcfmz" podUID="2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e" containerName="registry-server" probeResult="failure" output="command timed out"
Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.836339 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-jndb5" podUID="2157e8bf-88a5-4e48-b621-1744dcf0fcdb" containerName="registry-server" probeResult="failure" output="command timed out"
Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.836458 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/certified-operators-hkbng" podUID="89abe4ad-dd62-4a70-a1d1-fdf97448ada5" containerName="registry-server" probeResult="failure" output="command timed out"
Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.836726 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-zs42q" podUID="0182307e-bc7f-415e-a0f9-0eff9902384c" containerName="registry-server" probeResult="failure" output="command timed out"
Mar 13 15:16:15 crc kubenswrapper[4898]: I0313 15:16:15.843067 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx" podUID="e000d86e-e7a8-49ed-9184-fdd67dfe797d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.95:8080/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.057466 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-bqmxg"
Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.122263 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" event={"ID":"7667c5a1-aecb-4ccd-b8fd-e20c2c049472","Type":"ContainerDied","Data":"6ce16b962cdf6095fe075410dd0e3e4b0df623a44b72871189f0b1d5d8146085"}
Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.122678 4898 generic.go:334] "Generic (PLEG): container finished" podID="7667c5a1-aecb-4ccd-b8fd-e20c2c049472" containerID="6ce16b962cdf6095fe075410dd0e3e4b0df623a44b72871189f0b1d5d8146085" exitCode=0
Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.178142 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw" podUID="c35de09d-7f21-47d3-aac5-a26b15b0a496" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.111:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.219090 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw" podUID="34b4f98c-a87c-4a97-9ac4-286afeb9e4bc" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.96:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.262310 4898 prober.go:107] "Probe failed" probeType="Readiness"
pod="metallb-system/metallb-operator-webhook-server-67c6f6c5cb-d26qw" podUID="34b4f98c-a87c-4a97-9ac4-286afeb9e4bc" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.96:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.262713 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-8kcsw" podUID="c35de09d-7f21-47d3-aac5-a26b15b0a496" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.111:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.263194 4898 patch_prober.go:28] interesting pod/controller-manager-797d9c85c-m5jdj container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.72:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.263268 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj" podUID="2ec8c09e-475e-4c4b-86ec-38388754240f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.72:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.270456 4898 patch_prober.go:28] interesting pod/controller-manager-797d9c85c-m5jdj container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.72:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:16 crc 
kubenswrapper[4898]: I0313 15:16:16.270548 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-797d9c85c-m5jdj" podUID="2ec8c09e-475e-4c4b-86ec-38388754240f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.72:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.316758 4898 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-4p7wt container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.82:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.317308 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4p7wt" podUID="802396a8-633d-4f86-b77b-c25e9c76cc7a" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.82:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.316798 4898 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-4p7wt container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.82:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.317554 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4p7wt" 
podUID="802396a8-633d-4f86-b77b-c25e9c76cc7a" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.82:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.405036 4898 patch_prober.go:28] interesting pod/route-controller-manager-7b756f97f-wjsf2 container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.74:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.405100 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" podUID="0376a3d3-f3a2-4674-a7f9-b06a9e62836e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.74:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.405147 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.405140 4898 patch_prober.go:28] interesting pod/route-controller-manager-7b756f97f-wjsf2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.74:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.405204 4898 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" podUID="0376a3d3-f3a2-4674-a7f9-b06a9e62836e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.74:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.406564 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="route-controller-manager" containerStatusID={"Type":"cri-o","ID":"0292f4db3076b00a4ff8a067c3c4184374170598a35c90418442c7305e929d8e"} pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" containerMessage="Container route-controller-manager failed liveness probe, will be restarted" Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.406617 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" podUID="0376a3d3-f3a2-4674-a7f9-b06a9e62836e" containerName="route-controller-manager" containerID="cri-o://0292f4db3076b00a4ff8a067c3c4184374170598a35c90418442c7305e929d8e" gracePeriod=30 Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.849860 4898 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:16 crc kubenswrapper[4898]: I0313 15:16:16.849998 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection 
(Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:17 crc kubenswrapper[4898]: I0313 15:16:17.072067 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" podUID="3a26728d-85c2-465c-bce4-c74045ea9e0d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.126:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:17 crc kubenswrapper[4898]: I0313 15:16:17.113046 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-bqmxg" podUID="1a7fcb96-7168-4049-8c28-d3f740599e48" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:17 crc kubenswrapper[4898]: I0313 15:16:17.195099 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-controller-manager-5f7dc44db6-9nsrh" podUID="3a26728d-85c2-465c-bce4-c74045ea9e0d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.126:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:17 crc kubenswrapper[4898]: I0313 15:16:17.195190 4898 prober.go:107] "Probe failed" probeType="Startup" pod="metallb-system/frr-k8s-bqmxg" podUID="1a7fcb96-7168-4049-8c28-d3f740599e48" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:17 crc kubenswrapper[4898]: I0313 15:16:17.277118 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-bqmxg" podUID="1a7fcb96-7168-4049-8c28-d3f740599e48" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 
15:16:17 crc kubenswrapper[4898]: I0313 15:16:17.277140 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p4w5" podUID="604b9c21-3e85-4c2e-9faf-962f44236911" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.98:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:17 crc kubenswrapper[4898]: I0313 15:16:17.359543 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-5p4w5" podUID="604b9c21-3e85-4c2e-9faf-962f44236911" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.98:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:17 crc kubenswrapper[4898]: I0313 15:16:17.359631 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/controller-7bb4cc7c98-cx422" podUID="b231c7db-5056-4ec6-a64c-0aa8bdff336b" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.99:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:17 crc kubenswrapper[4898]: I0313 15:16:17.359994 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/controller-7bb4cc7c98-cx422" podUID="b231c7db-5056-4ec6-a64c-0aa8bdff336b" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.99:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:17 crc kubenswrapper[4898]: I0313 15:16:17.607577 4898 patch_prober.go:28] interesting pod/logging-loki-distributor-9c6b6d984-vvj56 container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.52:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 
13 15:16:17 crc kubenswrapper[4898]: I0313 15:16:17.607633 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-9c6b6d984-vvj56" podUID="510657b4-32e2-4fa5-9c09-17869a295736" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.52:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:17 crc kubenswrapper[4898]: I0313 15:16:17.738235 4898 patch_prober.go:28] interesting pod/logging-loki-querier-6dcbdf8bb8-qr6bw container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:17 crc kubenswrapper[4898]: I0313 15:16:17.738339 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-qr6bw" podUID="5e81d88f-c63b-4f0c-ba17-f1171350c28d" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:17 crc kubenswrapper[4898]: I0313 15:16:17.807273 4898 patch_prober.go:28] interesting pod/logging-loki-query-frontend-ff66c4dc9-mwqzz container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:17 crc kubenswrapper[4898]: I0313 15:16:17.807342 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-mwqzz" podUID="e519fed6-a687-4a01-a979-598e81122ad1" containerName="loki-query-frontend" probeResult="failure" output="Get 
\"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:17 crc kubenswrapper[4898]: I0313 15:16:17.830919 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="1b691eb6-70f2-4fce-b18a-1d7712fddcac" containerName="prometheus" probeResult="failure" output="command timed out" Mar 13 15:16:17 crc kubenswrapper[4898]: I0313 15:16:17.834228 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="1b691eb6-70f2-4fce-b18a-1d7712fddcac" containerName="prometheus" probeResult="failure" output="command timed out" Mar 13 15:16:17 crc kubenswrapper[4898]: I0313 15:16:17.835998 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-operators-zgjzn" podUID="cbb51f06-0778-4b18-82b5-c5ce91e0a613" containerName="registry-server" probeResult="failure" output="command timed out" Mar 13 15:16:17 crc kubenswrapper[4898]: I0313 15:16:17.838299 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-operators-zgjzn" podUID="cbb51f06-0778-4b18-82b5-c5ce91e0a613" containerName="registry-server" probeResult="failure" output="command timed out" Mar 13 15:16:18 crc kubenswrapper[4898]: I0313 15:16:18.532615 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-c6d797ccf-9qh4r container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:18 crc kubenswrapper[4898]: I0313 15:16:18.532664 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-c6d797ccf-9qh4r container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled while waiting for 
connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:18 crc kubenswrapper[4898]: I0313 15:16:18.535232 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" podUID="077fcbe8-c497-44b4-82f9-ff8e317cbe83" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:18 crc kubenswrapper[4898]: I0313 15:16:18.535106 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-c6d797ccf-9qh4r" podUID="077fcbe8-c497-44b4-82f9-ff8e317cbe83" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.55:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:18 crc kubenswrapper[4898]: I0313 15:16:18.622439 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-c6d797ccf-8ng9x container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:18 crc kubenswrapper[4898]: I0313 15:16:18.623334 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" podUID="13ee53e6-2549-4dd8-91ac-80e4ef2c9d99" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:18 crc kubenswrapper[4898]: I0313 15:16:18.623402 4898 patch_prober.go:28] interesting pod/logging-loki-gateway-c6d797ccf-8ng9x container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled 
(Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:18 crc kubenswrapper[4898]: I0313 15:16:18.623417 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-c6d797ccf-8ng9x" podUID="13ee53e6-2549-4dd8-91ac-80e4ef2c9d99" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:18 crc kubenswrapper[4898]: I0313 15:16:18.726231 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/speaker-g5gqr" podUID="edfd91ee-1246-43b2-84a0-95ea069de402" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:18 crc kubenswrapper[4898]: I0313 15:16:18.726285 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/speaker-g5gqr" podUID="edfd91ee-1246-43b2-84a0-95ea069de402" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:18 crc kubenswrapper[4898]: I0313 15:16:18.942978 4898 patch_prober.go:28] interesting pod/logging-loki-index-gateway-0 container/loki-index-gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.60:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:18 crc kubenswrapper[4898]: I0313 15:16:18.943102 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-index-gateway-0" podUID="6a1df267-1145-4fe1-9455-57df3d043e3a" containerName="loki-index-gateway" probeResult="failure" output="Get \"https://10.217.0.60:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:19 crc 
kubenswrapper[4898]: I0313 15:16:19.033070 4898 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:3101/ready\": context deadline exceeded" start-of-body= Mar 13 15:16:19 crc kubenswrapper[4898]: I0313 15:16:19.033115 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="2194d847-4858-4f46-ab8b-c2d78cf5677e" containerName="loki-ingester" probeResult="failure" output="Get \"https://10.217.0.57:3101/ready\": context deadline exceeded" Mar 13 15:16:19 crc kubenswrapper[4898]: I0313 15:16:19.090298 4898 patch_prober.go:28] interesting pod/logging-loki-compactor-0 container/loki-compactor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.58:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:19 crc kubenswrapper[4898]: I0313 15:16:19.090357 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-compactor-0" podUID="9c5fee8d-2246-4e34-8ddd-ce710e155d73" containerName="loki-compactor" probeResult="failure" output="Get \"https://10.217.0.58:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:19 crc kubenswrapper[4898]: I0313 15:16:19.133951 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 15:16:19 crc kubenswrapper[4898]: I0313 15:16:19.134017 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:16:19 crc kubenswrapper[4898]: I0313 15:16:19.168023 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" event={"ID":"7667c5a1-aecb-4ccd-b8fd-e20c2c049472","Type":"ContainerStarted","Data":"88aa7f357e8dd41207b3d690f070bb48ea1ac7bd413234067776040026c3e1df"} Mar 13 15:16:19 crc kubenswrapper[4898]: I0313 15:16:19.385156 4898 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-z7ng7 container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.75:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:19 crc kubenswrapper[4898]: I0313 15:16:19.385393 4898 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-z7ng7 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.75:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:19 crc kubenswrapper[4898]: I0313 15:16:19.385410 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-z7ng7" podUID="b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.75:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:19 crc kubenswrapper[4898]: I0313 15:16:19.385468 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-z7ng7" podUID="b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.75:8080/healthz\": context deadline exceeded 
(Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:19 crc kubenswrapper[4898]: I0313 15:16:19.830708 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="e5d53cf3-113e-4391-b3a9-4e1f81836e26" containerName="galera" probeResult="failure" output="command timed out" Mar 13 15:16:19 crc kubenswrapper[4898]: I0313 15:16:19.830840 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 13 15:16:19 crc kubenswrapper[4898]: I0313 15:16:19.833891 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="e5d53cf3-113e-4391-b3a9-4e1f81836e26" containerName="galera" probeResult="failure" output="command timed out" Mar 13 15:16:19 crc kubenswrapper[4898]: I0313 15:16:19.833976 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/openstack-galera-0" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.085075 4898 patch_prober.go:28] interesting pod/metrics-server-7b77fdd7dd-vwwfr container/metrics-server namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.89:10250/livez\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.085136 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" podUID="823ccfb8-89eb-409e-9c6c-579bacb35ea1" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.89:10250/livez\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.085207 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.094147 4898 kuberuntime_manager.go:1027] 
"Message for Container of pod" containerName="metrics-server" containerStatusID={"Type":"cri-o","ID":"7046ebdd84c06a54a3ad07946403d79aec9442bfc8bcd95a1021d0c00e48bec6"} pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" containerMessage="Container metrics-server failed liveness probe, will be restarted" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.094203 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" podUID="823ccfb8-89eb-409e-9c6c-579bacb35ea1" containerName="metrics-server" containerID="cri-o://7046ebdd84c06a54a3ad07946403d79aec9442bfc8bcd95a1021d0c00e48bec6" gracePeriod=170 Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.178201 4898 generic.go:334] "Generic (PLEG): container finished" podID="19a0f4de-5258-4f2b-9587-71293459378e" containerID="490b131fcdcfc3c4a0f9eaefa6f19529f359253225ede8f7a9b1add8a23964b5" exitCode=1 Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.178589 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt" event={"ID":"19a0f4de-5258-4f2b-9587-71293459378e","Type":"ContainerDied","Data":"490b131fcdcfc3c4a0f9eaefa6f19529f359253225ede8f7a9b1add8a23964b5"} Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.181135 4898 generic.go:334] "Generic (PLEG): container finished" podID="32b5ebfd-38d9-456e-bb21-7332323239d1" containerID="cf80e2daa36139cd410b53ece19d163d97bf6b041aba2184079936d40fe56543" exitCode=1 Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.181298 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-v99bm" event={"ID":"32b5ebfd-38d9-456e-bb21-7332323239d1","Type":"ContainerDied","Data":"cf80e2daa36139cd410b53ece19d163d97bf6b041aba2184079936d40fe56543"} Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.181875 4898 kuberuntime_manager.go:1027] "Message for 
Container of pod" containerName="galera" containerStatusID={"Type":"cri-o","ID":"754d039d0115d17f3272d5d60732b4f70ed1599a25b6aa63fdbeafe8a92c023c"} pod="openstack/openstack-galera-0" containerMessage="Container galera failed liveness probe, will be restarted" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.185501 4898 scope.go:117] "RemoveContainer" containerID="490b131fcdcfc3c4a0f9eaefa6f19529f359253225ede8f7a9b1add8a23964b5" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.186561 4898 scope.go:117] "RemoveContainer" containerID="cf80e2daa36139cd410b53ece19d163d97bf6b041aba2184079936d40fe56543" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.243119 4898 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-ljrtz container/operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.16:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.243221 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" podUID="3bfc0332-bb59-42bf-bb70-462efa225c81" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.16:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.243268 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.243497 4898 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-ljrtz container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.16:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 
15:16:20.243542 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" podUID="3bfc0332-bb59-42bf-bb70-462efa225c81" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.16:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.243711 4898 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-x85vd container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:5443/healthz\": dial tcp 10.217.0.29:5443: connect: connection refused" start-of-body= Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.243769 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" podUID="7667c5a1-aecb-4ccd-b8fd-e20c2c049472" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.29:5443/healthz\": dial tcp 10.217.0.29:5443: connect: connection refused" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.243964 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.245189 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="operator" containerStatusID={"Type":"cri-o","ID":"00f83bd17f1c6c8f7be365f8c10dd0d1fa4e2714540181da78b69f463649b54d"} pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" containerMessage="Container operator failed liveness probe, will be restarted" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.245279 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" podUID="3bfc0332-bb59-42bf-bb70-462efa225c81" containerName="operator" 
containerID="cri-o://00f83bd17f1c6c8f7be365f8c10dd0d1fa4e2714540181da78b69f463649b54d" gracePeriod=30 Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.424203 4898 patch_prober.go:28] interesting pod/perses-operator-5bf474d74f-nkt76 container/perses-operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.28:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.424269 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/perses-operator-5bf474d74f-nkt76" podUID="79ead8ee-67ba-4831-b5d4-a1f128e94334" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.28:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.424321 4898 patch_prober.go:28] interesting pod/perses-operator-5bf474d74f-nkt76 container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.28:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.424334 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-5bf474d74f-nkt76" podUID="79ead8ee-67ba-4831-b5d4-a1f128e94334" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.28:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.424390 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-nkt76" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.433097 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.433296 4898 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-x85vd container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.29:5443/healthz\": dial tcp 10.217.0.29:5443: connect: connection refused" start-of-body= Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.433304 4898 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-x85vd container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:5443/healthz\": dial tcp 10.217.0.29:5443: connect: connection refused" start-of-body= Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.433346 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" podUID="7667c5a1-aecb-4ccd-b8fd-e20c2c049472" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.29:5443/healthz\": dial tcp 10.217.0.29:5443: connect: connection refused" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.433407 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" podUID="7667c5a1-aecb-4ccd-b8fd-e20c2c049472" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.29:5443/healthz\": dial tcp 10.217.0.29:5443: connect: connection refused" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.452863 4898 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-v9lxv container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" 
start-of-body= Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.452976 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" podUID="6f12557e-02f5-4445-988f-b19f16672e3b" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.453042 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.465554 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="authentication-operator" containerStatusID={"Type":"cri-o","ID":"04b36de235cccfc39b10d95f29d1b8eb5397d41b37472245250afa44100523ba"} pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" containerMessage="Container authentication-operator failed liveness probe, will be restarted" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.465644 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" podUID="6f12557e-02f5-4445-988f-b19f16672e3b" containerName="authentication-operator" containerID="cri-o://04b36de235cccfc39b10d95f29d1b8eb5397d41b37472245250afa44100523ba" gracePeriod=30 Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.483720 4898 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe status=failure output="" start-of-body= Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.483745 4898 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager 
namespace/openshift-kube-controller-manager: Liveness probe status=failure output="" start-of-body= Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.493095 4898 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": read tcp 192.168.126.11:54530->192.168.126.11:10257: read: connection reset by peer" start-of-body= Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.493116 4898 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": read tcp 192.168.126.11:54520->192.168.126.11:10257: read: connection reset by peer" start-of-body= Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.493131 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": read tcp 192.168.126.11:54530->192.168.126.11:10257: read: connection reset by peer" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.493165 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": read tcp 192.168.126.11:54520->192.168.126.11:10257: read: connection reset by peer" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.611096 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" 
podUID="0ab852e1-fd26-4f76-b758-77896f8e236b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.611171 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" podUID="0ab852e1-fd26-4f76-b758-77896f8e236b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.695963 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.696120 4898 patch_prober.go:28] interesting pod/monitoring-plugin-595dc77696-pft4c container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.90:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.696300 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.696346 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-ingress/router-default-5444994796-6plhg" Mar 13 15:16:20 crc 
kubenswrapper[4898]: I0313 15:16:20.696374 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-595dc77696-pft4c" podUID="10c7ab08-2341-4e85-ad67-8495e038afa2" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.90:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.696141 4898 patch_prober.go:28] interesting pod/console-operator-58897d9998-kgnxj container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.696440 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-kgnxj" podUID="eedd2260-f339-4e2f-83e8-13a56cee2ce6" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.696156 4898 patch_prober.go:28] interesting pod/console-operator-58897d9998-kgnxj container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.696468 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-kgnxj" podUID="eedd2260-f339-4e2f-83e8-13a56cee2ce6" containerName="console-operator" probeResult="failure" output="Get 
\"https://10.217.0.13:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.696169 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.696505 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.696560 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-6plhg" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.698135 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="router" containerStatusID={"Type":"cri-o","ID":"c62d0b0db8a023eca0369719b9dd81ab2eadb339e637fa7223f0ac36593bb07b"} pod="openshift-ingress/router-default-5444994796-6plhg" containerMessage="Container router failed liveness probe, will be restarted" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.698207 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" containerID="cri-o://c62d0b0db8a023eca0369719b9dd81ab2eadb339e637fa7223f0ac36593bb07b" gracePeriod=10 Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.830980 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" 
podUID="e5d53cf3-113e-4391-b3a9-4e1f81836e26" containerName="galera" probeResult="failure" output="command timed out" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.831103 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f" containerName="galera" probeResult="failure" output="command timed out" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.831167 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.831512 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f" containerName="galera" probeResult="failure" output="command timed out" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.831580 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.832563 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="galera" containerStatusID={"Type":"cri-o","ID":"5603fe1e36b001c885a705c95d9e330d44a889a46589d0677416ea907bf21af1"} pod="openstack/openstack-cell1-galera-0" containerMessage="Container galera failed liveness probe, will be restarted" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.838094 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-index-9k7p6" podUID="478795f5-c2f6-4e9b-9ed6-e2c743c3f3b8" containerName="registry-server" probeResult="failure" output="command timed out" Mar 13 15:16:20 crc kubenswrapper[4898]: I0313 15:16:20.838254 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-index-9k7p6" podUID="478795f5-c2f6-4e9b-9ed6-e2c743c3f3b8" containerName="registry-server" probeResult="failure" 
output="command timed out" Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.182076 4898 patch_prober.go:28] interesting pod/downloads-7954f5f757-cx59b container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.10:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.182375 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-cx59b" podUID="f4f26c0f-992a-4eb4-86d2-58e42a5b2b68" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.182123 4898 patch_prober.go:28] interesting pod/downloads-7954f5f757-cx59b container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.182435 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-cx59b" podUID="f4f26c0f-992a-4eb4-86d2-58e42a5b2b68" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.196778 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/2.log" Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.200083 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.202936 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.205104 4898 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="f3f7856b47908c691e419ef35da06276497b17de1d07f34af58226c123a50471" exitCode=1 Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.205208 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"f3f7856b47908c691e419ef35da06276497b17de1d07f34af58226c123a50471"} Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.205313 4898 scope.go:117] "RemoveContainer" containerID="b65cd02a017de53599582fdc93495c1971ff933ecacdf6af0171bad6070bff66" Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.206072 4898 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-x85vd container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:5443/healthz\": dial tcp 10.217.0.29:5443: connect: connection refused" start-of-body= Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.206109 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" podUID="7667c5a1-aecb-4ccd-b8fd-e20c2c049472" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.29:5443/healthz\": dial tcp 10.217.0.29:5443: connect: connection refused" Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.208572 4898 scope.go:117] 
"RemoveContainer" containerID="f3f7856b47908c691e419ef35da06276497b17de1d07f34af58226c123a50471" Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.412481 4898 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-qtdtw container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.412537 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" podUID="7d27543e-df10-41f7-be85-dfe319aaec8a" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.412599 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.417570 4898 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-qtdtw container/catalog-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.417633 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" podUID="7d27543e-df10-41f7-be85-dfe319aaec8a" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:21 
crc kubenswrapper[4898]: I0313 15:16:21.417682 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.466888 4898 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-k6lrz container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.467193 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" podUID="c8b0b1cf-022c-4181-a957-2f7e172a3294" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.467272 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.466982 4898 patch_prober.go:28] interesting pod/perses-operator-5bf474d74f-nkt76 container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.28:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.467816 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-5bf474d74f-nkt76" podUID="79ead8ee-67ba-4831-b5d4-a1f128e94334" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.28:8081/readyz\": context deadline exceeded (Client.Timeout 
exceeded while awaiting headers)" Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.467026 4898 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-k6lrz container/olm-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.467854 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" podUID="c8b0b1cf-022c-4181-a957-2f7e172a3294" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.467877 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.474002 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="olm-operator" containerStatusID={"Type":"cri-o","ID":"533f9c0d5d010c2b5018f71aab66fa3025d2e9a7d36d33fe0f8634b0b2bbb7ec"} pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" containerMessage="Container olm-operator failed liveness probe, will be restarted" Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.474084 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" podUID="c8b0b1cf-022c-4181-a957-2f7e172a3294" containerName="olm-operator" containerID="cri-o://533f9c0d5d010c2b5018f71aab66fa3025d2e9a7d36d33fe0f8634b0b2bbb7ec" gracePeriod=30 Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.804859 4898 patch_prober.go:28] 
interesting pod/package-server-manager-789f6589d5-x4nxn container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.804988 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x4nxn" podUID="8675f0f0-7d3b-41d9-959e-e73f78f32c5c" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.805304 4898 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-x4nxn container/package-server-manager namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.805362 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-x4nxn" podUID="8675f0f0-7d3b-41d9-959e-e73f78f32c5c" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.834583 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="e5d53cf3-113e-4391-b3a9-4e1f81836e26" containerName="galera" probeResult="failure" output="command timed out" Mar 13 15:16:21 crc kubenswrapper[4898]: I0313 15:16:21.845599 4898 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-monitoring/prometheus-k8s-0" podUID="1b691eb6-70f2-4fce-b18a-1d7712fddcac" containerName="prometheus" probeResult="failure" output="command timed out" Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.098082 4898 prober.go:107] "Probe failed" probeType="Startup" pod="metallb-system/frr-k8s-bqmxg" podUID="1a7fcb96-7168-4049-8c28-d3f740599e48" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.220567 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/2.log" Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.221715 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.226023 4898 generic.go:334] "Generic (PLEG): container finished" podID="3bfc0332-bb59-42bf-bb70-462efa225c81" containerID="00f83bd17f1c6c8f7be365f8c10dd0d1fa4e2714540181da78b69f463649b54d" exitCode=0 Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.226109 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" event={"ID":"3bfc0332-bb59-42bf-bb70-462efa225c81","Type":"ContainerDied","Data":"00f83bd17f1c6c8f7be365f8c10dd0d1fa4e2714540181da78b69f463649b54d"} Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.229650 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt" event={"ID":"19a0f4de-5258-4f2b-9587-71293459378e","Type":"ContainerStarted","Data":"a21c77ffcf35fcf200cb633fb54eeac6d8564a2b9803bb1375581cf30c3e1d15"} Mar 13 15:16:22 crc 
kubenswrapper[4898]: I0313 15:16:22.229883 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt" Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.231508 4898 generic.go:334] "Generic (PLEG): container finished" podID="0376a3d3-f3a2-4674-a7f9-b06a9e62836e" containerID="0292f4db3076b00a4ff8a067c3c4184374170598a35c90418442c7305e929d8e" exitCode=0 Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.231564 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" event={"ID":"0376a3d3-f3a2-4674-a7f9-b06a9e62836e","Type":"ContainerDied","Data":"0292f4db3076b00a4ff8a067c3c4184374170598a35c90418442c7305e929d8e"} Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.235586 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-v99bm" event={"ID":"32b5ebfd-38d9-456e-bb21-7332323239d1","Type":"ContainerStarted","Data":"b1a0ce8e62f99617a982cb589ae1a57ff6616fa5fa8223b227e7b7590e8cd06d"} Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.236157 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-v99bm" Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.242806 4898 generic.go:334] "Generic (PLEG): container finished" podID="e000d86e-e7a8-49ed-9184-fdd67dfe797d" containerID="b90a231f04c7fa997bdb6c4af06b79391dc9d6f78d2d6a60c3d05ec12274e55f" exitCode=1 Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.243010 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx" event={"ID":"e000d86e-e7a8-49ed-9184-fdd67dfe797d","Type":"ContainerDied","Data":"b90a231f04c7fa997bdb6c4af06b79391dc9d6f78d2d6a60c3d05ec12274e55f"} Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 
15:16:22.244359 4898 scope.go:117] "RemoveContainer" containerID="b90a231f04c7fa997bdb6c4af06b79391dc9d6f78d2d6a60c3d05ec12274e55f" Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.244591 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="catalog-operator" containerStatusID={"Type":"cri-o","ID":"f6246f6fbdbf725301f1e7d37222d66624c9ce34cacd75b24a3bb142ed798d64"} pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" containerMessage="Container catalog-operator failed liveness probe, will be restarted" Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.244650 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" podUID="7d27543e-df10-41f7-be85-dfe319aaec8a" containerName="catalog-operator" containerID="cri-o://f6246f6fbdbf725301f1e7d37222d66624c9ce34cacd75b24a3bb142ed798d64" gracePeriod=30 Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.266317 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.296719 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556916-dvhfq"] Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.308818 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556916-dvhfq" Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.425580 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6895\" (UniqueName: \"kubernetes.io/projected/bb6b061a-b0db-4b84-bfc7-08238f699132-kube-api-access-d6895\") pod \"auto-csr-approver-29556916-dvhfq\" (UID: \"bb6b061a-b0db-4b84-bfc7-08238f699132\") " pod="openshift-infra/auto-csr-approver-29556916-dvhfq" Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.471041 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.482995 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.483419 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.512403 4898 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-k6lrz container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.512466 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" podUID="c8b0b1cf-022c-4181-a957-2f7e172a3294" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.529871 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-d6895\" (UniqueName: \"kubernetes.io/projected/bb6b061a-b0db-4b84-bfc7-08238f699132-kube-api-access-d6895\") pod \"auto-csr-approver-29556916-dvhfq\" (UID: \"bb6b061a-b0db-4b84-bfc7-08238f699132\") " pod="openshift-infra/auto-csr-approver-29556916-dvhfq" Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.551323 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.809331 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6895\" (UniqueName: \"kubernetes.io/projected/bb6b061a-b0db-4b84-bfc7-08238f699132-kube-api-access-d6895\") pod \"auto-csr-approver-29556916-dvhfq\" (UID: \"bb6b061a-b0db-4b84-bfc7-08238f699132\") " pod="openshift-infra/auto-csr-approver-29556916-dvhfq" Mar 13 15:16:22 crc kubenswrapper[4898]: I0313 15:16:22.832804 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556916-dvhfq" Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.174486 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.208323 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl" podUID="a80d01d5-0201-4b2e-974c-ac5b42ac8df4" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/healthz\": dial tcp 10.217.0.110:8081: connect: connection refused" Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.208323 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl" podUID="a80d01d5-0201-4b2e-974c-ac5b42ac8df4" containerName="manager" probeResult="failure" output="Get 
\"http://10.217.0.110:8081/readyz\": dial tcp 10.217.0.110:8081: connect: connection refused" Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.208444 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl" Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.208999 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl" podUID="a80d01d5-0201-4b2e-974c-ac5b42ac8df4" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/readyz\": dial tcp 10.217.0.110:8081: connect: connection refused" Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.285585 4898 generic.go:334] "Generic (PLEG): container finished" podID="0ab852e1-fd26-4f76-b758-77896f8e236b" containerID="5517ef51ccfa8b2748099ff5aaad46312299246fde947335af38f7f6b1623c69" exitCode=1 Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.285694 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" event={"ID":"0ab852e1-fd26-4f76-b758-77896f8e236b","Type":"ContainerDied","Data":"5517ef51ccfa8b2748099ff5aaad46312299246fde947335af38f7f6b1623c69"} Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.288043 4898 scope.go:117] "RemoveContainer" containerID="5517ef51ccfa8b2748099ff5aaad46312299246fde947335af38f7f6b1623c69" Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.294859 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" event={"ID":"0376a3d3-f3a2-4674-a7f9-b06a9e62836e","Type":"ContainerStarted","Data":"fe9cb4f935782030682843d4f6ff1e103b91fd35cda27d7fe8cbd8f803b88c10"} Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.295317 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.295477 4898 patch_prober.go:28] interesting pod/route-controller-manager-7b756f97f-wjsf2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.74:8443/healthz\": dial tcp 10.217.0.74:8443: connect: connection refused" start-of-body= Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.295518 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" podUID="0376a3d3-f3a2-4674-a7f9-b06a9e62836e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.74:8443/healthz\": dial tcp 10.217.0.74:8443: connect: connection refused" Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.297287 4898 generic.go:334] "Generic (PLEG): container finished" podID="d29ce3ee-3d5a-4801-abf9-dfef5b641a74" containerID="f59d39668fd5f01a9f140fcfa1fd92ce8c2ea3f200013f1b1e08b5b0e679790d" exitCode=1 Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.297333 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" event={"ID":"d29ce3ee-3d5a-4801-abf9-dfef5b641a74","Type":"ContainerDied","Data":"f59d39668fd5f01a9f140fcfa1fd92ce8c2ea3f200013f1b1e08b5b0e679790d"} Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.298383 4898 scope.go:117] "RemoveContainer" containerID="f59d39668fd5f01a9f140fcfa1fd92ce8c2ea3f200013f1b1e08b5b0e679790d" Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.304492 4898 generic.go:334] "Generic (PLEG): container finished" podID="7d27543e-df10-41f7-be85-dfe319aaec8a" containerID="f6246f6fbdbf725301f1e7d37222d66624c9ce34cacd75b24a3bb142ed798d64" exitCode=0 Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.304550 4898 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" event={"ID":"7d27543e-df10-41f7-be85-dfe319aaec8a","Type":"ContainerDied","Data":"f6246f6fbdbf725301f1e7d37222d66624c9ce34cacd75b24a3bb142ed798d64"} Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.311164 4898 generic.go:334] "Generic (PLEG): container finished" podID="6f12557e-02f5-4445-988f-b19f16672e3b" containerID="04b36de235cccfc39b10d95f29d1b8eb5397d41b37472245250afa44100523ba" exitCode=0 Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.311245 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" event={"ID":"6f12557e-02f5-4445-988f-b19f16672e3b","Type":"ContainerDied","Data":"04b36de235cccfc39b10d95f29d1b8eb5397d41b37472245250afa44100523ba"} Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.320711 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/2.log" Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.322782 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.323607 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2f1658da3ae2f145888c3de5fc2918adc223e0f1999ca74ae5bb585f272622e4"} Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.330601 4898 generic.go:334] "Generic (PLEG): container finished" podID="02f7d483-aecb-4a39-babc-6d9598090c4b" containerID="7ce3438fd9d4e3db5da0a65bc2744dcbc537f4c6c98b27b1433e8ed964fd1ed3" exitCode=0 Mar 13 15:16:23 crc 
kubenswrapper[4898]: I0313 15:16:23.330660 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02f7d483-aecb-4a39-babc-6d9598090c4b","Type":"ContainerDied","Data":"7ce3438fd9d4e3db5da0a65bc2744dcbc537f4c6c98b27b1433e8ed964fd1ed3"} Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.334963 4898 generic.go:334] "Generic (PLEG): container finished" podID="c8b0b1cf-022c-4181-a957-2f7e172a3294" containerID="533f9c0d5d010c2b5018f71aab66fa3025d2e9a7d36d33fe0f8634b0b2bbb7ec" exitCode=0 Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.335025 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" event={"ID":"c8b0b1cf-022c-4181-a957-2f7e172a3294","Type":"ContainerDied","Data":"533f9c0d5d010c2b5018f71aab66fa3025d2e9a7d36d33fe0f8634b0b2bbb7ec"} Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.338023 4898 generic.go:334] "Generic (PLEG): container finished" podID="a80d01d5-0201-4b2e-974c-ac5b42ac8df4" containerID="d2bed079a6a5a2e515d93d89a9a982d254f51045181d7bcb6277083a331b61e4" exitCode=1 Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.338579 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl" event={"ID":"a80d01d5-0201-4b2e-974c-ac5b42ac8df4","Type":"ContainerDied","Data":"d2bed079a6a5a2e515d93d89a9a982d254f51045181d7bcb6277083a331b61e4"} Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.339552 4898 scope.go:117] "RemoveContainer" containerID="d2bed079a6a5a2e515d93d89a9a982d254f51045181d7bcb6277083a331b61e4" Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.508265 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556916-dvhfq"] Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.881459 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" Mar 13 15:16:23 crc kubenswrapper[4898]: I0313 15:16:23.881793 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" Mar 13 15:16:24 crc kubenswrapper[4898]: I0313 15:16:24.170579 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-jndb5" podUID="2157e8bf-88a5-4e48-b621-1744dcf0fcdb" containerName="registry-server" probeResult="failure" output=< Mar 13 15:16:24 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:16:24 crc kubenswrapper[4898]: > Mar 13 15:16:24 crc kubenswrapper[4898]: I0313 15:16:24.351378 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl" event={"ID":"a80d01d5-0201-4b2e-974c-ac5b42ac8df4","Type":"ContainerStarted","Data":"57dd0f00abea15d0102c1f6686fe41b17154ca27e77edac4d3f07ccf4e0e9a30"} Mar 13 15:16:24 crc kubenswrapper[4898]: I0313 15:16:24.351677 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl" Mar 13 15:16:24 crc kubenswrapper[4898]: I0313 15:16:24.358673 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" event={"ID":"7d27543e-df10-41f7-be85-dfe319aaec8a","Type":"ContainerStarted","Data":"97f9af41b669994e028cd2913281b09704a6416bf06091e5bbc3ac6ac3674dc6"} Mar 13 15:16:24 crc kubenswrapper[4898]: I0313 15:16:24.359421 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" Mar 13 15:16:24 crc kubenswrapper[4898]: I0313 15:16:24.361485 4898 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-qtdtw container/catalog-operator 
namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Mar 13 15:16:24 crc kubenswrapper[4898]: I0313 15:16:24.361560 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" podUID="7d27543e-df10-41f7-be85-dfe319aaec8a" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" Mar 13 15:16:24 crc kubenswrapper[4898]: I0313 15:16:24.373365 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx" event={"ID":"e000d86e-e7a8-49ed-9184-fdd67dfe797d","Type":"ContainerStarted","Data":"2cf4d6c27dfc0f01d733dd2ec60bf38c3ede882f93d64c0e31fc8677931d4396"} Mar 13 15:16:24 crc kubenswrapper[4898]: I0313 15:16:24.374195 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx" Mar 13 15:16:24 crc kubenswrapper[4898]: I0313 15:16:24.377384 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-v9lxv" event={"ID":"6f12557e-02f5-4445-988f-b19f16672e3b","Type":"ContainerStarted","Data":"b31d17031b09e08e43b60bd3c1e2de112528a6f18a6b414ec43a7e632a1eb2a8"} Mar 13 15:16:24 crc kubenswrapper[4898]: I0313 15:16:24.381008 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" event={"ID":"c8b0b1cf-022c-4181-a957-2f7e172a3294","Type":"ContainerStarted","Data":"e071b990987923ab86abcea4120d15a5572916972d1bc0013951d7f1eb0b37a1"} Mar 13 15:16:24 crc kubenswrapper[4898]: I0313 15:16:24.381370 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" Mar 13 15:16:24 crc kubenswrapper[4898]: I0313 15:16:24.381558 4898 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-k6lrz container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Mar 13 15:16:24 crc kubenswrapper[4898]: I0313 15:16:24.381603 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" podUID="c8b0b1cf-022c-4181-a957-2f7e172a3294" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" Mar 13 15:16:24 crc kubenswrapper[4898]: I0313 15:16:24.383043 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" event={"ID":"0ab852e1-fd26-4f76-b758-77896f8e236b","Type":"ContainerStarted","Data":"7b52a84dc05610996a0e88a7808021878a6fff6c922305317623864c3cc9491b"} Mar 13 15:16:24 crc kubenswrapper[4898]: I0313 15:16:24.383873 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" Mar 13 15:16:24 crc kubenswrapper[4898]: I0313 15:16:24.390195 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" event={"ID":"d29ce3ee-3d5a-4801-abf9-dfef5b641a74","Type":"ContainerStarted","Data":"068eab642ca7d18702fbe49598e8364e743f264af5b01ef0310ed4c5d5f66912"} Mar 13 15:16:24 crc kubenswrapper[4898]: I0313 15:16:24.390753 4898 patch_prober.go:28] interesting pod/route-controller-manager-7b756f97f-wjsf2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe 
status=failure output="Get \"https://10.217.0.74:8443/healthz\": dial tcp 10.217.0.74:8443: connect: connection refused" start-of-body= Mar 13 15:16:24 crc kubenswrapper[4898]: I0313 15:16:24.390798 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" podUID="0376a3d3-f3a2-4674-a7f9-b06a9e62836e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.74:8443/healthz\": dial tcp 10.217.0.74:8443: connect: connection refused" Mar 13 15:16:24 crc kubenswrapper[4898]: I0313 15:16:24.603146 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f" containerName="galera" containerID="cri-o://5603fe1e36b001c885a705c95d9e330d44a889a46589d0677416ea907bf21af1" gracePeriod=27 Mar 13 15:16:24 crc kubenswrapper[4898]: I0313 15:16:24.617280 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="e5d53cf3-113e-4391-b3a9-4e1f81836e26" containerName="galera" containerID="cri-o://754d039d0115d17f3272d5d60732b4f70ed1599a25b6aa63fdbeafe8a92c023c" gracePeriod=26 Mar 13 15:16:25 crc kubenswrapper[4898]: I0313 15:16:25.084055 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="a9a7064c-4ed5-4948-9e7e-7d40794e371e" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 15:16:25 crc kubenswrapper[4898]: I0313 15:16:25.400378 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" event={"ID":"3bfc0332-bb59-42bf-bb70-462efa225c81","Type":"ContainerStarted","Data":"d02edb2f50df8c11d1dd354b0da5f4dcf8478960efc66d23360e9ccc032a7106"} Mar 13 15:16:25 crc kubenswrapper[4898]: I0313 15:16:25.400756 4898 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" Mar 13 15:16:25 crc kubenswrapper[4898]: I0313 15:16:25.400888 4898 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-ljrtz container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.16:8081/healthz\": dial tcp 10.217.0.16:8081: connect: connection refused" start-of-body= Mar 13 15:16:25 crc kubenswrapper[4898]: I0313 15:16:25.400960 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" podUID="3bfc0332-bb59-42bf-bb70-462efa225c81" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.16:8081/healthz\": dial tcp 10.217.0.16:8081: connect: connection refused" Mar 13 15:16:25 crc kubenswrapper[4898]: I0313 15:16:25.402836 4898 patch_prober.go:28] interesting pod/route-controller-manager-7b756f97f-wjsf2 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.74:8443/healthz\": dial tcp 10.217.0.74:8443: connect: connection refused" start-of-body= Mar 13 15:16:25 crc kubenswrapper[4898]: I0313 15:16:25.402880 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" podUID="0376a3d3-f3a2-4674-a7f9-b06a9e62836e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.74:8443/healthz\": dial tcp 10.217.0.74:8443: connect: connection refused" Mar 13 15:16:25 crc kubenswrapper[4898]: I0313 15:16:25.404101 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"02f7d483-aecb-4a39-babc-6d9598090c4b","Type":"ContainerStarted","Data":"b895aa795c06776bb698b15151bf2914299b97db6c949697d18679898bb8b085"} Mar 13 15:16:25 crc kubenswrapper[4898]: I0313 15:16:25.404591 4898 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" Mar 13 15:16:25 crc kubenswrapper[4898]: I0313 15:16:25.405644 4898 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-k6lrz container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Mar 13 15:16:25 crc kubenswrapper[4898]: I0313 15:16:25.405656 4898 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-qtdtw container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Mar 13 15:16:25 crc kubenswrapper[4898]: I0313 15:16:25.405697 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" podUID="c8b0b1cf-022c-4181-a957-2f7e172a3294" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" Mar 13 15:16:25 crc kubenswrapper[4898]: I0313 15:16:25.405697 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" podUID="7d27543e-df10-41f7-be85-dfe319aaec8a" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" Mar 13 15:16:26 crc kubenswrapper[4898]: I0313 15:16:26.421474 4898 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-ljrtz container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.16:8081/healthz\": dial tcp 10.217.0.16:8081: connect: connection refused" start-of-body= Mar 13 
15:16:26 crc kubenswrapper[4898]: I0313 15:16:26.422317 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" podUID="3bfc0332-bb59-42bf-bb70-462efa225c81" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.16:8081/healthz\": dial tcp 10.217.0.16:8081: connect: connection refused" Mar 13 15:16:26 crc kubenswrapper[4898]: I0313 15:16:26.627631 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-bqmxg" Mar 13 15:16:27 crc kubenswrapper[4898]: I0313 15:16:27.435565 4898 generic.go:334] "Generic (PLEG): container finished" podID="6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f" containerID="5603fe1e36b001c885a705c95d9e330d44a889a46589d0677416ea907bf21af1" exitCode=0 Mar 13 15:16:27 crc kubenswrapper[4898]: I0313 15:16:27.435681 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f","Type":"ContainerDied","Data":"5603fe1e36b001c885a705c95d9e330d44a889a46589d0677416ea907bf21af1"} Mar 13 15:16:27 crc kubenswrapper[4898]: I0313 15:16:27.788698 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="a9a7064c-4ed5-4948-9e7e-7d40794e371e" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 15:16:28 crc kubenswrapper[4898]: I0313 15:16:28.347701 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 15:16:28 crc kubenswrapper[4898]: I0313 15:16:28.462730 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f","Type":"ContainerStarted","Data":"df908f851dea57055c3789e36a48d0a1ca0c8b55e694ae40d9f47ed21423202a"} Mar 13 15:16:28 crc kubenswrapper[4898]: I0313 15:16:28.474457 4898 
generic.go:334] "Generic (PLEG): container finished" podID="e5d53cf3-113e-4391-b3a9-4e1f81836e26" containerID="754d039d0115d17f3272d5d60732b4f70ed1599a25b6aa63fdbeafe8a92c023c" exitCode=0 Mar 13 15:16:28 crc kubenswrapper[4898]: I0313 15:16:28.474503 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e5d53cf3-113e-4391-b3a9-4e1f81836e26","Type":"ContainerDied","Data":"754d039d0115d17f3272d5d60732b4f70ed1599a25b6aa63fdbeafe8a92c023c"} Mar 13 15:16:28 crc kubenswrapper[4898]: I0313 15:16:28.506989 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556916-dvhfq"] Mar 13 15:16:28 crc kubenswrapper[4898]: W0313 15:16:28.570537 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb6b061a_b0db_4b84_bfc7_08238f699132.slice/crio-7309bc6a6f15fd15c242b030d405b5478267924b192042e934072170e618ae49 WatchSource:0}: Error finding container 7309bc6a6f15fd15c242b030d405b5478267924b192042e934072170e618ae49: Status 404 returned error can't find the container with id 7309bc6a6f15fd15c242b030d405b5478267924b192042e934072170e618ae49 Mar 13 15:16:28 crc kubenswrapper[4898]: E0313 15:16:28.618977 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 754d039d0115d17f3272d5d60732b4f70ed1599a25b6aa63fdbeafe8a92c023c is running failed: container process not found" containerID="754d039d0115d17f3272d5d60732b4f70ed1599a25b6aa63fdbeafe8a92c023c" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 13 15:16:28 crc kubenswrapper[4898]: E0313 15:16:28.620357 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 754d039d0115d17f3272d5d60732b4f70ed1599a25b6aa63fdbeafe8a92c023c is running failed: container 
process not found" containerID="754d039d0115d17f3272d5d60732b4f70ed1599a25b6aa63fdbeafe8a92c023c" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 13 15:16:28 crc kubenswrapper[4898]: E0313 15:16:28.620724 4898 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 754d039d0115d17f3272d5d60732b4f70ed1599a25b6aa63fdbeafe8a92c023c is running failed: container process not found" containerID="754d039d0115d17f3272d5d60732b4f70ed1599a25b6aa63fdbeafe8a92c023c" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 13 15:16:28 crc kubenswrapper[4898]: E0313 15:16:28.620784 4898 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 754d039d0115d17f3272d5d60732b4f70ed1599a25b6aa63fdbeafe8a92c023c is running failed: container process not found" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="e5d53cf3-113e-4391-b3a9-4e1f81836e26" containerName="galera" Mar 13 15:16:29 crc kubenswrapper[4898]: I0313 15:16:29.231227 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-ljrtz" Mar 13 15:16:29 crc kubenswrapper[4898]: I0313 15:16:29.231948 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 15:16:29 crc kubenswrapper[4898]: I0313 15:16:29.342505 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-nkt76" Mar 13 15:16:29 crc kubenswrapper[4898]: I0313 15:16:29.451926 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 15:16:29 crc kubenswrapper[4898]: I0313 15:16:29.494186 4898 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-infra/auto-csr-approver-29556916-dvhfq" event={"ID":"bb6b061a-b0db-4b84-bfc7-08238f699132","Type":"ContainerStarted","Data":"7309bc6a6f15fd15c242b030d405b5478267924b192042e934072170e618ae49"} Mar 13 15:16:29 crc kubenswrapper[4898]: I0313 15:16:29.511406 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e5d53cf3-113e-4391-b3a9-4e1f81836e26","Type":"ContainerStarted","Data":"4d25705f3ea5640e3bd4269670bc56e2661a3c0a18a96ecdd41b77899dbed12d"} Mar 13 15:16:29 crc kubenswrapper[4898]: I0313 15:16:29.555116 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv" Mar 13 15:16:29 crc kubenswrapper[4898]: I0313 15:16:29.624313 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]backend-http ok Mar 13 15:16:29 crc kubenswrapper[4898]: [+]has-synced ok Mar 13 15:16:29 crc kubenswrapper[4898]: [-]process-running failed: reason withheld Mar 13 15:16:29 crc kubenswrapper[4898]: healthz check failed Mar 13 15:16:29 crc kubenswrapper[4898]: I0313 15:16:29.624374 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 15:16:29 crc kubenswrapper[4898]: I0313 15:16:29.668176 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-595dc77696-pft4c" Mar 13 15:16:29 crc kubenswrapper[4898]: I0313 15:16:29.846521 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 13 15:16:29 crc kubenswrapper[4898]: I0313 
15:16:29.846580 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 13 15:16:30 crc kubenswrapper[4898]: I0313 15:16:30.409294 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qtdtw" Mar 13 15:16:30 crc kubenswrapper[4898]: I0313 15:16:30.436962 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-k6lrz" Mar 13 15:16:30 crc kubenswrapper[4898]: I0313 15:16:30.445401 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-x85vd" Mar 13 15:16:30 crc kubenswrapper[4898]: I0313 15:16:30.524425 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556916-dvhfq" event={"ID":"bb6b061a-b0db-4b84-bfc7-08238f699132","Type":"ContainerStarted","Data":"bf083e1d2202ec3f40b443b0422cd0440a225764d3bb5e0e49d48d4861f197f0"} Mar 13 15:16:30 crc kubenswrapper[4898]: I0313 15:16:30.528440 4898 generic.go:334] "Generic (PLEG): container finished" podID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" containerID="aebe307d7887ba7796f4fad329402cc88b0d4332f82fab0c321325f23bf1adea" exitCode=0 Mar 13 15:16:30 crc kubenswrapper[4898]: I0313 15:16:30.528524 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnh48" event={"ID":"cb46c8b0-a6a9-4b6d-86a1-8408793887e5","Type":"ContainerDied","Data":"aebe307d7887ba7796f4fad329402cc88b0d4332f82fab0c321325f23bf1adea"} Mar 13 15:16:30 crc kubenswrapper[4898]: I0313 15:16:30.550165 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556916-dvhfq" podStartSLOduration=21.695304056 podStartE2EDuration="22.550145652s" podCreationTimestamp="2026-03-13 15:16:08 +0000 UTC" 
firstStartedPulling="2026-03-13 15:16:28.579979315 +0000 UTC m=+4823.581567554" lastFinishedPulling="2026-03-13 15:16:29.434820921 +0000 UTC m=+4824.436409150" observedRunningTime="2026-03-13 15:16:30.542332455 +0000 UTC m=+4825.543920704" watchObservedRunningTime="2026-03-13 15:16:30.550145652 +0000 UTC m=+4825.551733891" Mar 13 15:16:30 crc kubenswrapper[4898]: I0313 15:16:30.623265 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="a9a7064c-4ed5-4948-9e7e-7d40794e371e" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 15:16:30 crc kubenswrapper[4898]: I0313 15:16:30.623400 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 13 15:16:30 crc kubenswrapper[4898]: I0313 15:16:30.624686 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cinder-scheduler" containerStatusID={"Type":"cri-o","ID":"6b88f7ff52506e35b8cbb7e231ccb1285ae276c1a9d69c02b14ca5282f74157d"} pod="openstack/cinder-scheduler-0" containerMessage="Container cinder-scheduler failed liveness probe, will be restarted" Mar 13 15:16:30 crc kubenswrapper[4898]: I0313 15:16:30.624749 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="a9a7064c-4ed5-4948-9e7e-7d40794e371e" containerName="cinder-scheduler" containerID="cri-o://6b88f7ff52506e35b8cbb7e231ccb1285ae276c1a9d69c02b14ca5282f74157d" gracePeriod=30 Mar 13 15:16:31 crc kubenswrapper[4898]: I0313 15:16:31.232079 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-jndb5" podUID="2157e8bf-88a5-4e48-b621-1744dcf0fcdb" containerName="registry-server" probeResult="failure" output=< Mar 13 15:16:31 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:16:31 crc kubenswrapper[4898]: > Mar 13 15:16:31 crc 
kubenswrapper[4898]: I0313 15:16:31.561056 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5444994796-6plhg_18e5c8bf-9fe0-465e-af8f-9e7ec7400be8/router/0.log" Mar 13 15:16:31 crc kubenswrapper[4898]: I0313 15:16:31.561426 4898 generic.go:334] "Generic (PLEG): container finished" podID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerID="c62d0b0db8a023eca0369719b9dd81ab2eadb339e637fa7223f0ac36593bb07b" exitCode=137 Mar 13 15:16:31 crc kubenswrapper[4898]: I0313 15:16:31.561528 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-6plhg" event={"ID":"18e5c8bf-9fe0-465e-af8f-9e7ec7400be8","Type":"ContainerDied","Data":"c62d0b0db8a023eca0369719b9dd81ab2eadb339e637fa7223f0ac36593bb07b"} Mar 13 15:16:31 crc kubenswrapper[4898]: I0313 15:16:31.561624 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-6plhg" event={"ID":"18e5c8bf-9fe0-465e-af8f-9e7ec7400be8","Type":"ContainerStarted","Data":"11acc6c7309a70dd02a02a8791913b33319c75a2620dd25f8f262fb055feaf2d"} Mar 13 15:16:31 crc kubenswrapper[4898]: I0313 15:16:31.610710 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-6plhg" Mar 13 15:16:31 crc kubenswrapper[4898]: I0313 15:16:31.612240 4898 patch_prober.go:28] interesting pod/router-default-5444994796-6plhg container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 13 15:16:31 crc kubenswrapper[4898]: I0313 15:16:31.612293 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6plhg" podUID="18e5c8bf-9fe0-465e-af8f-9e7ec7400be8" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection 
refused" Mar 13 15:16:32 crc kubenswrapper[4898]: I0313 15:16:32.574754 4898 generic.go:334] "Generic (PLEG): container finished" podID="bb6b061a-b0db-4b84-bfc7-08238f699132" containerID="bf083e1d2202ec3f40b443b0422cd0440a225764d3bb5e0e49d48d4861f197f0" exitCode=0 Mar 13 15:16:32 crc kubenswrapper[4898]: I0313 15:16:32.574826 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556916-dvhfq" event={"ID":"bb6b061a-b0db-4b84-bfc7-08238f699132","Type":"ContainerDied","Data":"bf083e1d2202ec3f40b443b0422cd0440a225764d3bb5e0e49d48d4861f197f0"} Mar 13 15:16:32 crc kubenswrapper[4898]: I0313 15:16:32.616659 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-6plhg" Mar 13 15:16:33 crc kubenswrapper[4898]: I0313 15:16:33.210015 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-jngrl" Mar 13 15:16:33 crc kubenswrapper[4898]: I0313 15:16:33.500235 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-v99bm" Mar 13 15:16:33 crc kubenswrapper[4898]: I0313 15:16:33.617841 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnh48" event={"ID":"cb46c8b0-a6a9-4b6d-86a1-8408793887e5","Type":"ContainerStarted","Data":"e863d80604f47eb74bf43cf0b02ad573ba531995e209d5a2d54eb9c4f9740dde"} Mar 13 15:16:33 crc kubenswrapper[4898]: I0313 15:16:33.618635 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-6plhg" Mar 13 15:16:33 crc kubenswrapper[4898]: I0313 15:16:33.623131 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-6plhg" Mar 13 15:16:33 crc kubenswrapper[4898]: I0313 15:16:33.647148 4898 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dnh48" podStartSLOduration=6.05816232 podStartE2EDuration="1m11.647128771s" podCreationTimestamp="2026-03-13 15:15:22 +0000 UTC" firstStartedPulling="2026-03-13 15:15:26.668082389 +0000 UTC m=+4761.669670628" lastFinishedPulling="2026-03-13 15:16:32.25704884 +0000 UTC m=+4827.258637079" observedRunningTime="2026-03-13 15:16:33.64178903 +0000 UTC m=+4828.643377289" watchObservedRunningTime="2026-03-13 15:16:33.647128771 +0000 UTC m=+4828.648717010" Mar 13 15:16:33 crc kubenswrapper[4898]: I0313 15:16:33.882753 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dnh48" Mar 13 15:16:33 crc kubenswrapper[4898]: I0313 15:16:33.882865 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dnh48" Mar 13 15:16:33 crc kubenswrapper[4898]: I0313 15:16:33.885110 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-s2rdh" Mar 13 15:16:34 crc kubenswrapper[4898]: I0313 15:16:34.029000 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-smdkt" Mar 13 15:16:34 crc kubenswrapper[4898]: I0313 15:16:34.490540 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556916-dvhfq" Mar 13 15:16:34 crc kubenswrapper[4898]: I0313 15:16:34.637803 4898 generic.go:334] "Generic (PLEG): container finished" podID="a9a7064c-4ed5-4948-9e7e-7d40794e371e" containerID="6b88f7ff52506e35b8cbb7e231ccb1285ae276c1a9d69c02b14ca5282f74157d" exitCode=0 Mar 13 15:16:34 crc kubenswrapper[4898]: I0313 15:16:34.637921 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a9a7064c-4ed5-4948-9e7e-7d40794e371e","Type":"ContainerDied","Data":"6b88f7ff52506e35b8cbb7e231ccb1285ae276c1a9d69c02b14ca5282f74157d"} Mar 13 15:16:34 crc kubenswrapper[4898]: I0313 15:16:34.640206 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556916-dvhfq" event={"ID":"bb6b061a-b0db-4b84-bfc7-08238f699132","Type":"ContainerDied","Data":"7309bc6a6f15fd15c242b030d405b5478267924b192042e934072170e618ae49"} Mar 13 15:16:34 crc kubenswrapper[4898]: I0313 15:16:34.640399 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556916-dvhfq" Mar 13 15:16:34 crc kubenswrapper[4898]: I0313 15:16:34.640770 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7309bc6a6f15fd15c242b030d405b5478267924b192042e934072170e618ae49" Mar 13 15:16:34 crc kubenswrapper[4898]: I0313 15:16:34.658974 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6895\" (UniqueName: \"kubernetes.io/projected/bb6b061a-b0db-4b84-bfc7-08238f699132-kube-api-access-d6895\") pod \"bb6b061a-b0db-4b84-bfc7-08238f699132\" (UID: \"bb6b061a-b0db-4b84-bfc7-08238f699132\") " Mar 13 15:16:34 crc kubenswrapper[4898]: I0313 15:16:34.684306 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb6b061a-b0db-4b84-bfc7-08238f699132-kube-api-access-d6895" (OuterVolumeSpecName: "kube-api-access-d6895") pod "bb6b061a-b0db-4b84-bfc7-08238f699132" (UID: "bb6b061a-b0db-4b84-bfc7-08238f699132"). InnerVolumeSpecName "kube-api-access-d6895". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:16:34 crc kubenswrapper[4898]: I0313 15:16:34.767320 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6895\" (UniqueName: \"kubernetes.io/projected/bb6b061a-b0db-4b84-bfc7-08238f699132-kube-api-access-d6895\") on node \"crc\" DevicePath \"\"" Mar 13 15:16:34 crc kubenswrapper[4898]: I0313 15:16:34.959546 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dnh48" podUID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" containerName="registry-server" probeResult="failure" output=< Mar 13 15:16:34 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:16:34 crc kubenswrapper[4898]: > Mar 13 15:16:35 crc kubenswrapper[4898]: I0313 15:16:35.408741 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7b756f97f-wjsf2" Mar 13 15:16:36 crc kubenswrapper[4898]: I0313 15:16:36.669920 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a9a7064c-4ed5-4948-9e7e-7d40794e371e","Type":"ContainerStarted","Data":"73adc53aef32c0108da19cf6e8c79fa9d99ea5ab3af2ff7e8bfab2d2ff3c28cc"} Mar 13 15:16:38 crc kubenswrapper[4898]: I0313 15:16:38.350872 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 15:16:38 crc kubenswrapper[4898]: I0313 15:16:38.618444 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 13 15:16:38 crc kubenswrapper[4898]: I0313 15:16:38.620155 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 13 15:16:39 crc kubenswrapper[4898]: I0313 15:16:39.608118 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" 
Mar 13 15:16:41 crc kubenswrapper[4898]: I0313 15:16:41.127457 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-jndb5" podUID="2157e8bf-88a5-4e48-b621-1744dcf0fcdb" containerName="registry-server" probeResult="failure" output=< Mar 13 15:16:41 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:16:41 crc kubenswrapper[4898]: > Mar 13 15:16:44 crc kubenswrapper[4898]: I0313 15:16:44.631763 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="a9a7064c-4ed5-4948-9e7e-7d40794e371e" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 15:16:45 crc kubenswrapper[4898]: I0313 15:16:45.372654 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dnh48" podUID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" containerName="registry-server" probeResult="failure" output=< Mar 13 15:16:45 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:16:45 crc kubenswrapper[4898]: > Mar 13 15:16:47 crc kubenswrapper[4898]: I0313 15:16:47.070583 4898 scope.go:117] "RemoveContainer" containerID="ef58acf8d4ae6204788d3fdc59aecb3462e5ceb48474e990c3e92ca73408a0f8" Mar 13 15:16:47 crc kubenswrapper[4898]: I0313 15:16:47.125632 4898 scope.go:117] "RemoveContainer" containerID="26a0efcf86b49360d3ca0f6db51f8be8241064695e81797fdbbea93b417ae346" Mar 13 15:16:47 crc kubenswrapper[4898]: I0313 15:16:47.213445 4898 scope.go:117] "RemoveContainer" containerID="1131b0202620c67ed5c6fc2a5f10369017ac4f0c291aa0a69276d20706cb8627" Mar 13 15:16:49 crc kubenswrapper[4898]: I0313 15:16:49.133861 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 13 15:16:49 crc kubenswrapper[4898]: I0313 15:16:49.134197 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:16:49 crc kubenswrapper[4898]: I0313 15:16:49.627647 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="a9a7064c-4ed5-4948-9e7e-7d40794e371e" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 15:16:50 crc kubenswrapper[4898]: I0313 15:16:50.292053 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 13 15:16:50 crc kubenswrapper[4898]: I0313 15:16:50.513553 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 13 15:16:50 crc kubenswrapper[4898]: I0313 15:16:50.754948 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 13 15:16:50 crc kubenswrapper[4898]: I0313 15:16:50.998059 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 13 15:16:51 crc kubenswrapper[4898]: I0313 15:16:51.185658 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-jndb5" podUID="2157e8bf-88a5-4e48-b621-1744dcf0fcdb" containerName="registry-server" probeResult="failure" output=< Mar 13 15:16:51 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:16:51 crc kubenswrapper[4898]: > Mar 13 15:16:54 crc kubenswrapper[4898]: I0313 15:16:54.680623 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/cinder-scheduler-0" Mar 13 15:16:54 crc kubenswrapper[4898]: I0313 15:16:54.805650 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-cf7c75c99-qxdbx" Mar 13 15:16:54 crc kubenswrapper[4898]: I0313 15:16:54.954070 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dnh48" podUID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" containerName="registry-server" probeResult="failure" output=< Mar 13 15:16:54 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:16:54 crc kubenswrapper[4898]: > Mar 13 15:16:55 crc kubenswrapper[4898]: I0313 15:16:55.077489 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556910-lcc5d"] Mar 13 15:16:55 crc kubenswrapper[4898]: I0313 15:16:55.091814 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556910-lcc5d"] Mar 13 15:16:55 crc kubenswrapper[4898]: I0313 15:16:55.760875 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a36f55a-ce22-4339-967f-906f473ddad5" path="/var/lib/kubelet/pods/0a36f55a-ce22-4339-967f-906f473ddad5/volumes" Mar 13 15:16:55 crc kubenswrapper[4898]: I0313 15:16:55.934728 4898 generic.go:334] "Generic (PLEG): container finished" podID="d19e8770-f0c1-491e-96c9-f737386ab3b0" containerID="f6562a9a91d72757d77dbf107969b6ab33c4a3a7219f9b57d0df0ee10184af60" exitCode=1 Mar 13 15:16:55 crc kubenswrapper[4898]: I0313 15:16:55.934932 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d19e8770-f0c1-491e-96c9-f737386ab3b0","Type":"ContainerDied","Data":"f6562a9a91d72757d77dbf107969b6ab33c4a3a7219f9b57d0df0ee10184af60"} Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.829728 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.853927 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d19e8770-f0c1-491e-96c9-f737386ab3b0-test-operator-ephemeral-workdir\") pod \"d19e8770-f0c1-491e-96c9-f737386ab3b0\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.853984 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d19e8770-f0c1-491e-96c9-f737386ab3b0-ssh-key\") pod \"d19e8770-f0c1-491e-96c9-f737386ab3b0\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.854050 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d19e8770-f0c1-491e-96c9-f737386ab3b0-ca-certs\") pod \"d19e8770-f0c1-491e-96c9-f737386ab3b0\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.854098 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"d19e8770-f0c1-491e-96c9-f737386ab3b0\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.854329 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqc45\" (UniqueName: \"kubernetes.io/projected/d19e8770-f0c1-491e-96c9-f737386ab3b0-kube-api-access-mqc45\") pod \"d19e8770-f0c1-491e-96c9-f737386ab3b0\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.854371 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/d19e8770-f0c1-491e-96c9-f737386ab3b0-config-data\") pod \"d19e8770-f0c1-491e-96c9-f737386ab3b0\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.854396 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d19e8770-f0c1-491e-96c9-f737386ab3b0-test-operator-ephemeral-temporary\") pod \"d19e8770-f0c1-491e-96c9-f737386ab3b0\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.854454 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d19e8770-f0c1-491e-96c9-f737386ab3b0-openstack-config\") pod \"d19e8770-f0c1-491e-96c9-f737386ab3b0\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.854508 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d19e8770-f0c1-491e-96c9-f737386ab3b0-openstack-config-secret\") pod \"d19e8770-f0c1-491e-96c9-f737386ab3b0\" (UID: \"d19e8770-f0c1-491e-96c9-f737386ab3b0\") " Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.858458 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d19e8770-f0c1-491e-96c9-f737386ab3b0-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "d19e8770-f0c1-491e-96c9-f737386ab3b0" (UID: "d19e8770-f0c1-491e-96c9-f737386ab3b0"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.858660 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d19e8770-f0c1-491e-96c9-f737386ab3b0-config-data" (OuterVolumeSpecName: "config-data") pod "d19e8770-f0c1-491e-96c9-f737386ab3b0" (UID: "d19e8770-f0c1-491e-96c9-f737386ab3b0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.865704 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d19e8770-f0c1-491e-96c9-f737386ab3b0-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "d19e8770-f0c1-491e-96c9-f737386ab3b0" (UID: "d19e8770-f0c1-491e-96c9-f737386ab3b0"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.874926 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "test-operator-logs") pod "d19e8770-f0c1-491e-96c9-f737386ab3b0" (UID: "d19e8770-f0c1-491e-96c9-f737386ab3b0"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.892681 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d19e8770-f0c1-491e-96c9-f737386ab3b0-kube-api-access-mqc45" (OuterVolumeSpecName: "kube-api-access-mqc45") pod "d19e8770-f0c1-491e-96c9-f737386ab3b0" (UID: "d19e8770-f0c1-491e-96c9-f737386ab3b0"). InnerVolumeSpecName "kube-api-access-mqc45". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.916947 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19e8770-f0c1-491e-96c9-f737386ab3b0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d19e8770-f0c1-491e-96c9-f737386ab3b0" (UID: "d19e8770-f0c1-491e-96c9-f737386ab3b0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.941654 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19e8770-f0c1-491e-96c9-f737386ab3b0-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "d19e8770-f0c1-491e-96c9-f737386ab3b0" (UID: "d19e8770-f0c1-491e-96c9-f737386ab3b0"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.963164 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d19e8770-f0c1-491e-96c9-f737386ab3b0","Type":"ContainerDied","Data":"afe41cfa21ca0ff15752a7557715bbecbe54855edbe2061b3b072582d6fab3b3"} Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.963204 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afe41cfa21ca0ff15752a7557715bbecbe54855edbe2061b3b072582d6fab3b3" Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.963246 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.994507 4898 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d19e8770-f0c1-491e-96c9-f737386ab3b0-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.994538 4898 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d19e8770-f0c1-491e-96c9-f737386ab3b0-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.994548 4898 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d19e8770-f0c1-491e-96c9-f737386ab3b0-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.999373 4898 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.999413 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqc45\" (UniqueName: \"kubernetes.io/projected/d19e8770-f0c1-491e-96c9-f737386ab3b0-kube-api-access-mqc45\") on node \"crc\" DevicePath \"\"" Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.999430 4898 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d19e8770-f0c1-491e-96c9-f737386ab3b0-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 15:16:57 crc kubenswrapper[4898]: I0313 15:16:57.999445 4898 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d19e8770-f0c1-491e-96c9-f737386ab3b0-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath 
\"\"" Mar 13 15:16:58 crc kubenswrapper[4898]: I0313 15:16:58.002238 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19e8770-f0c1-491e-96c9-f737386ab3b0-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "d19e8770-f0c1-491e-96c9-f737386ab3b0" (UID: "d19e8770-f0c1-491e-96c9-f737386ab3b0"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:16:58 crc kubenswrapper[4898]: I0313 15:16:58.021107 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d19e8770-f0c1-491e-96c9-f737386ab3b0-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "d19e8770-f0c1-491e-96c9-f737386ab3b0" (UID: "d19e8770-f0c1-491e-96c9-f737386ab3b0"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:16:58 crc kubenswrapper[4898]: I0313 15:16:58.034075 4898 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 13 15:16:58 crc kubenswrapper[4898]: I0313 15:16:58.102459 4898 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d19e8770-f0c1-491e-96c9-f737386ab3b0-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 13 15:16:58 crc kubenswrapper[4898]: I0313 15:16:58.102492 4898 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d19e8770-f0c1-491e-96c9-f737386ab3b0-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 13 15:16:58 crc kubenswrapper[4898]: I0313 15:16:58.102502 4898 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 13 15:17:00 crc kubenswrapper[4898]: I0313 15:17:00.156423 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/community-operators-jndb5" Mar 13 15:17:00 crc kubenswrapper[4898]: I0313 15:17:00.222227 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jndb5" Mar 13 15:17:00 crc kubenswrapper[4898]: I0313 15:17:00.377910 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 13 15:17:00 crc kubenswrapper[4898]: E0313 15:17:00.379839 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d19e8770-f0c1-491e-96c9-f737386ab3b0" containerName="tempest-tests-tempest-tests-runner" Mar 13 15:17:00 crc kubenswrapper[4898]: I0313 15:17:00.380491 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="d19e8770-f0c1-491e-96c9-f737386ab3b0" containerName="tempest-tests-tempest-tests-runner" Mar 13 15:17:00 crc kubenswrapper[4898]: E0313 15:17:00.380534 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb6b061a-b0db-4b84-bfc7-08238f699132" containerName="oc" Mar 13 15:17:00 crc kubenswrapper[4898]: I0313 15:17:00.380543 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb6b061a-b0db-4b84-bfc7-08238f699132" containerName="oc" Mar 13 15:17:00 crc kubenswrapper[4898]: I0313 15:17:00.383375 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="d19e8770-f0c1-491e-96c9-f737386ab3b0" containerName="tempest-tests-tempest-tests-runner" Mar 13 15:17:00 crc kubenswrapper[4898]: I0313 15:17:00.383426 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb6b061a-b0db-4b84-bfc7-08238f699132" containerName="oc" Mar 13 15:17:00 crc kubenswrapper[4898]: I0313 15:17:00.387821 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 13 15:17:00 crc kubenswrapper[4898]: I0313 15:17:00.406403 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jndb5"] Mar 13 15:17:00 crc kubenswrapper[4898]: I0313 15:17:00.448575 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-8v5gs" Mar 13 15:17:00 crc kubenswrapper[4898]: I0313 15:17:00.461637 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8t4g\" (UniqueName: \"kubernetes.io/projected/382b2b09-8110-411f-9d86-53e73df67fe6-kube-api-access-t8t4g\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"382b2b09-8110-411f-9d86-53e73df67fe6\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 13 15:17:00 crc kubenswrapper[4898]: I0313 15:17:00.462155 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"382b2b09-8110-411f-9d86-53e73df67fe6\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 13 15:17:00 crc kubenswrapper[4898]: I0313 15:17:00.484860 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 13 15:17:00 crc kubenswrapper[4898]: I0313 15:17:00.565106 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8t4g\" (UniqueName: \"kubernetes.io/projected/382b2b09-8110-411f-9d86-53e73df67fe6-kube-api-access-t8t4g\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"382b2b09-8110-411f-9d86-53e73df67fe6\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 13 15:17:00 crc 
kubenswrapper[4898]: I0313 15:17:00.565265 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"382b2b09-8110-411f-9d86-53e73df67fe6\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 13 15:17:00 crc kubenswrapper[4898]: I0313 15:17:00.567158 4898 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"382b2b09-8110-411f-9d86-53e73df67fe6\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 13 15:17:00 crc kubenswrapper[4898]: I0313 15:17:00.614034 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"382b2b09-8110-411f-9d86-53e73df67fe6\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 13 15:17:00 crc kubenswrapper[4898]: I0313 15:17:00.614779 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8t4g\" (UniqueName: \"kubernetes.io/projected/382b2b09-8110-411f-9d86-53e73df67fe6-kube-api-access-t8t4g\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"382b2b09-8110-411f-9d86-53e73df67fe6\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 13 15:17:00 crc kubenswrapper[4898]: I0313 15:17:00.756323 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 13 15:17:02 crc kubenswrapper[4898]: I0313 15:17:02.019459 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jndb5" podUID="2157e8bf-88a5-4e48-b621-1744dcf0fcdb" containerName="registry-server" containerID="cri-o://d653e06a922f0f9a98f01d37cd3c761f9af6e7d070c7637908c781d22c20c1cd" gracePeriod=2 Mar 13 15:17:02 crc kubenswrapper[4898]: I0313 15:17:02.245134 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 13 15:17:02 crc kubenswrapper[4898]: I0313 15:17:02.905686 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jndb5" Mar 13 15:17:02 crc kubenswrapper[4898]: I0313 15:17:02.931225 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2157e8bf-88a5-4e48-b621-1744dcf0fcdb-catalog-content\") pod \"2157e8bf-88a5-4e48-b621-1744dcf0fcdb\" (UID: \"2157e8bf-88a5-4e48-b621-1744dcf0fcdb\") " Mar 13 15:17:02 crc kubenswrapper[4898]: I0313 15:17:02.931522 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2157e8bf-88a5-4e48-b621-1744dcf0fcdb-utilities\") pod \"2157e8bf-88a5-4e48-b621-1744dcf0fcdb\" (UID: \"2157e8bf-88a5-4e48-b621-1744dcf0fcdb\") " Mar 13 15:17:02 crc kubenswrapper[4898]: I0313 15:17:02.931553 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jdgf\" (UniqueName: \"kubernetes.io/projected/2157e8bf-88a5-4e48-b621-1744dcf0fcdb-kube-api-access-9jdgf\") pod \"2157e8bf-88a5-4e48-b621-1744dcf0fcdb\" (UID: \"2157e8bf-88a5-4e48-b621-1744dcf0fcdb\") " Mar 13 15:17:02 crc kubenswrapper[4898]: I0313 15:17:02.932179 4898 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2157e8bf-88a5-4e48-b621-1744dcf0fcdb-utilities" (OuterVolumeSpecName: "utilities") pod "2157e8bf-88a5-4e48-b621-1744dcf0fcdb" (UID: "2157e8bf-88a5-4e48-b621-1744dcf0fcdb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:17:02 crc kubenswrapper[4898]: I0313 15:17:02.938886 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2157e8bf-88a5-4e48-b621-1744dcf0fcdb-kube-api-access-9jdgf" (OuterVolumeSpecName: "kube-api-access-9jdgf") pod "2157e8bf-88a5-4e48-b621-1744dcf0fcdb" (UID: "2157e8bf-88a5-4e48-b621-1744dcf0fcdb"). InnerVolumeSpecName "kube-api-access-9jdgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:17:03 crc kubenswrapper[4898]: I0313 15:17:03.033891 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2157e8bf-88a5-4e48-b621-1744dcf0fcdb-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 15:17:03 crc kubenswrapper[4898]: I0313 15:17:03.033941 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jdgf\" (UniqueName: \"kubernetes.io/projected/2157e8bf-88a5-4e48-b621-1744dcf0fcdb-kube-api-access-9jdgf\") on node \"crc\" DevicePath \"\"" Mar 13 15:17:03 crc kubenswrapper[4898]: I0313 15:17:03.035887 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"382b2b09-8110-411f-9d86-53e73df67fe6","Type":"ContainerStarted","Data":"0957c13dd699676e302e317d3b318e4868bbf1a229575490427f1b196e9804a6"} Mar 13 15:17:03 crc kubenswrapper[4898]: I0313 15:17:03.038974 4898 generic.go:334] "Generic (PLEG): container finished" podID="2157e8bf-88a5-4e48-b621-1744dcf0fcdb" containerID="d653e06a922f0f9a98f01d37cd3c761f9af6e7d070c7637908c781d22c20c1cd" exitCode=0 Mar 13 15:17:03 crc kubenswrapper[4898]: 
I0313 15:17:03.039010 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jndb5" event={"ID":"2157e8bf-88a5-4e48-b621-1744dcf0fcdb","Type":"ContainerDied","Data":"d653e06a922f0f9a98f01d37cd3c761f9af6e7d070c7637908c781d22c20c1cd"} Mar 13 15:17:03 crc kubenswrapper[4898]: I0313 15:17:03.039036 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jndb5" event={"ID":"2157e8bf-88a5-4e48-b621-1744dcf0fcdb","Type":"ContainerDied","Data":"ad1f1863b9e188c98c1dfb8a88d0df8d02f680dd682644703708970a9e0dc172"} Mar 13 15:17:03 crc kubenswrapper[4898]: I0313 15:17:03.039055 4898 scope.go:117] "RemoveContainer" containerID="d653e06a922f0f9a98f01d37cd3c761f9af6e7d070c7637908c781d22c20c1cd" Mar 13 15:17:03 crc kubenswrapper[4898]: I0313 15:17:03.039153 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jndb5" Mar 13 15:17:03 crc kubenswrapper[4898]: I0313 15:17:03.066597 4898 scope.go:117] "RemoveContainer" containerID="3efffca3639dac775d206dcb096dfcdfa29405be3f4a7da8f99d544be88ffc43" Mar 13 15:17:03 crc kubenswrapper[4898]: I0313 15:17:03.077358 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2157e8bf-88a5-4e48-b621-1744dcf0fcdb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2157e8bf-88a5-4e48-b621-1744dcf0fcdb" (UID: "2157e8bf-88a5-4e48-b621-1744dcf0fcdb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:17:03 crc kubenswrapper[4898]: I0313 15:17:03.104316 4898 scope.go:117] "RemoveContainer" containerID="8fdeec86e0943d0ef27683a7197e100bc92b99eb865ac5c8b1d099f233220e22" Mar 13 15:17:03 crc kubenswrapper[4898]: I0313 15:17:03.136664 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2157e8bf-88a5-4e48-b621-1744dcf0fcdb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 15:17:03 crc kubenswrapper[4898]: I0313 15:17:03.141984 4898 scope.go:117] "RemoveContainer" containerID="d653e06a922f0f9a98f01d37cd3c761f9af6e7d070c7637908c781d22c20c1cd" Mar 13 15:17:03 crc kubenswrapper[4898]: E0313 15:17:03.142467 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d653e06a922f0f9a98f01d37cd3c761f9af6e7d070c7637908c781d22c20c1cd\": container with ID starting with d653e06a922f0f9a98f01d37cd3c761f9af6e7d070c7637908c781d22c20c1cd not found: ID does not exist" containerID="d653e06a922f0f9a98f01d37cd3c761f9af6e7d070c7637908c781d22c20c1cd" Mar 13 15:17:03 crc kubenswrapper[4898]: I0313 15:17:03.142504 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d653e06a922f0f9a98f01d37cd3c761f9af6e7d070c7637908c781d22c20c1cd"} err="failed to get container status \"d653e06a922f0f9a98f01d37cd3c761f9af6e7d070c7637908c781d22c20c1cd\": rpc error: code = NotFound desc = could not find container \"d653e06a922f0f9a98f01d37cd3c761f9af6e7d070c7637908c781d22c20c1cd\": container with ID starting with d653e06a922f0f9a98f01d37cd3c761f9af6e7d070c7637908c781d22c20c1cd not found: ID does not exist" Mar 13 15:17:03 crc kubenswrapper[4898]: I0313 15:17:03.142529 4898 scope.go:117] "RemoveContainer" containerID="3efffca3639dac775d206dcb096dfcdfa29405be3f4a7da8f99d544be88ffc43" Mar 13 15:17:03 crc kubenswrapper[4898]: E0313 15:17:03.142840 4898 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3efffca3639dac775d206dcb096dfcdfa29405be3f4a7da8f99d544be88ffc43\": container with ID starting with 3efffca3639dac775d206dcb096dfcdfa29405be3f4a7da8f99d544be88ffc43 not found: ID does not exist" containerID="3efffca3639dac775d206dcb096dfcdfa29405be3f4a7da8f99d544be88ffc43" Mar 13 15:17:03 crc kubenswrapper[4898]: I0313 15:17:03.142879 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3efffca3639dac775d206dcb096dfcdfa29405be3f4a7da8f99d544be88ffc43"} err="failed to get container status \"3efffca3639dac775d206dcb096dfcdfa29405be3f4a7da8f99d544be88ffc43\": rpc error: code = NotFound desc = could not find container \"3efffca3639dac775d206dcb096dfcdfa29405be3f4a7da8f99d544be88ffc43\": container with ID starting with 3efffca3639dac775d206dcb096dfcdfa29405be3f4a7da8f99d544be88ffc43 not found: ID does not exist" Mar 13 15:17:03 crc kubenswrapper[4898]: I0313 15:17:03.142921 4898 scope.go:117] "RemoveContainer" containerID="8fdeec86e0943d0ef27683a7197e100bc92b99eb865ac5c8b1d099f233220e22" Mar 13 15:17:03 crc kubenswrapper[4898]: E0313 15:17:03.143160 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fdeec86e0943d0ef27683a7197e100bc92b99eb865ac5c8b1d099f233220e22\": container with ID starting with 8fdeec86e0943d0ef27683a7197e100bc92b99eb865ac5c8b1d099f233220e22 not found: ID does not exist" containerID="8fdeec86e0943d0ef27683a7197e100bc92b99eb865ac5c8b1d099f233220e22" Mar 13 15:17:03 crc kubenswrapper[4898]: I0313 15:17:03.143182 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fdeec86e0943d0ef27683a7197e100bc92b99eb865ac5c8b1d099f233220e22"} err="failed to get container status \"8fdeec86e0943d0ef27683a7197e100bc92b99eb865ac5c8b1d099f233220e22\": rpc error: code = NotFound desc = could 
not find container \"8fdeec86e0943d0ef27683a7197e100bc92b99eb865ac5c8b1d099f233220e22\": container with ID starting with 8fdeec86e0943d0ef27683a7197e100bc92b99eb865ac5c8b1d099f233220e22 not found: ID does not exist" Mar 13 15:17:03 crc kubenswrapper[4898]: I0313 15:17:03.401978 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jndb5"] Mar 13 15:17:03 crc kubenswrapper[4898]: I0313 15:17:03.411889 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jndb5"] Mar 13 15:17:03 crc kubenswrapper[4898]: I0313 15:17:03.757457 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2157e8bf-88a5-4e48-b621-1744dcf0fcdb" path="/var/lib/kubelet/pods/2157e8bf-88a5-4e48-b621-1744dcf0fcdb/volumes" Mar 13 15:17:04 crc kubenswrapper[4898]: I0313 15:17:04.939728 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dnh48" podUID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" containerName="registry-server" probeResult="failure" output=< Mar 13 15:17:04 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:17:04 crc kubenswrapper[4898]: > Mar 13 15:17:09 crc kubenswrapper[4898]: I0313 15:17:09.127044 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"382b2b09-8110-411f-9d86-53e73df67fe6","Type":"ContainerStarted","Data":"9cb6f19deab8d2857d19039413245405ed7b87fda2a496b02c59f548e5f3e635"} Mar 13 15:17:09 crc kubenswrapper[4898]: I0313 15:17:09.144661 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=4.326225526 podStartE2EDuration="9.144642843s" podCreationTimestamp="2026-03-13 15:17:00 +0000 UTC" firstStartedPulling="2026-03-13 15:17:02.277809149 +0000 UTC m=+4857.279397388" 
lastFinishedPulling="2026-03-13 15:17:07.096226466 +0000 UTC m=+4862.097814705" observedRunningTime="2026-03-13 15:17:09.1409627 +0000 UTC m=+4864.142550959" watchObservedRunningTime="2026-03-13 15:17:09.144642843 +0000 UTC m=+4864.146231082" Mar 13 15:17:14 crc kubenswrapper[4898]: I0313 15:17:14.942921 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dnh48" podUID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" containerName="registry-server" probeResult="failure" output=< Mar 13 15:17:14 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:17:14 crc kubenswrapper[4898]: > Mar 13 15:17:19 crc kubenswrapper[4898]: I0313 15:17:19.133876 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 15:17:19 crc kubenswrapper[4898]: I0313 15:17:19.134496 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:17:19 crc kubenswrapper[4898]: I0313 15:17:19.134542 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 15:17:19 crc kubenswrapper[4898]: I0313 15:17:19.135544 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bf9fa5bb76f8bd5a010d026caf62189a87b342669ddb0345c62f785750fd30c1"} pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" containerMessage="Container 
machine-config-daemon failed liveness probe, will be restarted" Mar 13 15:17:19 crc kubenswrapper[4898]: I0313 15:17:19.135592 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" containerID="cri-o://bf9fa5bb76f8bd5a010d026caf62189a87b342669ddb0345c62f785750fd30c1" gracePeriod=600 Mar 13 15:17:20 crc kubenswrapper[4898]: I0313 15:17:20.249745 4898 generic.go:334] "Generic (PLEG): container finished" podID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerID="bf9fa5bb76f8bd5a010d026caf62189a87b342669ddb0345c62f785750fd30c1" exitCode=0 Mar 13 15:17:20 crc kubenswrapper[4898]: I0313 15:17:20.249825 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerDied","Data":"bf9fa5bb76f8bd5a010d026caf62189a87b342669ddb0345c62f785750fd30c1"} Mar 13 15:17:20 crc kubenswrapper[4898]: I0313 15:17:20.250403 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668"} Mar 13 15:17:20 crc kubenswrapper[4898]: I0313 15:17:20.250429 4898 scope.go:117] "RemoveContainer" containerID="23592f71b1b6a0588b20dac6d06bd2510a7ea0a477247e02fd4eeae75f83ee8a" Mar 13 15:17:24 crc kubenswrapper[4898]: I0313 15:17:24.948303 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dnh48" podUID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" containerName="registry-server" probeResult="failure" output=< Mar 13 15:17:24 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:17:24 crc kubenswrapper[4898]: > Mar 13 15:17:34 
crc kubenswrapper[4898]: I0313 15:17:34.935302 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dnh48" podUID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" containerName="registry-server" probeResult="failure" output=< Mar 13 15:17:34 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:17:34 crc kubenswrapper[4898]: > Mar 13 15:17:40 crc kubenswrapper[4898]: I0313 15:17:40.832462 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f" containerName="galera" probeResult="failure" output="command timed out" Mar 13 15:17:40 crc kubenswrapper[4898]: I0313 15:17:40.833192 4898 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f" containerName="galera" probeResult="failure" output="command timed out" Mar 13 15:17:44 crc kubenswrapper[4898]: I0313 15:17:44.934109 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dnh48" podUID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" containerName="registry-server" probeResult="failure" output=< Mar 13 15:17:44 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:17:44 crc kubenswrapper[4898]: > Mar 13 15:17:47 crc kubenswrapper[4898]: I0313 15:17:47.565664 4898 scope.go:117] "RemoveContainer" containerID="ba349dae26dd37d5b178e79f9ed4076346dab25c90569fb520b86b35b588e387" Mar 13 15:17:49 crc kubenswrapper[4898]: I0313 15:17:49.669051 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b6mrl/must-gather-cklv9"] Mar 13 15:17:49 crc kubenswrapper[4898]: E0313 15:17:49.673209 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2157e8bf-88a5-4e48-b621-1744dcf0fcdb" containerName="extract-content" Mar 13 15:17:49 crc kubenswrapper[4898]: I0313 15:17:49.673236 4898 
state_mem.go:107] "Deleted CPUSet assignment" podUID="2157e8bf-88a5-4e48-b621-1744dcf0fcdb" containerName="extract-content" Mar 13 15:17:49 crc kubenswrapper[4898]: E0313 15:17:49.673280 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2157e8bf-88a5-4e48-b621-1744dcf0fcdb" containerName="extract-utilities" Mar 13 15:17:49 crc kubenswrapper[4898]: I0313 15:17:49.673287 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="2157e8bf-88a5-4e48-b621-1744dcf0fcdb" containerName="extract-utilities" Mar 13 15:17:49 crc kubenswrapper[4898]: E0313 15:17:49.673304 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2157e8bf-88a5-4e48-b621-1744dcf0fcdb" containerName="registry-server" Mar 13 15:17:49 crc kubenswrapper[4898]: I0313 15:17:49.673311 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="2157e8bf-88a5-4e48-b621-1744dcf0fcdb" containerName="registry-server" Mar 13 15:17:49 crc kubenswrapper[4898]: I0313 15:17:49.673708 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="2157e8bf-88a5-4e48-b621-1744dcf0fcdb" containerName="registry-server" Mar 13 15:17:49 crc kubenswrapper[4898]: I0313 15:17:49.675040 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b6mrl/must-gather-cklv9" Mar 13 15:17:49 crc kubenswrapper[4898]: I0313 15:17:49.677273 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-b6mrl"/"default-dockercfg-bppfx" Mar 13 15:17:49 crc kubenswrapper[4898]: I0313 15:17:49.683706 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-b6mrl"/"openshift-service-ca.crt" Mar 13 15:17:49 crc kubenswrapper[4898]: I0313 15:17:49.686585 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-b6mrl"/"kube-root-ca.crt" Mar 13 15:17:49 crc kubenswrapper[4898]: I0313 15:17:49.697658 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-b6mrl/must-gather-cklv9"] Mar 13 15:17:49 crc kubenswrapper[4898]: I0313 15:17:49.761148 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfhwg\" (UniqueName: \"kubernetes.io/projected/9ef69d80-7edf-459b-a521-b45bc90a18df-kube-api-access-cfhwg\") pod \"must-gather-cklv9\" (UID: \"9ef69d80-7edf-459b-a521-b45bc90a18df\") " pod="openshift-must-gather-b6mrl/must-gather-cklv9" Mar 13 15:17:49 crc kubenswrapper[4898]: I0313 15:17:49.761259 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9ef69d80-7edf-459b-a521-b45bc90a18df-must-gather-output\") pod \"must-gather-cklv9\" (UID: \"9ef69d80-7edf-459b-a521-b45bc90a18df\") " pod="openshift-must-gather-b6mrl/must-gather-cklv9" Mar 13 15:17:49 crc kubenswrapper[4898]: I0313 15:17:49.863600 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfhwg\" (UniqueName: \"kubernetes.io/projected/9ef69d80-7edf-459b-a521-b45bc90a18df-kube-api-access-cfhwg\") pod \"must-gather-cklv9\" (UID: \"9ef69d80-7edf-459b-a521-b45bc90a18df\") " 
pod="openshift-must-gather-b6mrl/must-gather-cklv9" Mar 13 15:17:49 crc kubenswrapper[4898]: I0313 15:17:49.863718 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9ef69d80-7edf-459b-a521-b45bc90a18df-must-gather-output\") pod \"must-gather-cklv9\" (UID: \"9ef69d80-7edf-459b-a521-b45bc90a18df\") " pod="openshift-must-gather-b6mrl/must-gather-cklv9" Mar 13 15:17:49 crc kubenswrapper[4898]: I0313 15:17:49.864840 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9ef69d80-7edf-459b-a521-b45bc90a18df-must-gather-output\") pod \"must-gather-cklv9\" (UID: \"9ef69d80-7edf-459b-a521-b45bc90a18df\") " pod="openshift-must-gather-b6mrl/must-gather-cklv9" Mar 13 15:17:49 crc kubenswrapper[4898]: I0313 15:17:49.892067 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfhwg\" (UniqueName: \"kubernetes.io/projected/9ef69d80-7edf-459b-a521-b45bc90a18df-kube-api-access-cfhwg\") pod \"must-gather-cklv9\" (UID: \"9ef69d80-7edf-459b-a521-b45bc90a18df\") " pod="openshift-must-gather-b6mrl/must-gather-cklv9" Mar 13 15:17:50 crc kubenswrapper[4898]: I0313 15:17:50.008536 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b6mrl/must-gather-cklv9" Mar 13 15:17:51 crc kubenswrapper[4898]: I0313 15:17:51.041036 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-b6mrl/must-gather-cklv9"] Mar 13 15:17:51 crc kubenswrapper[4898]: I0313 15:17:51.644287 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b6mrl/must-gather-cklv9" event={"ID":"9ef69d80-7edf-459b-a521-b45bc90a18df","Type":"ContainerStarted","Data":"197380797ad59a26356f5d68d34c739e92ec17b270095e2411ef7c9eee557eaa"} Mar 13 15:17:54 crc kubenswrapper[4898]: I0313 15:17:54.034007 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dnh48" Mar 13 15:17:54 crc kubenswrapper[4898]: I0313 15:17:54.103444 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dnh48" Mar 13 15:17:54 crc kubenswrapper[4898]: I0313 15:17:54.286878 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dnh48"] Mar 13 15:17:55 crc kubenswrapper[4898]: I0313 15:17:55.695564 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dnh48" podUID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" containerName="registry-server" containerID="cri-o://e863d80604f47eb74bf43cf0b02ad573ba531995e209d5a2d54eb9c4f9740dde" gracePeriod=2 Mar 13 15:17:56 crc kubenswrapper[4898]: I0313 15:17:56.720009 4898 generic.go:334] "Generic (PLEG): container finished" podID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" containerID="e863d80604f47eb74bf43cf0b02ad573ba531995e209d5a2d54eb9c4f9740dde" exitCode=0 Mar 13 15:17:56 crc kubenswrapper[4898]: I0313 15:17:56.720077 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnh48" 
event={"ID":"cb46c8b0-a6a9-4b6d-86a1-8408793887e5","Type":"ContainerDied","Data":"e863d80604f47eb74bf43cf0b02ad573ba531995e209d5a2d54eb9c4f9740dde"} Mar 13 15:18:00 crc kubenswrapper[4898]: I0313 15:18:00.221533 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556918-gcw8c"] Mar 13 15:18:00 crc kubenswrapper[4898]: I0313 15:18:00.225175 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556918-gcw8c" Mar 13 15:18:00 crc kubenswrapper[4898]: I0313 15:18:00.228982 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 15:18:00 crc kubenswrapper[4898]: I0313 15:18:00.229455 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:18:00 crc kubenswrapper[4898]: I0313 15:18:00.229599 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:18:00 crc kubenswrapper[4898]: I0313 15:18:00.241547 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556918-gcw8c"] Mar 13 15:18:00 crc kubenswrapper[4898]: I0313 15:18:00.393257 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr25w\" (UniqueName: \"kubernetes.io/projected/85e19347-9341-49c0-9195-97e383796cb3-kube-api-access-lr25w\") pod \"auto-csr-approver-29556918-gcw8c\" (UID: \"85e19347-9341-49c0-9195-97e383796cb3\") " pod="openshift-infra/auto-csr-approver-29556918-gcw8c" Mar 13 15:18:00 crc kubenswrapper[4898]: I0313 15:18:00.496108 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr25w\" (UniqueName: \"kubernetes.io/projected/85e19347-9341-49c0-9195-97e383796cb3-kube-api-access-lr25w\") pod \"auto-csr-approver-29556918-gcw8c\" (UID: 
\"85e19347-9341-49c0-9195-97e383796cb3\") " pod="openshift-infra/auto-csr-approver-29556918-gcw8c" Mar 13 15:18:00 crc kubenswrapper[4898]: I0313 15:18:00.522045 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr25w\" (UniqueName: \"kubernetes.io/projected/85e19347-9341-49c0-9195-97e383796cb3-kube-api-access-lr25w\") pod \"auto-csr-approver-29556918-gcw8c\" (UID: \"85e19347-9341-49c0-9195-97e383796cb3\") " pod="openshift-infra/auto-csr-approver-29556918-gcw8c" Mar 13 15:18:00 crc kubenswrapper[4898]: I0313 15:18:00.550573 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556918-gcw8c" Mar 13 15:18:01 crc kubenswrapper[4898]: I0313 15:18:01.624655 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dnh48" Mar 13 15:18:01 crc kubenswrapper[4898]: W0313 15:18:01.703235 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85e19347_9341_49c0_9195_97e383796cb3.slice/crio-6cd199ad916b31e7f29a9349f409a7ea25e42056bbd3d20e1aa0b88f98b05ced WatchSource:0}: Error finding container 6cd199ad916b31e7f29a9349f409a7ea25e42056bbd3d20e1aa0b88f98b05ced: Status 404 returned error can't find the container with id 6cd199ad916b31e7f29a9349f409a7ea25e42056bbd3d20e1aa0b88f98b05ced Mar 13 15:18:01 crc kubenswrapper[4898]: I0313 15:18:01.723404 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnffg\" (UniqueName: \"kubernetes.io/projected/cb46c8b0-a6a9-4b6d-86a1-8408793887e5-kube-api-access-mnffg\") pod \"cb46c8b0-a6a9-4b6d-86a1-8408793887e5\" (UID: \"cb46c8b0-a6a9-4b6d-86a1-8408793887e5\") " Mar 13 15:18:01 crc kubenswrapper[4898]: I0313 15:18:01.724215 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/cb46c8b0-a6a9-4b6d-86a1-8408793887e5-utilities\") pod \"cb46c8b0-a6a9-4b6d-86a1-8408793887e5\" (UID: \"cb46c8b0-a6a9-4b6d-86a1-8408793887e5\") " Mar 13 15:18:01 crc kubenswrapper[4898]: I0313 15:18:01.724261 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb46c8b0-a6a9-4b6d-86a1-8408793887e5-catalog-content\") pod \"cb46c8b0-a6a9-4b6d-86a1-8408793887e5\" (UID: \"cb46c8b0-a6a9-4b6d-86a1-8408793887e5\") " Mar 13 15:18:01 crc kubenswrapper[4898]: I0313 15:18:01.724881 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb46c8b0-a6a9-4b6d-86a1-8408793887e5-utilities" (OuterVolumeSpecName: "utilities") pod "cb46c8b0-a6a9-4b6d-86a1-8408793887e5" (UID: "cb46c8b0-a6a9-4b6d-86a1-8408793887e5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:18:01 crc kubenswrapper[4898]: I0313 15:18:01.725254 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb46c8b0-a6a9-4b6d-86a1-8408793887e5-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 15:18:01 crc kubenswrapper[4898]: I0313 15:18:01.730692 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556918-gcw8c"] Mar 13 15:18:01 crc kubenswrapper[4898]: I0313 15:18:01.731257 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb46c8b0-a6a9-4b6d-86a1-8408793887e5-kube-api-access-mnffg" (OuterVolumeSpecName: "kube-api-access-mnffg") pod "cb46c8b0-a6a9-4b6d-86a1-8408793887e5" (UID: "cb46c8b0-a6a9-4b6d-86a1-8408793887e5"). InnerVolumeSpecName "kube-api-access-mnffg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:18:01 crc kubenswrapper[4898]: I0313 15:18:01.817513 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b6mrl/must-gather-cklv9" event={"ID":"9ef69d80-7edf-459b-a521-b45bc90a18df","Type":"ContainerStarted","Data":"0cf38cc33a2de3371ea65a40a67cc8243723533a2159a61bfc4ad7a679d1eafb"} Mar 13 15:18:01 crc kubenswrapper[4898]: I0313 15:18:01.817670 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b6mrl/must-gather-cklv9" event={"ID":"9ef69d80-7edf-459b-a521-b45bc90a18df","Type":"ContainerStarted","Data":"2ef5e9977f8ef2b175ed18b48018fc8614ba4246f0d291287b1a2a287a12ca83"} Mar 13 15:18:01 crc kubenswrapper[4898]: I0313 15:18:01.828275 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dnh48" Mar 13 15:18:01 crc kubenswrapper[4898]: I0313 15:18:01.828210 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dnh48" event={"ID":"cb46c8b0-a6a9-4b6d-86a1-8408793887e5","Type":"ContainerDied","Data":"ef80ef6757d050f773b7a3c8ba863e9ba495da2f0964e3cb0f243834ede62c6d"} Mar 13 15:18:01 crc kubenswrapper[4898]: I0313 15:18:01.832451 4898 scope.go:117] "RemoveContainer" containerID="e863d80604f47eb74bf43cf0b02ad573ba531995e209d5a2d54eb9c4f9740dde" Mar 13 15:18:01 crc kubenswrapper[4898]: I0313 15:18:01.833313 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-b6mrl/must-gather-cklv9" podStartSLOduration=2.805858121 podStartE2EDuration="12.833296111s" podCreationTimestamp="2026-03-13 15:17:49 +0000 UTC" firstStartedPulling="2026-03-13 15:17:51.084394649 +0000 UTC m=+4906.085982888" lastFinishedPulling="2026-03-13 15:18:01.111832629 +0000 UTC m=+4916.113420878" observedRunningTime="2026-03-13 15:18:01.821687217 +0000 UTC m=+4916.823275466" watchObservedRunningTime="2026-03-13 15:18:01.833296111 
+0000 UTC m=+4916.834884350" Mar 13 15:18:01 crc kubenswrapper[4898]: I0313 15:18:01.835286 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556918-gcw8c" event={"ID":"85e19347-9341-49c0-9195-97e383796cb3","Type":"ContainerStarted","Data":"6cd199ad916b31e7f29a9349f409a7ea25e42056bbd3d20e1aa0b88f98b05ced"} Mar 13 15:18:01 crc kubenswrapper[4898]: I0313 15:18:01.842236 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnffg\" (UniqueName: \"kubernetes.io/projected/cb46c8b0-a6a9-4b6d-86a1-8408793887e5-kube-api-access-mnffg\") on node \"crc\" DevicePath \"\"" Mar 13 15:18:01 crc kubenswrapper[4898]: I0313 15:18:01.865228 4898 scope.go:117] "RemoveContainer" containerID="aebe307d7887ba7796f4fad329402cc88b0d4332f82fab0c321325f23bf1adea" Mar 13 15:18:01 crc kubenswrapper[4898]: I0313 15:18:01.893288 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb46c8b0-a6a9-4b6d-86a1-8408793887e5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cb46c8b0-a6a9-4b6d-86a1-8408793887e5" (UID: "cb46c8b0-a6a9-4b6d-86a1-8408793887e5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:18:01 crc kubenswrapper[4898]: I0313 15:18:01.900128 4898 scope.go:117] "RemoveContainer" containerID="044e1dbede2644956e1b9f4f9606342f16b9eba6e865673613710b9380c20e93" Mar 13 15:18:01 crc kubenswrapper[4898]: I0313 15:18:01.950186 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb46c8b0-a6a9-4b6d-86a1-8408793887e5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 15:18:02 crc kubenswrapper[4898]: I0313 15:18:02.171207 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dnh48"] Mar 13 15:18:02 crc kubenswrapper[4898]: I0313 15:18:02.181736 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dnh48"] Mar 13 15:18:03 crc kubenswrapper[4898]: I0313 15:18:03.757011 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" path="/var/lib/kubelet/pods/cb46c8b0-a6a9-4b6d-86a1-8408793887e5/volumes" Mar 13 15:18:04 crc kubenswrapper[4898]: I0313 15:18:04.872867 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556918-gcw8c" event={"ID":"85e19347-9341-49c0-9195-97e383796cb3","Type":"ContainerStarted","Data":"41c4a045c361afa3d6c1c9f58f41c2c132e20bbf6ef35d1cfbb029e2412c57e3"} Mar 13 15:18:04 crc kubenswrapper[4898]: I0313 15:18:04.895785 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556918-gcw8c" podStartSLOduration=4.092667867 podStartE2EDuration="4.895763725s" podCreationTimestamp="2026-03-13 15:18:00 +0000 UTC" firstStartedPulling="2026-03-13 15:18:01.706413295 +0000 UTC m=+4916.708001534" lastFinishedPulling="2026-03-13 15:18:02.509509153 +0000 UTC m=+4917.511097392" observedRunningTime="2026-03-13 15:18:04.886071184 +0000 UTC m=+4919.887659443" 
watchObservedRunningTime="2026-03-13 15:18:04.895763725 +0000 UTC m=+4919.897351974" Mar 13 15:18:05 crc kubenswrapper[4898]: I0313 15:18:05.888943 4898 generic.go:334] "Generic (PLEG): container finished" podID="85e19347-9341-49c0-9195-97e383796cb3" containerID="41c4a045c361afa3d6c1c9f58f41c2c132e20bbf6ef35d1cfbb029e2412c57e3" exitCode=0 Mar 13 15:18:05 crc kubenswrapper[4898]: I0313 15:18:05.889031 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556918-gcw8c" event={"ID":"85e19347-9341-49c0-9195-97e383796cb3","Type":"ContainerDied","Data":"41c4a045c361afa3d6c1c9f58f41c2c132e20bbf6ef35d1cfbb029e2412c57e3"} Mar 13 15:18:07 crc kubenswrapper[4898]: E0313 15:18:07.287520 4898 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.201:53544->38.102.83.201:43395: write tcp 38.102.83.201:53544->38.102.83.201:43395: write: broken pipe Mar 13 15:18:07 crc kubenswrapper[4898]: I0313 15:18:07.329531 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556918-gcw8c" Mar 13 15:18:07 crc kubenswrapper[4898]: I0313 15:18:07.352139 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lr25w\" (UniqueName: \"kubernetes.io/projected/85e19347-9341-49c0-9195-97e383796cb3-kube-api-access-lr25w\") pod \"85e19347-9341-49c0-9195-97e383796cb3\" (UID: \"85e19347-9341-49c0-9195-97e383796cb3\") " Mar 13 15:18:07 crc kubenswrapper[4898]: I0313 15:18:07.358393 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85e19347-9341-49c0-9195-97e383796cb3-kube-api-access-lr25w" (OuterVolumeSpecName: "kube-api-access-lr25w") pod "85e19347-9341-49c0-9195-97e383796cb3" (UID: "85e19347-9341-49c0-9195-97e383796cb3"). InnerVolumeSpecName "kube-api-access-lr25w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:18:07 crc kubenswrapper[4898]: I0313 15:18:07.455371 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lr25w\" (UniqueName: \"kubernetes.io/projected/85e19347-9341-49c0-9195-97e383796cb3-kube-api-access-lr25w\") on node \"crc\" DevicePath \"\"" Mar 13 15:18:07 crc kubenswrapper[4898]: I0313 15:18:07.914844 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556918-gcw8c" event={"ID":"85e19347-9341-49c0-9195-97e383796cb3","Type":"ContainerDied","Data":"6cd199ad916b31e7f29a9349f409a7ea25e42056bbd3d20e1aa0b88f98b05ced"} Mar 13 15:18:07 crc kubenswrapper[4898]: I0313 15:18:07.914892 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cd199ad916b31e7f29a9349f409a7ea25e42056bbd3d20e1aa0b88f98b05ced" Mar 13 15:18:07 crc kubenswrapper[4898]: I0313 15:18:07.914968 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556918-gcw8c" Mar 13 15:18:07 crc kubenswrapper[4898]: I0313 15:18:07.968838 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556912-5fmmb"] Mar 13 15:18:08 crc kubenswrapper[4898]: I0313 15:18:08.005232 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556912-5fmmb"] Mar 13 15:18:08 crc kubenswrapper[4898]: I0313 15:18:08.102992 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b6mrl/crc-debug-nj5tr"] Mar 13 15:18:08 crc kubenswrapper[4898]: E0313 15:18:08.103743 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" containerName="extract-content" Mar 13 15:18:08 crc kubenswrapper[4898]: I0313 15:18:08.103768 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" containerName="extract-content" Mar 13 
15:18:08 crc kubenswrapper[4898]: E0313 15:18:08.103805 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" containerName="extract-utilities" Mar 13 15:18:08 crc kubenswrapper[4898]: I0313 15:18:08.103814 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" containerName="extract-utilities" Mar 13 15:18:08 crc kubenswrapper[4898]: E0313 15:18:08.103847 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" containerName="registry-server" Mar 13 15:18:08 crc kubenswrapper[4898]: I0313 15:18:08.103855 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" containerName="registry-server" Mar 13 15:18:08 crc kubenswrapper[4898]: E0313 15:18:08.103883 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e19347-9341-49c0-9195-97e383796cb3" containerName="oc" Mar 13 15:18:08 crc kubenswrapper[4898]: I0313 15:18:08.103890 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e19347-9341-49c0-9195-97e383796cb3" containerName="oc" Mar 13 15:18:08 crc kubenswrapper[4898]: I0313 15:18:08.104183 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="85e19347-9341-49c0-9195-97e383796cb3" containerName="oc" Mar 13 15:18:08 crc kubenswrapper[4898]: I0313 15:18:08.104215 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb46c8b0-a6a9-4b6d-86a1-8408793887e5" containerName="registry-server" Mar 13 15:18:08 crc kubenswrapper[4898]: I0313 15:18:08.105159 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b6mrl/crc-debug-nj5tr" Mar 13 15:18:08 crc kubenswrapper[4898]: I0313 15:18:08.170853 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t9jw\" (UniqueName: \"kubernetes.io/projected/e373e338-4b01-4034-a9a8-186e53b74e76-kube-api-access-7t9jw\") pod \"crc-debug-nj5tr\" (UID: \"e373e338-4b01-4034-a9a8-186e53b74e76\") " pod="openshift-must-gather-b6mrl/crc-debug-nj5tr" Mar 13 15:18:08 crc kubenswrapper[4898]: I0313 15:18:08.171323 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e373e338-4b01-4034-a9a8-186e53b74e76-host\") pod \"crc-debug-nj5tr\" (UID: \"e373e338-4b01-4034-a9a8-186e53b74e76\") " pod="openshift-must-gather-b6mrl/crc-debug-nj5tr" Mar 13 15:18:08 crc kubenswrapper[4898]: I0313 15:18:08.273303 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t9jw\" (UniqueName: \"kubernetes.io/projected/e373e338-4b01-4034-a9a8-186e53b74e76-kube-api-access-7t9jw\") pod \"crc-debug-nj5tr\" (UID: \"e373e338-4b01-4034-a9a8-186e53b74e76\") " pod="openshift-must-gather-b6mrl/crc-debug-nj5tr" Mar 13 15:18:08 crc kubenswrapper[4898]: I0313 15:18:08.273728 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e373e338-4b01-4034-a9a8-186e53b74e76-host\") pod \"crc-debug-nj5tr\" (UID: \"e373e338-4b01-4034-a9a8-186e53b74e76\") " pod="openshift-must-gather-b6mrl/crc-debug-nj5tr" Mar 13 15:18:08 crc kubenswrapper[4898]: I0313 15:18:08.275573 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e373e338-4b01-4034-a9a8-186e53b74e76-host\") pod \"crc-debug-nj5tr\" (UID: \"e373e338-4b01-4034-a9a8-186e53b74e76\") " pod="openshift-must-gather-b6mrl/crc-debug-nj5tr" Mar 13 15:18:08 crc 
kubenswrapper[4898]: I0313 15:18:08.290426 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t9jw\" (UniqueName: \"kubernetes.io/projected/e373e338-4b01-4034-a9a8-186e53b74e76-kube-api-access-7t9jw\") pod \"crc-debug-nj5tr\" (UID: \"e373e338-4b01-4034-a9a8-186e53b74e76\") " pod="openshift-must-gather-b6mrl/crc-debug-nj5tr" Mar 13 15:18:08 crc kubenswrapper[4898]: I0313 15:18:08.423632 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b6mrl/crc-debug-nj5tr" Mar 13 15:18:08 crc kubenswrapper[4898]: I0313 15:18:08.936590 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b6mrl/crc-debug-nj5tr" event={"ID":"e373e338-4b01-4034-a9a8-186e53b74e76","Type":"ContainerStarted","Data":"d463fc3e5f4f8fa2a117295e0dae7a098377d85b8f032ed128d245e63f4a7497"} Mar 13 15:18:09 crc kubenswrapper[4898]: I0313 15:18:09.761528 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ada9e0ac-777e-4e64-aade-d729b4481edf" path="/var/lib/kubelet/pods/ada9e0ac-777e-4e64-aade-d729b4481edf/volumes" Mar 13 15:18:22 crc kubenswrapper[4898]: I0313 15:18:22.078057 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b6mrl/crc-debug-nj5tr" event={"ID":"e373e338-4b01-4034-a9a8-186e53b74e76","Type":"ContainerStarted","Data":"2124311011b6d2dbdb56bd6bbe60e2fefb84036c80bad5c23f6cbdc48089d7e2"} Mar 13 15:18:22 crc kubenswrapper[4898]: I0313 15:18:22.100554 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-b6mrl/crc-debug-nj5tr" podStartSLOduration=1.6922150729999998 podStartE2EDuration="14.100529222s" podCreationTimestamp="2026-03-13 15:18:08 +0000 UTC" firstStartedPulling="2026-03-13 15:18:08.517806048 +0000 UTC m=+4923.519394287" lastFinishedPulling="2026-03-13 15:18:20.926120197 +0000 UTC m=+4935.927708436" observedRunningTime="2026-03-13 15:18:22.092367026 +0000 UTC m=+4937.093955285" 
watchObservedRunningTime="2026-03-13 15:18:22.100529222 +0000 UTC m=+4937.102117461" Mar 13 15:18:47 crc kubenswrapper[4898]: I0313 15:18:47.861989 4898 scope.go:117] "RemoveContainer" containerID="ac390920dd30738022bc9982651d5e7fa6b628c845272ba7744c86bf9f8444e9" Mar 13 15:18:51 crc kubenswrapper[4898]: I0313 15:18:51.408507 4898 generic.go:334] "Generic (PLEG): container finished" podID="823ccfb8-89eb-409e-9c6c-579bacb35ea1" containerID="7046ebdd84c06a54a3ad07946403d79aec9442bfc8bcd95a1021d0c00e48bec6" exitCode=0 Mar 13 15:18:51 crc kubenswrapper[4898]: I0313 15:18:51.408639 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" event={"ID":"823ccfb8-89eb-409e-9c6c-579bacb35ea1","Type":"ContainerDied","Data":"7046ebdd84c06a54a3ad07946403d79aec9442bfc8bcd95a1021d0c00e48bec6"} Mar 13 15:18:51 crc kubenswrapper[4898]: I0313 15:18:51.408984 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" event={"ID":"823ccfb8-89eb-409e-9c6c-579bacb35ea1","Type":"ContainerStarted","Data":"6f1ae72cefa3d468668ba3267a70b8013c186367da3751e35d5758d5c1261a0a"} Mar 13 15:19:09 crc kubenswrapper[4898]: I0313 15:19:09.081741 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 15:19:09 crc kubenswrapper[4898]: I0313 15:19:09.082349 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 15:19:10 crc kubenswrapper[4898]: I0313 15:19:10.916848 4898 generic.go:334] "Generic (PLEG): container finished" podID="e373e338-4b01-4034-a9a8-186e53b74e76" containerID="2124311011b6d2dbdb56bd6bbe60e2fefb84036c80bad5c23f6cbdc48089d7e2" exitCode=0 Mar 13 15:19:10 crc kubenswrapper[4898]: I0313 15:19:10.916896 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b6mrl/crc-debug-nj5tr" 
event={"ID":"e373e338-4b01-4034-a9a8-186e53b74e76","Type":"ContainerDied","Data":"2124311011b6d2dbdb56bd6bbe60e2fefb84036c80bad5c23f6cbdc48089d7e2"} Mar 13 15:19:12 crc kubenswrapper[4898]: I0313 15:19:12.100757 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b6mrl/crc-debug-nj5tr" Mar 13 15:19:12 crc kubenswrapper[4898]: I0313 15:19:12.146729 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-b6mrl/crc-debug-nj5tr"] Mar 13 15:19:12 crc kubenswrapper[4898]: I0313 15:19:12.156946 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-b6mrl/crc-debug-nj5tr"] Mar 13 15:19:12 crc kubenswrapper[4898]: I0313 15:19:12.211925 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7t9jw\" (UniqueName: \"kubernetes.io/projected/e373e338-4b01-4034-a9a8-186e53b74e76-kube-api-access-7t9jw\") pod \"e373e338-4b01-4034-a9a8-186e53b74e76\" (UID: \"e373e338-4b01-4034-a9a8-186e53b74e76\") " Mar 13 15:19:12 crc kubenswrapper[4898]: I0313 15:19:12.212225 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e373e338-4b01-4034-a9a8-186e53b74e76-host\") pod \"e373e338-4b01-4034-a9a8-186e53b74e76\" (UID: \"e373e338-4b01-4034-a9a8-186e53b74e76\") " Mar 13 15:19:12 crc kubenswrapper[4898]: I0313 15:19:12.212284 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e373e338-4b01-4034-a9a8-186e53b74e76-host" (OuterVolumeSpecName: "host") pod "e373e338-4b01-4034-a9a8-186e53b74e76" (UID: "e373e338-4b01-4034-a9a8-186e53b74e76"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:19:12 crc kubenswrapper[4898]: I0313 15:19:12.213078 4898 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e373e338-4b01-4034-a9a8-186e53b74e76-host\") on node \"crc\" DevicePath \"\"" Mar 13 15:19:12 crc kubenswrapper[4898]: I0313 15:19:12.218423 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e373e338-4b01-4034-a9a8-186e53b74e76-kube-api-access-7t9jw" (OuterVolumeSpecName: "kube-api-access-7t9jw") pod "e373e338-4b01-4034-a9a8-186e53b74e76" (UID: "e373e338-4b01-4034-a9a8-186e53b74e76"). InnerVolumeSpecName "kube-api-access-7t9jw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:19:12 crc kubenswrapper[4898]: I0313 15:19:12.315198 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7t9jw\" (UniqueName: \"kubernetes.io/projected/e373e338-4b01-4034-a9a8-186e53b74e76-kube-api-access-7t9jw\") on node \"crc\" DevicePath \"\"" Mar 13 15:19:12 crc kubenswrapper[4898]: I0313 15:19:12.954551 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d463fc3e5f4f8fa2a117295e0dae7a098377d85b8f032ed128d245e63f4a7497" Mar 13 15:19:12 crc kubenswrapper[4898]: I0313 15:19:12.954653 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b6mrl/crc-debug-nj5tr" Mar 13 15:19:13 crc kubenswrapper[4898]: I0313 15:19:13.382857 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b6mrl/crc-debug-2x48p"] Mar 13 15:19:13 crc kubenswrapper[4898]: E0313 15:19:13.383415 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e373e338-4b01-4034-a9a8-186e53b74e76" containerName="container-00" Mar 13 15:19:13 crc kubenswrapper[4898]: I0313 15:19:13.383431 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="e373e338-4b01-4034-a9a8-186e53b74e76" containerName="container-00" Mar 13 15:19:13 crc kubenswrapper[4898]: I0313 15:19:13.383689 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="e373e338-4b01-4034-a9a8-186e53b74e76" containerName="container-00" Mar 13 15:19:13 crc kubenswrapper[4898]: I0313 15:19:13.384775 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b6mrl/crc-debug-2x48p" Mar 13 15:19:13 crc kubenswrapper[4898]: I0313 15:19:13.544422 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjg4v\" (UniqueName: \"kubernetes.io/projected/41d1fb57-3717-403b-a93e-b29818cd7698-kube-api-access-rjg4v\") pod \"crc-debug-2x48p\" (UID: \"41d1fb57-3717-403b-a93e-b29818cd7698\") " pod="openshift-must-gather-b6mrl/crc-debug-2x48p" Mar 13 15:19:13 crc kubenswrapper[4898]: I0313 15:19:13.544593 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/41d1fb57-3717-403b-a93e-b29818cd7698-host\") pod \"crc-debug-2x48p\" (UID: \"41d1fb57-3717-403b-a93e-b29818cd7698\") " pod="openshift-must-gather-b6mrl/crc-debug-2x48p" Mar 13 15:19:13 crc kubenswrapper[4898]: I0313 15:19:13.649438 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjg4v\" (UniqueName: 
\"kubernetes.io/projected/41d1fb57-3717-403b-a93e-b29818cd7698-kube-api-access-rjg4v\") pod \"crc-debug-2x48p\" (UID: \"41d1fb57-3717-403b-a93e-b29818cd7698\") " pod="openshift-must-gather-b6mrl/crc-debug-2x48p" Mar 13 15:19:13 crc kubenswrapper[4898]: I0313 15:19:13.649847 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/41d1fb57-3717-403b-a93e-b29818cd7698-host\") pod \"crc-debug-2x48p\" (UID: \"41d1fb57-3717-403b-a93e-b29818cd7698\") " pod="openshift-must-gather-b6mrl/crc-debug-2x48p" Mar 13 15:19:13 crc kubenswrapper[4898]: I0313 15:19:13.649979 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/41d1fb57-3717-403b-a93e-b29818cd7698-host\") pod \"crc-debug-2x48p\" (UID: \"41d1fb57-3717-403b-a93e-b29818cd7698\") " pod="openshift-must-gather-b6mrl/crc-debug-2x48p" Mar 13 15:19:13 crc kubenswrapper[4898]: I0313 15:19:13.674326 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjg4v\" (UniqueName: \"kubernetes.io/projected/41d1fb57-3717-403b-a93e-b29818cd7698-kube-api-access-rjg4v\") pod \"crc-debug-2x48p\" (UID: \"41d1fb57-3717-403b-a93e-b29818cd7698\") " pod="openshift-must-gather-b6mrl/crc-debug-2x48p" Mar 13 15:19:13 crc kubenswrapper[4898]: I0313 15:19:13.703072 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b6mrl/crc-debug-2x48p" Mar 13 15:19:13 crc kubenswrapper[4898]: I0313 15:19:13.754338 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e373e338-4b01-4034-a9a8-186e53b74e76" path="/var/lib/kubelet/pods/e373e338-4b01-4034-a9a8-186e53b74e76/volumes" Mar 13 15:19:13 crc kubenswrapper[4898]: W0313 15:19:13.765514 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41d1fb57_3717_403b_a93e_b29818cd7698.slice/crio-3eaeea731b6d7736d11e4c22a781a0337eb9d1c5ac72176d1a8a73e7e358a5ed WatchSource:0}: Error finding container 3eaeea731b6d7736d11e4c22a781a0337eb9d1c5ac72176d1a8a73e7e358a5ed: Status 404 returned error can't find the container with id 3eaeea731b6d7736d11e4c22a781a0337eb9d1c5ac72176d1a8a73e7e358a5ed Mar 13 15:19:13 crc kubenswrapper[4898]: I0313 15:19:13.981360 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b6mrl/crc-debug-2x48p" event={"ID":"41d1fb57-3717-403b-a93e-b29818cd7698","Type":"ContainerStarted","Data":"3eaeea731b6d7736d11e4c22a781a0337eb9d1c5ac72176d1a8a73e7e358a5ed"} Mar 13 15:19:15 crc kubenswrapper[4898]: I0313 15:19:15.000794 4898 generic.go:334] "Generic (PLEG): container finished" podID="41d1fb57-3717-403b-a93e-b29818cd7698" containerID="16ab5a558dadc1a752795f9720b3efadfbd0fc9afffa700ac13e16901fd9b881" exitCode=0 Mar 13 15:19:15 crc kubenswrapper[4898]: I0313 15:19:15.000932 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b6mrl/crc-debug-2x48p" event={"ID":"41d1fb57-3717-403b-a93e-b29818cd7698","Type":"ContainerDied","Data":"16ab5a558dadc1a752795f9720b3efadfbd0fc9afffa700ac13e16901fd9b881"} Mar 13 15:19:16 crc kubenswrapper[4898]: I0313 15:19:16.160460 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b6mrl/crc-debug-2x48p" Mar 13 15:19:16 crc kubenswrapper[4898]: I0313 15:19:16.213047 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-b6mrl/crc-debug-2x48p"] Mar 13 15:19:16 crc kubenswrapper[4898]: I0313 15:19:16.249238 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-b6mrl/crc-debug-2x48p"] Mar 13 15:19:16 crc kubenswrapper[4898]: I0313 15:19:16.318297 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/41d1fb57-3717-403b-a93e-b29818cd7698-host\") pod \"41d1fb57-3717-403b-a93e-b29818cd7698\" (UID: \"41d1fb57-3717-403b-a93e-b29818cd7698\") " Mar 13 15:19:16 crc kubenswrapper[4898]: I0313 15:19:16.318467 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41d1fb57-3717-403b-a93e-b29818cd7698-host" (OuterVolumeSpecName: "host") pod "41d1fb57-3717-403b-a93e-b29818cd7698" (UID: "41d1fb57-3717-403b-a93e-b29818cd7698"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:19:16 crc kubenswrapper[4898]: I0313 15:19:16.318521 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjg4v\" (UniqueName: \"kubernetes.io/projected/41d1fb57-3717-403b-a93e-b29818cd7698-kube-api-access-rjg4v\") pod \"41d1fb57-3717-403b-a93e-b29818cd7698\" (UID: \"41d1fb57-3717-403b-a93e-b29818cd7698\") " Mar 13 15:19:16 crc kubenswrapper[4898]: I0313 15:19:16.319114 4898 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/41d1fb57-3717-403b-a93e-b29818cd7698-host\") on node \"crc\" DevicePath \"\"" Mar 13 15:19:16 crc kubenswrapper[4898]: I0313 15:19:16.324534 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41d1fb57-3717-403b-a93e-b29818cd7698-kube-api-access-rjg4v" (OuterVolumeSpecName: "kube-api-access-rjg4v") pod "41d1fb57-3717-403b-a93e-b29818cd7698" (UID: "41d1fb57-3717-403b-a93e-b29818cd7698"). InnerVolumeSpecName "kube-api-access-rjg4v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:19:16 crc kubenswrapper[4898]: I0313 15:19:16.422569 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjg4v\" (UniqueName: \"kubernetes.io/projected/41d1fb57-3717-403b-a93e-b29818cd7698-kube-api-access-rjg4v\") on node \"crc\" DevicePath \"\"" Mar 13 15:19:17 crc kubenswrapper[4898]: I0313 15:19:17.028209 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3eaeea731b6d7736d11e4c22a781a0337eb9d1c5ac72176d1a8a73e7e358a5ed" Mar 13 15:19:17 crc kubenswrapper[4898]: I0313 15:19:17.028708 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b6mrl/crc-debug-2x48p" Mar 13 15:19:17 crc kubenswrapper[4898]: I0313 15:19:17.428535 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-b6mrl/crc-debug-mwtj6"] Mar 13 15:19:17 crc kubenswrapper[4898]: E0313 15:19:17.429099 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41d1fb57-3717-403b-a93e-b29818cd7698" containerName="container-00" Mar 13 15:19:17 crc kubenswrapper[4898]: I0313 15:19:17.429117 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="41d1fb57-3717-403b-a93e-b29818cd7698" containerName="container-00" Mar 13 15:19:17 crc kubenswrapper[4898]: I0313 15:19:17.429428 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="41d1fb57-3717-403b-a93e-b29818cd7698" containerName="container-00" Mar 13 15:19:17 crc kubenswrapper[4898]: I0313 15:19:17.430430 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b6mrl/crc-debug-mwtj6" Mar 13 15:19:17 crc kubenswrapper[4898]: I0313 15:19:17.550131 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjkst\" (UniqueName: \"kubernetes.io/projected/36269787-31ca-4f5e-9044-edab989fec71-kube-api-access-hjkst\") pod \"crc-debug-mwtj6\" (UID: \"36269787-31ca-4f5e-9044-edab989fec71\") " pod="openshift-must-gather-b6mrl/crc-debug-mwtj6" Mar 13 15:19:17 crc kubenswrapper[4898]: I0313 15:19:17.550738 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/36269787-31ca-4f5e-9044-edab989fec71-host\") pod \"crc-debug-mwtj6\" (UID: \"36269787-31ca-4f5e-9044-edab989fec71\") " pod="openshift-must-gather-b6mrl/crc-debug-mwtj6" Mar 13 15:19:17 crc kubenswrapper[4898]: I0313 15:19:17.653570 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/36269787-31ca-4f5e-9044-edab989fec71-host\") pod \"crc-debug-mwtj6\" (UID: \"36269787-31ca-4f5e-9044-edab989fec71\") " pod="openshift-must-gather-b6mrl/crc-debug-mwtj6" Mar 13 15:19:17 crc kubenswrapper[4898]: I0313 15:19:17.653716 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjkst\" (UniqueName: \"kubernetes.io/projected/36269787-31ca-4f5e-9044-edab989fec71-kube-api-access-hjkst\") pod \"crc-debug-mwtj6\" (UID: \"36269787-31ca-4f5e-9044-edab989fec71\") " pod="openshift-must-gather-b6mrl/crc-debug-mwtj6" Mar 13 15:19:17 crc kubenswrapper[4898]: I0313 15:19:17.653760 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/36269787-31ca-4f5e-9044-edab989fec71-host\") pod \"crc-debug-mwtj6\" (UID: \"36269787-31ca-4f5e-9044-edab989fec71\") " pod="openshift-must-gather-b6mrl/crc-debug-mwtj6" Mar 13 15:19:17 crc kubenswrapper[4898]: I0313 15:19:17.755794 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41d1fb57-3717-403b-a93e-b29818cd7698" path="/var/lib/kubelet/pods/41d1fb57-3717-403b-a93e-b29818cd7698/volumes" Mar 13 15:19:18 crc kubenswrapper[4898]: I0313 15:19:18.131261 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjkst\" (UniqueName: \"kubernetes.io/projected/36269787-31ca-4f5e-9044-edab989fec71-kube-api-access-hjkst\") pod \"crc-debug-mwtj6\" (UID: \"36269787-31ca-4f5e-9044-edab989fec71\") " pod="openshift-must-gather-b6mrl/crc-debug-mwtj6" Mar 13 15:19:18 crc kubenswrapper[4898]: I0313 15:19:18.350877 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b6mrl/crc-debug-mwtj6" Mar 13 15:19:18 crc kubenswrapper[4898]: W0313 15:19:18.380546 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36269787_31ca_4f5e_9044_edab989fec71.slice/crio-82b69365c504b9163413bfdb884a969cd476d34070236577bb1192adf7f82a8e WatchSource:0}: Error finding container 82b69365c504b9163413bfdb884a969cd476d34070236577bb1192adf7f82a8e: Status 404 returned error can't find the container with id 82b69365c504b9163413bfdb884a969cd476d34070236577bb1192adf7f82a8e Mar 13 15:19:19 crc kubenswrapper[4898]: I0313 15:19:19.050049 4898 generic.go:334] "Generic (PLEG): container finished" podID="36269787-31ca-4f5e-9044-edab989fec71" containerID="574ee975060e59643e7318bfce9cb30a5cb62d4a53140e192038f6b50584150c" exitCode=0 Mar 13 15:19:19 crc kubenswrapper[4898]: I0313 15:19:19.050160 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b6mrl/crc-debug-mwtj6" event={"ID":"36269787-31ca-4f5e-9044-edab989fec71","Type":"ContainerDied","Data":"574ee975060e59643e7318bfce9cb30a5cb62d4a53140e192038f6b50584150c"} Mar 13 15:19:19 crc kubenswrapper[4898]: I0313 15:19:19.050410 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b6mrl/crc-debug-mwtj6" event={"ID":"36269787-31ca-4f5e-9044-edab989fec71","Type":"ContainerStarted","Data":"82b69365c504b9163413bfdb884a969cd476d34070236577bb1192adf7f82a8e"} Mar 13 15:19:19 crc kubenswrapper[4898]: I0313 15:19:19.103756 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-b6mrl/crc-debug-mwtj6"] Mar 13 15:19:19 crc kubenswrapper[4898]: I0313 15:19:19.120416 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-b6mrl/crc-debug-mwtj6"] Mar 13 15:19:20 crc kubenswrapper[4898]: I0313 15:19:20.236160 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-b6mrl/crc-debug-mwtj6" Mar 13 15:19:20 crc kubenswrapper[4898]: I0313 15:19:20.317784 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjkst\" (UniqueName: \"kubernetes.io/projected/36269787-31ca-4f5e-9044-edab989fec71-kube-api-access-hjkst\") pod \"36269787-31ca-4f5e-9044-edab989fec71\" (UID: \"36269787-31ca-4f5e-9044-edab989fec71\") " Mar 13 15:19:20 crc kubenswrapper[4898]: I0313 15:19:20.317957 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/36269787-31ca-4f5e-9044-edab989fec71-host\") pod \"36269787-31ca-4f5e-9044-edab989fec71\" (UID: \"36269787-31ca-4f5e-9044-edab989fec71\") " Mar 13 15:19:20 crc kubenswrapper[4898]: I0313 15:19:20.318577 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/36269787-31ca-4f5e-9044-edab989fec71-host" (OuterVolumeSpecName: "host") pod "36269787-31ca-4f5e-9044-edab989fec71" (UID: "36269787-31ca-4f5e-9044-edab989fec71"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 15:19:20 crc kubenswrapper[4898]: I0313 15:19:20.324432 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36269787-31ca-4f5e-9044-edab989fec71-kube-api-access-hjkst" (OuterVolumeSpecName: "kube-api-access-hjkst") pod "36269787-31ca-4f5e-9044-edab989fec71" (UID: "36269787-31ca-4f5e-9044-edab989fec71"). InnerVolumeSpecName "kube-api-access-hjkst". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:19:20 crc kubenswrapper[4898]: I0313 15:19:20.421997 4898 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/36269787-31ca-4f5e-9044-edab989fec71-host\") on node \"crc\" DevicePath \"\"" Mar 13 15:19:20 crc kubenswrapper[4898]: I0313 15:19:20.422039 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjkst\" (UniqueName: \"kubernetes.io/projected/36269787-31ca-4f5e-9044-edab989fec71-kube-api-access-hjkst\") on node \"crc\" DevicePath \"\"" Mar 13 15:19:21 crc kubenswrapper[4898]: I0313 15:19:21.082589 4898 scope.go:117] "RemoveContainer" containerID="574ee975060e59643e7318bfce9cb30a5cb62d4a53140e192038f6b50584150c" Mar 13 15:19:21 crc kubenswrapper[4898]: I0313 15:19:21.082614 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b6mrl/crc-debug-mwtj6" Mar 13 15:19:21 crc kubenswrapper[4898]: I0313 15:19:21.752836 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36269787-31ca-4f5e-9044-edab989fec71" path="/var/lib/kubelet/pods/36269787-31ca-4f5e-9044-edab989fec71/volumes" Mar 13 15:19:29 crc kubenswrapper[4898]: I0313 15:19:29.087365 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 15:19:29 crc kubenswrapper[4898]: I0313 15:19:29.093047 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-7b77fdd7dd-vwwfr" Mar 13 15:19:49 crc kubenswrapper[4898]: I0313 15:19:49.055445 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_a27645af-4d4a-4a73-ba8a-488a9ae199ac/aodh-api/0.log" Mar 13 15:19:49 crc kubenswrapper[4898]: I0313 15:19:49.133985 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 15:19:49 crc kubenswrapper[4898]: I0313 15:19:49.134055 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:19:49 crc kubenswrapper[4898]: I0313 15:19:49.343762 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_a27645af-4d4a-4a73-ba8a-488a9ae199ac/aodh-listener/0.log" Mar 13 15:19:49 crc kubenswrapper[4898]: I0313 15:19:49.381606 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_a27645af-4d4a-4a73-ba8a-488a9ae199ac/aodh-evaluator/0.log" Mar 13 15:19:49 crc kubenswrapper[4898]: I0313 15:19:49.405131 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_a27645af-4d4a-4a73-ba8a-488a9ae199ac/aodh-notifier/0.log" Mar 13 15:19:49 crc kubenswrapper[4898]: I0313 15:19:49.555627 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-b9dc95d4b-bvhlz_fd4bf680-c8b7-4721-9595-9a8ed40410d2/barbican-api/0.log" Mar 13 15:19:49 crc kubenswrapper[4898]: I0313 15:19:49.627396 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-b9dc95d4b-bvhlz_fd4bf680-c8b7-4721-9595-9a8ed40410d2/barbican-api-log/0.log" Mar 13 15:19:49 crc kubenswrapper[4898]: I0313 15:19:49.705505 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-fcdc98bd8-xdl6x_272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2/barbican-keystone-listener/0.log" Mar 13 15:19:49 crc kubenswrapper[4898]: I0313 15:19:49.862682 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-keystone-listener-fcdc98bd8-xdl6x_272aa2e8-f1ed-4a08-b5a3-aecd06c4c6d2/barbican-keystone-listener-log/0.log" Mar 13 15:19:50 crc kubenswrapper[4898]: I0313 15:19:50.564934 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-795749dc8c-sm2hl_8b16e588-d353-4100-b143-b84420c42e30/barbican-worker/0.log" Mar 13 15:19:50 crc kubenswrapper[4898]: I0313 15:19:50.629811 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-795749dc8c-sm2hl_8b16e588-d353-4100-b143-b84420c42e30/barbican-worker-log/0.log" Mar 13 15:19:50 crc kubenswrapper[4898]: I0313 15:19:50.839684 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-xphxf_6d8bbc5a-39da-48b8-82d1-6df496fda612/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 15:19:50 crc kubenswrapper[4898]: I0313 15:19:50.864596 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_02f7d483-aecb-4a39-babc-6d9598090c4b/ceilometer-central-agent/1.log" Mar 13 15:19:51 crc kubenswrapper[4898]: I0313 15:19:51.015194 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_02f7d483-aecb-4a39-babc-6d9598090c4b/ceilometer-central-agent/0.log" Mar 13 15:19:51 crc kubenswrapper[4898]: I0313 15:19:51.065971 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_02f7d483-aecb-4a39-babc-6d9598090c4b/ceilometer-notification-agent/0.log" Mar 13 15:19:51 crc kubenswrapper[4898]: I0313 15:19:51.107109 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_02f7d483-aecb-4a39-babc-6d9598090c4b/proxy-httpd/0.log" Mar 13 15:19:51 crc kubenswrapper[4898]: I0313 15:19:51.127783 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_02f7d483-aecb-4a39-babc-6d9598090c4b/sg-core/0.log" Mar 13 15:19:51 crc 
kubenswrapper[4898]: I0313 15:19:51.312033 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_bda33d23-490a-4099-954b-c613ab5d5c73/cinder-api-log/0.log" Mar 13 15:19:51 crc kubenswrapper[4898]: I0313 15:19:51.356919 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_bda33d23-490a-4099-954b-c613ab5d5c73/cinder-api/0.log" Mar 13 15:19:51 crc kubenswrapper[4898]: I0313 15:19:51.563615 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a9a7064c-4ed5-4948-9e7e-7d40794e371e/cinder-scheduler/1.log" Mar 13 15:19:51 crc kubenswrapper[4898]: I0313 15:19:51.603432 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a9a7064c-4ed5-4948-9e7e-7d40794e371e/cinder-scheduler/0.log" Mar 13 15:19:51 crc kubenswrapper[4898]: I0313 15:19:51.650521 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_a9a7064c-4ed5-4948-9e7e-7d40794e371e/probe/0.log" Mar 13 15:19:51 crc kubenswrapper[4898]: I0313 15:19:51.786704 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-57hwn_295e7c32-75f1-4eee-a126-2d4547c56f24/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 15:19:52 crc kubenswrapper[4898]: I0313 15:19:52.528478 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-5s7mz_ac094822-6272-4730-ab0b-16f0116426b5/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 15:19:52 crc kubenswrapper[4898]: I0313 15:19:52.592066 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-k4ntr_dd51a575-1651-4891-941f-3e0fe447e81d/init/0.log" Mar 13 15:19:52 crc kubenswrapper[4898]: I0313 15:19:52.805471 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-k4ntr_dd51a575-1651-4891-941f-3e0fe447e81d/init/0.log" Mar 13 15:19:52 crc kubenswrapper[4898]: I0313 15:19:52.886926 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-k4ntr_dd51a575-1651-4891-941f-3e0fe447e81d/dnsmasq-dns/0.log" Mar 13 15:19:52 crc kubenswrapper[4898]: I0313 15:19:52.906594 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-rbttg_05e315eb-34b1-4099-b676-b0238f3cb5c5/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 15:19:53 crc kubenswrapper[4898]: I0313 15:19:53.103235 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_a7cdbc1c-79cc-441b-a08c-c61b717d82c9/glance-log/0.log" Mar 13 15:19:53 crc kubenswrapper[4898]: I0313 15:19:53.128880 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_a7cdbc1c-79cc-441b-a08c-c61b717d82c9/glance-httpd/0.log" Mar 13 15:19:53 crc kubenswrapper[4898]: I0313 15:19:53.309847 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_f666d519-2c39-4e93-823d-e5a3fcfd0d5a/glance-log/0.log" Mar 13 15:19:53 crc kubenswrapper[4898]: I0313 15:19:53.418556 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_f666d519-2c39-4e93-823d-e5a3fcfd0d5a/glance-httpd/0.log" Mar 13 15:19:53 crc kubenswrapper[4898]: I0313 15:19:53.934247 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-648cbb8b5f-4kb5b_739e9c4a-9843-4edf-a045-2f7ef8d15b5e/heat-api/0.log" Mar 13 15:19:53 crc kubenswrapper[4898]: I0313 15:19:53.962301 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-6b446d7755-5724r_0f20ec1d-823e-4695-859e-bdc538e602d9/heat-engine/0.log" Mar 13 15:19:54 crc kubenswrapper[4898]: I0313 
15:19:54.007529 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-v8np4_8c6bec5a-faac-4793-8c18-9f5b2faf2c95/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 15:19:54 crc kubenswrapper[4898]: I0313 15:19:54.159282 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-5df9b5999-7tt4b_03c552ae-5860-4468-a612-7af3d3587df4/heat-cfnapi/0.log" Mar 13 15:19:54 crc kubenswrapper[4898]: I0313 15:19:54.230763 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-tr644_8b4abb6a-5797-47be-96a0-69173649e5fa/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 15:19:54 crc kubenswrapper[4898]: I0313 15:19:54.442617 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29556901-6pnrd_3d71da57-c929-47d7-89bd-8e4e3c7f3ca0/keystone-cron/0.log" Mar 13 15:19:54 crc kubenswrapper[4898]: I0313 15:19:54.496571 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_b7452a36-0169-4cfe-9ede-ef4d0ef072d9/kube-state-metrics/0.log" Mar 13 15:19:54 crc kubenswrapper[4898]: I0313 15:19:54.794420 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_logging-edpm-deployment-openstack-edpm-ipam-tw885_efff948d-3073-4635-bc2c-2a8fc746c6b8/logging-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 15:19:54 crc kubenswrapper[4898]: I0313 15:19:54.825474 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-6ww7z_226c01c4-d0f3-4784-8e93-36d1de6d593f/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 15:19:55 crc kubenswrapper[4898]: I0313 15:19:55.067240 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mysqld-exporter-0_8a198c14-e13f-4858-87c4-de6be0fa8d0c/mysqld-exporter/0.log" Mar 13 15:19:55 crc kubenswrapper[4898]: I0313 
15:19:55.140102 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-87574c74-kqmjb_d149c7e3-df46-44b5-8a66-8a0fbb5a8554/keystone-api/0.log" Mar 13 15:19:55 crc kubenswrapper[4898]: I0313 15:19:55.412297 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-776df44c77-g64lv_4a679fb4-8d85-4835-a048-08c4b61aa158/neutron-api/0.log" Mar 13 15:19:55 crc kubenswrapper[4898]: I0313 15:19:55.500829 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-gbzb2_abb37cb2-ec06-4c96-882f-7781fbe053e0/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 15:19:55 crc kubenswrapper[4898]: I0313 15:19:55.526826 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-776df44c77-g64lv_4a679fb4-8d85-4835-a048-08c4b61aa158/neutron-httpd/0.log" Mar 13 15:19:56 crc kubenswrapper[4898]: I0313 15:19:56.209546 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_ef7dd576-1005-4fdb-95c1-e5da9f04b177/nova-api-log/0.log" Mar 13 15:19:56 crc kubenswrapper[4898]: I0313 15:19:56.218188 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_9796fb40-37f0-4d8a-929f-4bb6295388a4/nova-cell0-conductor-conductor/0.log" Mar 13 15:19:56 crc kubenswrapper[4898]: I0313 15:19:56.489046 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_50cbae0e-4bf9-41b0-8c87-b551f782aecf/nova-cell1-conductor-conductor/0.log" Mar 13 15:19:56 crc kubenswrapper[4898]: I0313 15:19:56.580077 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_041221f0-b346-4310-ab8e-a8f2440c6034/nova-cell1-novncproxy-novncproxy/0.log" Mar 13 15:19:56 crc kubenswrapper[4898]: I0313 15:19:56.678836 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_ef7dd576-1005-4fdb-95c1-e5da9f04b177/nova-api-api/0.log" Mar 13 15:19:56 crc kubenswrapper[4898]: I0313 15:19:56.851717 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-28xpg_acaa3912-3e27-4272-8e4a-3ab67fd34b92/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 15:19:56 crc kubenswrapper[4898]: I0313 15:19:56.956406 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17/nova-metadata-log/0.log" Mar 13 15:19:57 crc kubenswrapper[4898]: I0313 15:19:57.394459 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_97d388e1-b1b3-409d-b7c5-38b37734a8e6/nova-scheduler-scheduler/0.log" Mar 13 15:19:57 crc kubenswrapper[4898]: I0313 15:19:57.611701 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_8a0631b3-ea9b-4e0a-bb3e-35b283f1ad17/nova-metadata-metadata/0.log" Mar 13 15:19:57 crc kubenswrapper[4898]: I0313 15:19:57.662524 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f/mysql-bootstrap/0.log" Mar 13 15:19:57 crc kubenswrapper[4898]: I0313 15:19:57.966652 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f/galera/0.log" Mar 13 15:19:57 crc kubenswrapper[4898]: I0313 15:19:57.979214 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f/mysql-bootstrap/0.log" Mar 13 15:19:58 crc kubenswrapper[4898]: I0313 15:19:58.108717 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6050c765-4eb7-425f-bfa6-ffdf7fb3bc2f/galera/1.log" Mar 13 15:19:58 crc kubenswrapper[4898]: I0313 15:19:58.198669 4898 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_openstack-galera-0_e5d53cf3-113e-4391-b3a9-4e1f81836e26/mysql-bootstrap/0.log" Mar 13 15:19:58 crc kubenswrapper[4898]: I0313 15:19:58.490755 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e5d53cf3-113e-4391-b3a9-4e1f81836e26/mysql-bootstrap/0.log" Mar 13 15:19:58 crc kubenswrapper[4898]: I0313 15:19:58.546772 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e5d53cf3-113e-4391-b3a9-4e1f81836e26/galera/0.log" Mar 13 15:19:58 crc kubenswrapper[4898]: I0313 15:19:58.630695 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e5d53cf3-113e-4391-b3a9-4e1f81836e26/galera/1.log" Mar 13 15:19:58 crc kubenswrapper[4898]: I0313 15:19:58.793286 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_124bd4ee-d9f0-408f-a46e-4d143e8ab02a/openstackclient/0.log" Mar 13 15:19:58 crc kubenswrapper[4898]: I0313 15:19:58.811219 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-j79bj_a506ef1a-354a-49c8-b63d-4db4b9ecdcfe/ovn-controller/0.log" Mar 13 15:19:59 crc kubenswrapper[4898]: I0313 15:19:59.132691 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-8mxxb_515cda05-1d7b-4252-94fc-056b38ec502a/openstack-network-exporter/0.log" Mar 13 15:19:59 crc kubenswrapper[4898]: I0313 15:19:59.170609 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-r9tmf_f71b72a8-f179-454c-8d2e-4ac829842622/ovsdb-server-init/0.log" Mar 13 15:19:59 crc kubenswrapper[4898]: I0313 15:19:59.338711 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-r9tmf_f71b72a8-f179-454c-8d2e-4ac829842622/ovsdb-server-init/0.log" Mar 13 15:19:59 crc kubenswrapper[4898]: I0313 15:19:59.380806 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-r9tmf_f71b72a8-f179-454c-8d2e-4ac829842622/ovs-vswitchd/0.log" Mar 13 15:19:59 crc kubenswrapper[4898]: I0313 15:19:59.489388 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-r9tmf_f71b72a8-f179-454c-8d2e-4ac829842622/ovsdb-server/0.log" Mar 13 15:19:59 crc kubenswrapper[4898]: I0313 15:19:59.567191 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-6xdgx_a9f7be15-746c-45be-92a1-2fa2a961f636/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 15:19:59 crc kubenswrapper[4898]: I0313 15:19:59.730885 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_902753c9-2101-4509-9283-55070ac3787e/openstack-network-exporter/0.log" Mar 13 15:19:59 crc kubenswrapper[4898]: I0313 15:19:59.735435 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_902753c9-2101-4509-9283-55070ac3787e/ovn-northd/0.log" Mar 13 15:19:59 crc kubenswrapper[4898]: I0313 15:19:59.926404 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10/openstack-network-exporter/0.log" Mar 13 15:19:59 crc kubenswrapper[4898]: I0313 15:19:59.982550 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_280d47a9-b4a1-4fea-9bb1-6f9d1bb8ca10/ovsdbserver-nb/0.log" Mar 13 15:20:00 crc kubenswrapper[4898]: I0313 15:20:00.158270 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556920-jxmxv"] Mar 13 15:20:00 crc kubenswrapper[4898]: E0313 15:20:00.160088 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36269787-31ca-4f5e-9044-edab989fec71" containerName="container-00" Mar 13 15:20:00 crc kubenswrapper[4898]: I0313 15:20:00.160105 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="36269787-31ca-4f5e-9044-edab989fec71" 
containerName="container-00" Mar 13 15:20:00 crc kubenswrapper[4898]: I0313 15:20:00.160794 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="36269787-31ca-4f5e-9044-edab989fec71" containerName="container-00" Mar 13 15:20:00 crc kubenswrapper[4898]: I0313 15:20:00.163955 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556920-jxmxv" Mar 13 15:20:00 crc kubenswrapper[4898]: I0313 15:20:00.175416 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556920-jxmxv"] Mar 13 15:20:00 crc kubenswrapper[4898]: I0313 15:20:00.180870 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:20:00 crc kubenswrapper[4898]: I0313 15:20:00.181064 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:20:00 crc kubenswrapper[4898]: I0313 15:20:00.183002 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 15:20:00 crc kubenswrapper[4898]: I0313 15:20:00.224300 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_111bf23f-be00-46ab-97fe-a36465735164/openstack-network-exporter/0.log" Mar 13 15:20:00 crc kubenswrapper[4898]: I0313 15:20:00.258315 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_111bf23f-be00-46ab-97fe-a36465735164/ovsdbserver-sb/0.log" Mar 13 15:20:00 crc kubenswrapper[4898]: I0313 15:20:00.302859 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl5gv\" (UniqueName: \"kubernetes.io/projected/6caf987f-dbe2-48d2-8138-107de40fe224-kube-api-access-vl5gv\") pod \"auto-csr-approver-29556920-jxmxv\" (UID: \"6caf987f-dbe2-48d2-8138-107de40fe224\") " 
pod="openshift-infra/auto-csr-approver-29556920-jxmxv" Mar 13 15:20:00 crc kubenswrapper[4898]: I0313 15:20:00.404688 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl5gv\" (UniqueName: \"kubernetes.io/projected/6caf987f-dbe2-48d2-8138-107de40fe224-kube-api-access-vl5gv\") pod \"auto-csr-approver-29556920-jxmxv\" (UID: \"6caf987f-dbe2-48d2-8138-107de40fe224\") " pod="openshift-infra/auto-csr-approver-29556920-jxmxv" Mar 13 15:20:00 crc kubenswrapper[4898]: I0313 15:20:00.440762 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl5gv\" (UniqueName: \"kubernetes.io/projected/6caf987f-dbe2-48d2-8138-107de40fe224-kube-api-access-vl5gv\") pod \"auto-csr-approver-29556920-jxmxv\" (UID: \"6caf987f-dbe2-48d2-8138-107de40fe224\") " pod="openshift-infra/auto-csr-approver-29556920-jxmxv" Mar 13 15:20:00 crc kubenswrapper[4898]: I0313 15:20:00.448284 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-647f998784-xvcjw_fa7825b5-b19b-44bb-8d23-bb121e669780/placement-api/0.log" Mar 13 15:20:00 crc kubenswrapper[4898]: I0313 15:20:00.517571 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556920-jxmxv" Mar 13 15:20:00 crc kubenswrapper[4898]: I0313 15:20:00.631868 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_d555bd54-f4d5-4b06-9517-32b4fe687f4b/init-config-reloader/0.log" Mar 13 15:20:00 crc kubenswrapper[4898]: I0313 15:20:00.679440 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-647f998784-xvcjw_fa7825b5-b19b-44bb-8d23-bb121e669780/placement-log/0.log" Mar 13 15:20:00 crc kubenswrapper[4898]: I0313 15:20:00.926890 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_d555bd54-f4d5-4b06-9517-32b4fe687f4b/thanos-sidecar/0.log" Mar 13 15:20:00 crc kubenswrapper[4898]: I0313 15:20:00.939878 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_d555bd54-f4d5-4b06-9517-32b4fe687f4b/init-config-reloader/0.log" Mar 13 15:20:00 crc kubenswrapper[4898]: I0313 15:20:00.974559 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_d555bd54-f4d5-4b06-9517-32b4fe687f4b/prometheus/0.log" Mar 13 15:20:00 crc kubenswrapper[4898]: I0313 15:20:00.995159 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_d555bd54-f4d5-4b06-9517-32b4fe687f4b/config-reloader/0.log" Mar 13 15:20:01 crc kubenswrapper[4898]: I0313 15:20:01.164400 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556920-jxmxv"] Mar 13 15:20:01 crc kubenswrapper[4898]: I0313 15:20:01.191370 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835/setup-container/0.log" Mar 13 15:20:01 crc kubenswrapper[4898]: I0313 15:20:01.452107 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835/setup-container/0.log" Mar 13 15:20:01 crc kubenswrapper[4898]: I0313 15:20:01.533108 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6388d8ea-aeb6-4ca7-8f8e-cbf98d6f8835/rabbitmq/0.log" Mar 13 15:20:01 crc kubenswrapper[4898]: I0313 15:20:01.572026 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_10c321a0-5ea5-4b5c-8695-1f7b2dcad32b/setup-container/0.log" Mar 13 15:20:01 crc kubenswrapper[4898]: I0313 15:20:01.664233 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556920-jxmxv" event={"ID":"6caf987f-dbe2-48d2-8138-107de40fe224","Type":"ContainerStarted","Data":"da6e0c31d7181dd8304aed4238096af773ce4f2619cef6ffbbe314456d9083a4"} Mar 13 15:20:01 crc kubenswrapper[4898]: I0313 15:20:01.850965 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_10c321a0-5ea5-4b5c-8695-1f7b2dcad32b/setup-container/0.log" Mar 13 15:20:01 crc kubenswrapper[4898]: I0313 15:20:01.918955 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_10c321a0-5ea5-4b5c-8695-1f7b2dcad32b/rabbitmq/0.log" Mar 13 15:20:01 crc kubenswrapper[4898]: I0313 15:20:01.922535 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_ec19264c-1313-492d-b59b-4e5916b988f5/setup-container/0.log" Mar 13 15:20:02 crc kubenswrapper[4898]: I0313 15:20:02.190446 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_ec19264c-1313-492d-b59b-4e5916b988f5/setup-container/0.log" Mar 13 15:20:02 crc kubenswrapper[4898]: I0313 15:20:02.218148 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_ec19264c-1313-492d-b59b-4e5916b988f5/rabbitmq/0.log" Mar 13 15:20:02 crc kubenswrapper[4898]: I0313 15:20:02.233097 4898 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_8d188301-848c-4cf6-a204-e1110714c1be/setup-container/0.log" Mar 13 15:20:02 crc kubenswrapper[4898]: I0313 15:20:02.409943 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_8d188301-848c-4cf6-a204-e1110714c1be/setup-container/0.log" Mar 13 15:20:02 crc kubenswrapper[4898]: I0313 15:20:02.501386 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-r7rs2_8a674c4a-b209-4ea0-83b0-c46f820a81ef/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 15:20:02 crc kubenswrapper[4898]: I0313 15:20:02.587455 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_8d188301-848c-4cf6-a204-e1110714c1be/rabbitmq/0.log" Mar 13 15:20:02 crc kubenswrapper[4898]: I0313 15:20:02.731351 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-g6vnq_6329b434-b1be-4490-9a50-351366b18d79/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 15:20:02 crc kubenswrapper[4898]: I0313 15:20:02.861978 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-5gjfs_98336335-4b60-4ddf-8fe8-4ea6b69d47ef/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 15:20:03 crc kubenswrapper[4898]: I0313 15:20:03.006990 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-sjx4b_05d3f0e4-c029-4e2f-a3c1-471faa671767/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 15:20:03 crc kubenswrapper[4898]: I0313 15:20:03.141586 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-x8wvs_e7c70549-1fc7-42c2-8c81-075c611671ae/ssh-known-hosts-edpm-deployment/0.log" Mar 13 15:20:03 crc kubenswrapper[4898]: I0313 15:20:03.447669 4898 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7f9cbdc5df-5tx5z_1a57db04-0dc9-4d63-8d08-dd4309b19496/proxy-server/0.log" Mar 13 15:20:03 crc kubenswrapper[4898]: I0313 15:20:03.556987 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-ztbp9_4a6f0bfb-5db5-440c-a93f-0d6fe159401d/swift-ring-rebalance/0.log" Mar 13 15:20:03 crc kubenswrapper[4898]: I0313 15:20:03.583408 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7f9cbdc5df-5tx5z_1a57db04-0dc9-4d63-8d08-dd4309b19496/proxy-httpd/0.log" Mar 13 15:20:03 crc kubenswrapper[4898]: I0313 15:20:03.804693 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_794bd82b-e289-4b31-b0cf-f1285452e783/account-reaper/0.log" Mar 13 15:20:03 crc kubenswrapper[4898]: I0313 15:20:03.807187 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_794bd82b-e289-4b31-b0cf-f1285452e783/account-auditor/0.log" Mar 13 15:20:03 crc kubenswrapper[4898]: I0313 15:20:03.950733 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_794bd82b-e289-4b31-b0cf-f1285452e783/account-replicator/0.log" Mar 13 15:20:04 crc kubenswrapper[4898]: I0313 15:20:04.007601 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_794bd82b-e289-4b31-b0cf-f1285452e783/account-server/0.log" Mar 13 15:20:04 crc kubenswrapper[4898]: I0313 15:20:04.034325 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_794bd82b-e289-4b31-b0cf-f1285452e783/container-auditor/0.log" Mar 13 15:20:04 crc kubenswrapper[4898]: I0313 15:20:04.127736 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_794bd82b-e289-4b31-b0cf-f1285452e783/container-replicator/0.log" Mar 13 15:20:04 crc kubenswrapper[4898]: I0313 15:20:04.216336 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_794bd82b-e289-4b31-b0cf-f1285452e783/container-server/0.log" Mar 13 15:20:04 crc kubenswrapper[4898]: I0313 15:20:04.236320 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_794bd82b-e289-4b31-b0cf-f1285452e783/container-updater/0.log" Mar 13 15:20:04 crc kubenswrapper[4898]: I0313 15:20:04.314555 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_794bd82b-e289-4b31-b0cf-f1285452e783/object-auditor/0.log" Mar 13 15:20:04 crc kubenswrapper[4898]: I0313 15:20:04.382646 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_794bd82b-e289-4b31-b0cf-f1285452e783/object-expirer/0.log" Mar 13 15:20:04 crc kubenswrapper[4898]: I0313 15:20:04.513640 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_794bd82b-e289-4b31-b0cf-f1285452e783/object-server/0.log" Mar 13 15:20:04 crc kubenswrapper[4898]: I0313 15:20:04.527805 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_794bd82b-e289-4b31-b0cf-f1285452e783/object-replicator/0.log" Mar 13 15:20:04 crc kubenswrapper[4898]: I0313 15:20:04.576854 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_794bd82b-e289-4b31-b0cf-f1285452e783/object-updater/0.log" Mar 13 15:20:04 crc kubenswrapper[4898]: I0313 15:20:04.676359 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_794bd82b-e289-4b31-b0cf-f1285452e783/rsync/0.log" Mar 13 15:20:04 crc kubenswrapper[4898]: I0313 15:20:04.713139 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556920-jxmxv" event={"ID":"6caf987f-dbe2-48d2-8138-107de40fe224","Type":"ContainerStarted","Data":"897e6256c632c42295b8912f57e8c6461493cccdecec91e50f43154fdcb913e4"} Mar 13 15:20:04 crc kubenswrapper[4898]: I0313 15:20:04.732532 4898 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556920-jxmxv" podStartSLOduration=2.72588178 podStartE2EDuration="4.732508737s" podCreationTimestamp="2026-03-13 15:20:00 +0000 UTC" firstStartedPulling="2026-03-13 15:20:01.181286201 +0000 UTC m=+5036.182874440" lastFinishedPulling="2026-03-13 15:20:03.187913158 +0000 UTC m=+5038.189501397" observedRunningTime="2026-03-13 15:20:04.725253395 +0000 UTC m=+5039.726841634" watchObservedRunningTime="2026-03-13 15:20:04.732508737 +0000 UTC m=+5039.734096986" Mar 13 15:20:04 crc kubenswrapper[4898]: I0313 15:20:04.757601 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_794bd82b-e289-4b31-b0cf-f1285452e783/swift-recon-cron/0.log" Mar 13 15:20:05 crc kubenswrapper[4898]: I0313 15:20:05.033040 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-nd5lk_9a62fd58-a586-4473-abfe-4e227cad9900/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 15:20:05 crc kubenswrapper[4898]: I0313 15:20:05.116518 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-power-monitoring-edpm-deployment-openstack-edpm-2ksk6_5139c85e-1d3d-4fe7-94aa-8efde03b43e0/telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 15:20:05 crc kubenswrapper[4898]: I0313 15:20:05.343250 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_382b2b09-8110-411f-9d86-53e73df67fe6/test-operator-logs-container/0.log" Mar 13 15:20:05 crc kubenswrapper[4898]: I0313 15:20:05.596703 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-wxr7c_271e9163-4e9c-4c79-a0b4-be373e97956c/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 15:20:05 crc kubenswrapper[4898]: I0313 15:20:05.693542 4898 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_d19e8770-f0c1-491e-96c9-f737386ab3b0/tempest-tests-tempest-tests-runner/0.log" Mar 13 15:20:05 crc kubenswrapper[4898]: I0313 15:20:05.726067 4898 generic.go:334] "Generic (PLEG): container finished" podID="6caf987f-dbe2-48d2-8138-107de40fe224" containerID="897e6256c632c42295b8912f57e8c6461493cccdecec91e50f43154fdcb913e4" exitCode=0 Mar 13 15:20:05 crc kubenswrapper[4898]: I0313 15:20:05.726118 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556920-jxmxv" event={"ID":"6caf987f-dbe2-48d2-8138-107de40fe224","Type":"ContainerDied","Data":"897e6256c632c42295b8912f57e8c6461493cccdecec91e50f43154fdcb913e4"} Mar 13 15:20:07 crc kubenswrapper[4898]: I0313 15:20:07.176689 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556920-jxmxv" Mar 13 15:20:07 crc kubenswrapper[4898]: I0313 15:20:07.323410 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vl5gv\" (UniqueName: \"kubernetes.io/projected/6caf987f-dbe2-48d2-8138-107de40fe224-kube-api-access-vl5gv\") pod \"6caf987f-dbe2-48d2-8138-107de40fe224\" (UID: \"6caf987f-dbe2-48d2-8138-107de40fe224\") " Mar 13 15:20:07 crc kubenswrapper[4898]: I0313 15:20:07.330635 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6caf987f-dbe2-48d2-8138-107de40fe224-kube-api-access-vl5gv" (OuterVolumeSpecName: "kube-api-access-vl5gv") pod "6caf987f-dbe2-48d2-8138-107de40fe224" (UID: "6caf987f-dbe2-48d2-8138-107de40fe224"). InnerVolumeSpecName "kube-api-access-vl5gv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:20:07 crc kubenswrapper[4898]: I0313 15:20:07.426276 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vl5gv\" (UniqueName: \"kubernetes.io/projected/6caf987f-dbe2-48d2-8138-107de40fe224-kube-api-access-vl5gv\") on node \"crc\" DevicePath \"\"" Mar 13 15:20:07 crc kubenswrapper[4898]: I0313 15:20:07.757300 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556920-jxmxv" Mar 13 15:20:07 crc kubenswrapper[4898]: I0313 15:20:07.760867 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556920-jxmxv" event={"ID":"6caf987f-dbe2-48d2-8138-107de40fe224","Type":"ContainerDied","Data":"da6e0c31d7181dd8304aed4238096af773ce4f2619cef6ffbbe314456d9083a4"} Mar 13 15:20:07 crc kubenswrapper[4898]: I0313 15:20:07.760935 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da6e0c31d7181dd8304aed4238096af773ce4f2619cef6ffbbe314456d9083a4" Mar 13 15:20:07 crc kubenswrapper[4898]: I0313 15:20:07.841883 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556914-52q7k"] Mar 13 15:20:07 crc kubenswrapper[4898]: I0313 15:20:07.852940 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556914-52q7k"] Mar 13 15:20:09 crc kubenswrapper[4898]: I0313 15:20:09.757206 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad898ac1-9e95-4eb8-a88b-927e3d6364f6" path="/var/lib/kubelet/pods/ad898ac1-9e95-4eb8-a88b-927e3d6364f6/volumes" Mar 13 15:20:19 crc kubenswrapper[4898]: I0313 15:20:19.133883 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 13 15:20:19 crc kubenswrapper[4898]: I0313 15:20:19.134461 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:20:20 crc kubenswrapper[4898]: I0313 15:20:20.189233 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_67ef28b0-acc3-400e-8296-a541fc3b89f0/memcached/0.log" Mar 13 15:20:38 crc kubenswrapper[4898]: I0313 15:20:38.740292 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-d47688694-gtlps_45efd8ce-26db-4511-bd88-2e7467d02bbb/manager/0.log" Mar 13 15:20:38 crc kubenswrapper[4898]: I0313 15:20:38.952949 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66d56f6ff4-4n5rx_3c955ebc-98fd-4921-9923-6151a50e8eec/manager/0.log" Mar 13 15:20:39 crc kubenswrapper[4898]: I0313 15:20:39.216630 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57_a57932fc-ce83-4258-95a8-65f29c0cfd5a/util/0.log" Mar 13 15:20:39 crc kubenswrapper[4898]: I0313 15:20:39.452406 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57_a57932fc-ce83-4258-95a8-65f29c0cfd5a/util/0.log" Mar 13 15:20:39 crc kubenswrapper[4898]: I0313 15:20:39.454302 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57_a57932fc-ce83-4258-95a8-65f29c0cfd5a/pull/0.log" Mar 13 15:20:39 crc kubenswrapper[4898]: I0313 15:20:39.454952 4898 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57_a57932fc-ce83-4258-95a8-65f29c0cfd5a/pull/0.log" Mar 13 15:20:39 crc kubenswrapper[4898]: I0313 15:20:39.722585 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57_a57932fc-ce83-4258-95a8-65f29c0cfd5a/util/0.log" Mar 13 15:20:39 crc kubenswrapper[4898]: I0313 15:20:39.758535 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57_a57932fc-ce83-4258-95a8-65f29c0cfd5a/pull/0.log" Mar 13 15:20:39 crc kubenswrapper[4898]: I0313 15:20:39.775718 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_fde5bd3580cf5bff9a9984f67cb14ffd9a5909a2de508af4ad53062896dgm57_a57932fc-ce83-4258-95a8-65f29c0cfd5a/extract/0.log" Mar 13 15:20:40 crc kubenswrapper[4898]: I0313 15:20:40.100074 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5964f64c48-mf8h6_fb7b2f97-fca8-41d2-9be7-d40fac94c171/manager/0.log" Mar 13 15:20:40 crc kubenswrapper[4898]: I0313 15:20:40.374891 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-77b6666d85-tqp4b_ea0ad033-9a48-4e42-a237-f27cacf03adc/manager/0.log" Mar 13 15:20:40 crc kubenswrapper[4898]: I0313 15:20:40.407618 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-jngrl_a80d01d5-0201-4b2e-974c-ac5b42ac8df4/manager/1.log" Mar 13 15:20:40 crc kubenswrapper[4898]: I0313 15:20:40.720994 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-984cd4dcf-p9d5v_0d88a5d2-a852-409e-b4bd-939d1c2b9090/manager/0.log" Mar 13 15:20:40 crc 
kubenswrapper[4898]: I0313 15:20:40.728254 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-jngrl_a80d01d5-0201-4b2e-974c-ac5b42ac8df4/manager/0.log" Mar 13 15:20:41 crc kubenswrapper[4898]: I0313 15:20:41.003532 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5bc894d9b-v99bm_32b5ebfd-38d9-456e-bb21-7332323239d1/manager/1.log" Mar 13 15:20:41 crc kubenswrapper[4898]: I0313 15:20:41.059122 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5bc894d9b-v99bm_32b5ebfd-38d9-456e-bb21-7332323239d1/manager/0.log" Mar 13 15:20:41 crc kubenswrapper[4898]: I0313 15:20:41.120205 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-54dc5b8f8d-8kcsw_c35de09d-7f21-47d3-aac5-a26b15b0a496/manager/0.log" Mar 13 15:20:41 crc kubenswrapper[4898]: I0313 15:20:41.366357 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-684f77d66d-s5zh6_d24bb749-0b71-456b-80e4-fdf6dd23ba30/manager/0.log" Mar 13 15:20:41 crc kubenswrapper[4898]: I0313 15:20:41.412268 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-57b484b4df-z2gd2_1df4a7d6-b0c2-4b00-b591-1a612bd319b6/manager/0.log" Mar 13 15:20:41 crc kubenswrapper[4898]: I0313 15:20:41.582006 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5b6b6b4c9f-2lnlc_ba56f415-73d5-4301-a25d-0e5d1ba4e3b1/manager/0.log" Mar 13 15:20:41 crc kubenswrapper[4898]: I0313 15:20:41.695345 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-776c5696bf-ntlw6_d71982c0-a3d0-4da8-84cd-7494301f589f/manager/0.log" Mar 13 
15:20:41 crc kubenswrapper[4898]: I0313 15:20:41.881380 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-s2rdh_d29ce3ee-3d5a-4801-abf9-dfef5b641a74/manager/1.log" Mar 13 15:20:41 crc kubenswrapper[4898]: I0313 15:20:41.923916 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7f84474648-mr4wv_52959483-daae-423a-a3bf-8e3fa7810074/manager/0.log" Mar 13 15:20:41 crc kubenswrapper[4898]: I0313 15:20:41.961267 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-s2rdh_d29ce3ee-3d5a-4801-abf9-dfef5b641a74/manager/0.log" Mar 13 15:20:42 crc kubenswrapper[4898]: I0313 15:20:42.089744 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv_0ab852e1-fd26-4f76-b758-77896f8e236b/manager/1.log" Mar 13 15:20:42 crc kubenswrapper[4898]: I0313 15:20:42.177018 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-557ccf57b7vq7xv_0ab852e1-fd26-4f76-b758-77896f8e236b/manager/0.log" Mar 13 15:20:42 crc kubenswrapper[4898]: I0313 15:20:42.453556 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6b8c6b5df9-kk2gn_7bae49ab-1146-43a2-b436-69838c923f1a/operator/0.log" Mar 13 15:20:42 crc kubenswrapper[4898]: I0313 15:20:42.678375 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-9k7p6_478795f5-c2f6-4e9b-9ed6-e2c743c3f3b8/registry-server/0.log" Mar 13 15:20:42 crc kubenswrapper[4898]: I0313 15:20:42.860251 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bbc5b68f9-wdmrh_da3795a7-363f-4637-afe2-77cb77248f9a/manager/0.log" Mar 13 15:20:43 crc kubenswrapper[4898]: I0313 15:20:43.006924 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-574d45c66c-njsvh_0d7c657b-a701-41fe-9b23-d5bba3302c4f/manager/0.log" Mar 13 15:20:43 crc kubenswrapper[4898]: I0313 15:20:43.380679 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-82gtc_7b9c0413-5558-43c4-805b-7f035fded9b4/operator/0.log" Mar 13 15:20:43 crc kubenswrapper[4898]: I0313 15:20:43.438843 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-7f9cc5dd44-f2t6t_66a86c31-9ff3-439a-a0f8-96c981014b6f/manager/0.log" Mar 13 15:20:43 crc kubenswrapper[4898]: I0313 15:20:43.726395 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-smdkt_19a0f4de-5258-4f2b-9587-71293459378e/manager/1.log" Mar 13 15:20:43 crc kubenswrapper[4898]: I0313 15:20:43.867682 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-smdkt_19a0f4de-5258-4f2b-9587-71293459378e/manager/0.log" Mar 13 15:20:43 crc kubenswrapper[4898]: I0313 15:20:43.991732 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-jwrd2_919747b8-a031-4654-999f-3c3928f981b4/manager/0.log" Mar 13 15:20:44 crc kubenswrapper[4898]: I0313 15:20:44.048382 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5b9fbd87f-s2k96_9ff6f89a-7110-42fb-96b9-8611f280bebe/manager/0.log" Mar 13 15:20:44 crc kubenswrapper[4898]: I0313 15:20:44.236547 4898 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5f7dc44db6-9nsrh_3a26728d-85c2-465c-bce4-c74045ea9e0d/manager/0.log" Mar 13 15:20:48 crc kubenswrapper[4898]: I0313 15:20:48.407501 4898 scope.go:117] "RemoveContainer" containerID="7e1660e6d2126df6f52c14a1146a22533f711db09e61120239f48e0f06547dd2" Mar 13 15:20:49 crc kubenswrapper[4898]: I0313 15:20:49.134292 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 15:20:49 crc kubenswrapper[4898]: I0313 15:20:49.134663 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:20:49 crc kubenswrapper[4898]: I0313 15:20:49.134735 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 15:20:49 crc kubenswrapper[4898]: I0313 15:20:49.136269 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668"} pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 15:20:49 crc kubenswrapper[4898]: I0313 15:20:49.136374 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" 
containerName="machine-config-daemon" containerID="cri-o://e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" gracePeriod=600 Mar 13 15:20:49 crc kubenswrapper[4898]: E0313 15:20:49.295801 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:20:50 crc kubenswrapper[4898]: I0313 15:20:50.283976 4898 generic.go:334] "Generic (PLEG): container finished" podID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" exitCode=0 Mar 13 15:20:50 crc kubenswrapper[4898]: I0313 15:20:50.284018 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerDied","Data":"e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668"} Mar 13 15:20:50 crc kubenswrapper[4898]: I0313 15:20:50.284277 4898 scope.go:117] "RemoveContainer" containerID="bf9fa5bb76f8bd5a010d026caf62189a87b342669ddb0345c62f785750fd30c1" Mar 13 15:20:50 crc kubenswrapper[4898]: I0313 15:20:50.285686 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:20:50 crc kubenswrapper[4898]: E0313 15:20:50.286160 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:21:01 crc kubenswrapper[4898]: I0313 15:21:01.739733 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:21:01 crc kubenswrapper[4898]: E0313 15:21:01.740790 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:21:06 crc kubenswrapper[4898]: I0313 15:21:06.473324 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-qt7gm_6444bf97-84ef-49df-afcd-4e939a5de2ad/control-plane-machine-set-operator/0.log" Mar 13 15:21:06 crc kubenswrapper[4898]: I0313 15:21:06.502763 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-whtgq_096d3786-85e8-4fe5-82b3-57cd1be251a1/kube-rbac-proxy/0.log" Mar 13 15:21:06 crc kubenswrapper[4898]: I0313 15:21:06.921093 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-whtgq_096d3786-85e8-4fe5-82b3-57cd1be251a1/machine-api-operator/0.log" Mar 13 15:21:14 crc kubenswrapper[4898]: I0313 15:21:14.739197 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:21:14 crc kubenswrapper[4898]: E0313 15:21:14.739922 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:21:21 crc kubenswrapper[4898]: I0313 15:21:21.735775 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-fsdjh_b267a865-1a03-4f37-9d2a-83380d30da1d/cert-manager-controller/0.log" Mar 13 15:21:21 crc kubenswrapper[4898]: I0313 15:21:21.894938 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-krzxz_d00b7135-a080-4f0e-a23b-237ab821410f/cert-manager-cainjector/0.log" Mar 13 15:21:21 crc kubenswrapper[4898]: I0313 15:21:21.992940 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-cx9pw_7c1fa9c0-bb2e-4806-95fd-07fba426bdc8/cert-manager-webhook/0.log" Mar 13 15:21:29 crc kubenswrapper[4898]: I0313 15:21:29.740566 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:21:29 crc kubenswrapper[4898]: E0313 15:21:29.742815 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:21:38 crc kubenswrapper[4898]: I0313 15:21:38.454867 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-9m8s6_b707c4ee-39e1-4fc6-812a-f61e722c1079/nmstate-console-plugin/0.log" Mar 13 15:21:38 crc kubenswrapper[4898]: I0313 15:21:38.673742 4898 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-fpgr7_e4761153-ed4e-4264-8f21-b4de31a4bbb8/nmstate-handler/0.log" Mar 13 15:21:38 crc kubenswrapper[4898]: I0313 15:21:38.753349 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-c8fgd_35105fc0-dff0-4480-8635-cbbeec82d124/nmstate-metrics/0.log" Mar 13 15:21:38 crc kubenswrapper[4898]: I0313 15:21:38.778756 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-c8fgd_35105fc0-dff0-4480-8635-cbbeec82d124/kube-rbac-proxy/0.log" Mar 13 15:21:39 crc kubenswrapper[4898]: I0313 15:21:39.321245 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-m8j8d_a9193e72-6911-4df4-8b26-04b2537f68a9/nmstate-webhook/0.log" Mar 13 15:21:39 crc kubenswrapper[4898]: I0313 15:21:39.372037 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-hmwt2_84d4e279-f74c-48fd-9514-1a697341ac6a/nmstate-operator/0.log" Mar 13 15:21:41 crc kubenswrapper[4898]: I0313 15:21:41.739824 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:21:41 crc kubenswrapper[4898]: E0313 15:21:41.740541 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:21:53 crc kubenswrapper[4898]: I0313 15:21:53.739986 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:21:53 crc kubenswrapper[4898]: E0313 
15:21:53.740929 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:21:55 crc kubenswrapper[4898]: I0313 15:21:55.558507 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-5fb555ff84-j52b8_2cd05b5b-32da-4560-a761-72221b99e2c6/kube-rbac-proxy/0.log" Mar 13 15:21:55 crc kubenswrapper[4898]: I0313 15:21:55.625686 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-5fb555ff84-j52b8_2cd05b5b-32da-4560-a761-72221b99e2c6/manager/0.log" Mar 13 15:22:00 crc kubenswrapper[4898]: I0313 15:22:00.146434 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556922-p8rbd"] Mar 13 15:22:00 crc kubenswrapper[4898]: E0313 15:22:00.148406 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6caf987f-dbe2-48d2-8138-107de40fe224" containerName="oc" Mar 13 15:22:00 crc kubenswrapper[4898]: I0313 15:22:00.148499 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="6caf987f-dbe2-48d2-8138-107de40fe224" containerName="oc" Mar 13 15:22:00 crc kubenswrapper[4898]: I0313 15:22:00.148833 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="6caf987f-dbe2-48d2-8138-107de40fe224" containerName="oc" Mar 13 15:22:00 crc kubenswrapper[4898]: I0313 15:22:00.149706 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556922-p8rbd" Mar 13 15:22:00 crc kubenswrapper[4898]: I0313 15:22:00.152777 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:22:00 crc kubenswrapper[4898]: I0313 15:22:00.152786 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:22:00 crc kubenswrapper[4898]: I0313 15:22:00.153755 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 15:22:00 crc kubenswrapper[4898]: I0313 15:22:00.166777 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556922-p8rbd"] Mar 13 15:22:00 crc kubenswrapper[4898]: I0313 15:22:00.193867 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htlcd\" (UniqueName: \"kubernetes.io/projected/af39c392-cb6d-4afc-837c-9cbf245a9856-kube-api-access-htlcd\") pod \"auto-csr-approver-29556922-p8rbd\" (UID: \"af39c392-cb6d-4afc-837c-9cbf245a9856\") " pod="openshift-infra/auto-csr-approver-29556922-p8rbd" Mar 13 15:22:00 crc kubenswrapper[4898]: I0313 15:22:00.297845 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htlcd\" (UniqueName: \"kubernetes.io/projected/af39c392-cb6d-4afc-837c-9cbf245a9856-kube-api-access-htlcd\") pod \"auto-csr-approver-29556922-p8rbd\" (UID: \"af39c392-cb6d-4afc-837c-9cbf245a9856\") " pod="openshift-infra/auto-csr-approver-29556922-p8rbd" Mar 13 15:22:00 crc kubenswrapper[4898]: I0313 15:22:00.341591 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htlcd\" (UniqueName: \"kubernetes.io/projected/af39c392-cb6d-4afc-837c-9cbf245a9856-kube-api-access-htlcd\") pod \"auto-csr-approver-29556922-p8rbd\" (UID: \"af39c392-cb6d-4afc-837c-9cbf245a9856\") " 
pod="openshift-infra/auto-csr-approver-29556922-p8rbd" Mar 13 15:22:00 crc kubenswrapper[4898]: I0313 15:22:00.483579 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556922-p8rbd" Mar 13 15:22:01 crc kubenswrapper[4898]: I0313 15:22:01.856116 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556922-p8rbd"] Mar 13 15:22:02 crc kubenswrapper[4898]: W0313 15:22:02.140006 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf39c392_cb6d_4afc_837c_9cbf245a9856.slice/crio-d7da9596e6b5c1a4612480823cfd1e026f0f98cfef44ea0a09c535cd59f364ee WatchSource:0}: Error finding container d7da9596e6b5c1a4612480823cfd1e026f0f98cfef44ea0a09c535cd59f364ee: Status 404 returned error can't find the container with id d7da9596e6b5c1a4612480823cfd1e026f0f98cfef44ea0a09c535cd59f364ee Mar 13 15:22:02 crc kubenswrapper[4898]: I0313 15:22:02.148699 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 15:22:03 crc kubenswrapper[4898]: I0313 15:22:03.106443 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556922-p8rbd" event={"ID":"af39c392-cb6d-4afc-837c-9cbf245a9856","Type":"ContainerStarted","Data":"d7da9596e6b5c1a4612480823cfd1e026f0f98cfef44ea0a09c535cd59f364ee"} Mar 13 15:22:04 crc kubenswrapper[4898]: I0313 15:22:04.119950 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556922-p8rbd" event={"ID":"af39c392-cb6d-4afc-837c-9cbf245a9856","Type":"ContainerStarted","Data":"038b2fc06591bca90016e0206f4da45f7623dddf7ed529280975231bc7adf587"} Mar 13 15:22:04 crc kubenswrapper[4898]: I0313 15:22:04.140073 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556922-p8rbd" 
podStartSLOduration=2.801319619 podStartE2EDuration="4.140050802s" podCreationTimestamp="2026-03-13 15:22:00 +0000 UTC" firstStartedPulling="2026-03-13 15:22:02.143002625 +0000 UTC m=+5157.144590864" lastFinishedPulling="2026-03-13 15:22:03.481733808 +0000 UTC m=+5158.483322047" observedRunningTime="2026-03-13 15:22:04.135741695 +0000 UTC m=+5159.137329954" watchObservedRunningTime="2026-03-13 15:22:04.140050802 +0000 UTC m=+5159.141639061" Mar 13 15:22:05 crc kubenswrapper[4898]: I0313 15:22:05.135030 4898 generic.go:334] "Generic (PLEG): container finished" podID="af39c392-cb6d-4afc-837c-9cbf245a9856" containerID="038b2fc06591bca90016e0206f4da45f7623dddf7ed529280975231bc7adf587" exitCode=0 Mar 13 15:22:05 crc kubenswrapper[4898]: I0313 15:22:05.135124 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556922-p8rbd" event={"ID":"af39c392-cb6d-4afc-837c-9cbf245a9856","Type":"ContainerDied","Data":"038b2fc06591bca90016e0206f4da45f7623dddf7ed529280975231bc7adf587"} Mar 13 15:22:06 crc kubenswrapper[4898]: I0313 15:22:06.573887 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556922-p8rbd" Mar 13 15:22:06 crc kubenswrapper[4898]: I0313 15:22:06.654991 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htlcd\" (UniqueName: \"kubernetes.io/projected/af39c392-cb6d-4afc-837c-9cbf245a9856-kube-api-access-htlcd\") pod \"af39c392-cb6d-4afc-837c-9cbf245a9856\" (UID: \"af39c392-cb6d-4afc-837c-9cbf245a9856\") " Mar 13 15:22:06 crc kubenswrapper[4898]: I0313 15:22:06.661090 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af39c392-cb6d-4afc-837c-9cbf245a9856-kube-api-access-htlcd" (OuterVolumeSpecName: "kube-api-access-htlcd") pod "af39c392-cb6d-4afc-837c-9cbf245a9856" (UID: "af39c392-cb6d-4afc-837c-9cbf245a9856"). InnerVolumeSpecName "kube-api-access-htlcd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:22:06 crc kubenswrapper[4898]: I0313 15:22:06.758522 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htlcd\" (UniqueName: \"kubernetes.io/projected/af39c392-cb6d-4afc-837c-9cbf245a9856-kube-api-access-htlcd\") on node \"crc\" DevicePath \"\"" Mar 13 15:22:07 crc kubenswrapper[4898]: I0313 15:22:07.674238 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556922-p8rbd" event={"ID":"af39c392-cb6d-4afc-837c-9cbf245a9856","Type":"ContainerDied","Data":"d7da9596e6b5c1a4612480823cfd1e026f0f98cfef44ea0a09c535cd59f364ee"} Mar 13 15:22:07 crc kubenswrapper[4898]: I0313 15:22:07.674278 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7da9596e6b5c1a4612480823cfd1e026f0f98cfef44ea0a09c535cd59f364ee" Mar 13 15:22:07 crc kubenswrapper[4898]: I0313 15:22:07.674344 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556922-p8rbd" Mar 13 15:22:07 crc kubenswrapper[4898]: I0313 15:22:07.708416 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556916-dvhfq"] Mar 13 15:22:07 crc kubenswrapper[4898]: I0313 15:22:07.721667 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556916-dvhfq"] Mar 13 15:22:07 crc kubenswrapper[4898]: I0313 15:22:07.740271 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:22:07 crc kubenswrapper[4898]: E0313 15:22:07.740682 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:22:07 crc kubenswrapper[4898]: I0313 15:22:07.757159 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb6b061a-b0db-4b84-bfc7-08238f699132" path="/var/lib/kubelet/pods/bb6b061a-b0db-4b84-bfc7-08238f699132/volumes" Mar 13 15:22:07 crc kubenswrapper[4898]: E0313 15:22:07.900985 4898 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf39c392_cb6d_4afc_837c_9cbf245a9856.slice\": RecentStats: unable to find data in memory cache]" Mar 13 15:22:14 crc kubenswrapper[4898]: I0313 15:22:14.806691 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-5r9gm_30c06063-b926-4f2e-b8d1-8c530cc5b0a9/prometheus-operator/0.log" Mar 13 15:22:14 crc kubenswrapper[4898]: I0313 15:22:14.921929 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz_951cfcfc-3a8c-410e-a3f5-f5caa10511f5/prometheus-operator-admission-webhook/0.log" Mar 13 15:22:15 crc kubenswrapper[4898]: I0313 15:22:15.010197 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm_8c190eee-747b-4a45-905c-fa0235080305/prometheus-operator-admission-webhook/0.log" Mar 13 15:22:15 crc kubenswrapper[4898]: I0313 15:22:15.979173 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-ljrtz_3bfc0332-bb59-42bf-bb70-462efa225c81/operator/1.log" Mar 13 15:22:16 crc kubenswrapper[4898]: I0313 15:22:16.100349 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-ljrtz_3bfc0332-bb59-42bf-bb70-462efa225c81/operator/0.log" Mar 13 15:22:16 crc kubenswrapper[4898]: I0313 15:22:16.109848 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-66cbf594b5-gpj8b_ad052248-8fcd-4ef6-9969-5023b87bbbf9/observability-ui-dashboards/0.log" Mar 13 15:22:16 crc kubenswrapper[4898]: I0313 15:22:16.284000 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-nkt76_79ead8ee-67ba-4831-b5d4-a1f128e94334/perses-operator/0.log" Mar 13 15:22:19 crc kubenswrapper[4898]: I0313 15:22:19.741539 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:22:19 crc kubenswrapper[4898]: E0313 15:22:19.742304 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:22:33 crc kubenswrapper[4898]: I0313 15:22:33.604102 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_cluster-logging-operator-66689c4bbf-bcn59_c5cfd1be-ede5-4678-99c5-17f232b97d81/cluster-logging-operator/0.log" Mar 13 15:22:33 crc kubenswrapper[4898]: I0313 15:22:33.743824 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:22:33 crc kubenswrapper[4898]: E0313 15:22:33.744096 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:22:34 crc kubenswrapper[4898]: I0313 15:22:34.867805 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-compactor-0_9c5fee8d-2246-4e34-8ddd-ce710e155d73/loki-compactor/0.log" Mar 13 15:22:34 crc kubenswrapper[4898]: I0313 15:22:34.910626 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_collector-xcq52_824d10e9-5cdc-4dc5-b9a8-b151c779b900/collector/0.log" Mar 13 15:22:35 crc kubenswrapper[4898]: I0313 15:22:35.092780 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-distributor-9c6b6d984-vvj56_510657b4-32e2-4fa5-9c09-17869a295736/loki-distributor/0.log" Mar 13 15:22:35 crc kubenswrapper[4898]: I0313 15:22:35.147475 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-c6d797ccf-8ng9x_13ee53e6-2549-4dd8-91ac-80e4ef2c9d99/gateway/0.log" Mar 13 15:22:35 crc kubenswrapper[4898]: I0313 15:22:35.235445 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-c6d797ccf-8ng9x_13ee53e6-2549-4dd8-91ac-80e4ef2c9d99/opa/0.log" Mar 13 15:22:35 crc kubenswrapper[4898]: I0313 15:22:35.371945 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-c6d797ccf-9qh4r_077fcbe8-c497-44b4-82f9-ff8e317cbe83/opa/0.log" Mar 13 15:22:35 crc kubenswrapper[4898]: I0313 15:22:35.411346 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-c6d797ccf-9qh4r_077fcbe8-c497-44b4-82f9-ff8e317cbe83/gateway/0.log" Mar 13 15:22:35 crc kubenswrapper[4898]: I0313 15:22:35.528792 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-logging_logging-loki-index-gateway-0_6a1df267-1145-4fe1-9455-57df3d043e3a/loki-index-gateway/0.log" Mar 13 15:22:35 crc kubenswrapper[4898]: I0313 15:22:35.685939 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-ingester-0_2194d847-4858-4f46-ab8b-c2d78cf5677e/loki-ingester/0.log" Mar 13 15:22:35 crc kubenswrapper[4898]: I0313 15:22:35.783026 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-querier-6dcbdf8bb8-qr6bw_5e81d88f-c63b-4f0c-ba17-f1171350c28d/loki-querier/0.log" Mar 13 15:22:35 crc kubenswrapper[4898]: I0313 15:22:35.899476 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-query-frontend-ff66c4dc9-mwqzz_e519fed6-a687-4a01-a979-598e81122ad1/loki-query-frontend/0.log" Mar 13 15:22:45 crc kubenswrapper[4898]: I0313 15:22:45.748595 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:22:45 crc kubenswrapper[4898]: E0313 15:22:45.749540 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:22:49 crc kubenswrapper[4898]: I0313 15:22:49.010270 4898 scope.go:117] "RemoveContainer" containerID="bf083e1d2202ec3f40b443b0422cd0440a225764d3bb5e0e49d48d4861f197f0" Mar 13 15:22:53 crc kubenswrapper[4898]: I0313 15:22:53.601028 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-cx422_b231c7db-5056-4ec6-a64c-0aa8bdff336b/kube-rbac-proxy/0.log" Mar 13 15:22:53 crc kubenswrapper[4898]: I0313 15:22:53.738370 4898 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-cx422_b231c7db-5056-4ec6-a64c-0aa8bdff336b/controller/0.log" Mar 13 15:22:53 crc kubenswrapper[4898]: I0313 15:22:53.849621 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bqmxg_1a7fcb96-7168-4049-8c28-d3f740599e48/cp-frr-files/0.log" Mar 13 15:22:54 crc kubenswrapper[4898]: I0313 15:22:54.061782 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bqmxg_1a7fcb96-7168-4049-8c28-d3f740599e48/cp-reloader/0.log" Mar 13 15:22:54 crc kubenswrapper[4898]: I0313 15:22:54.067747 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bqmxg_1a7fcb96-7168-4049-8c28-d3f740599e48/cp-frr-files/0.log" Mar 13 15:22:54 crc kubenswrapper[4898]: I0313 15:22:54.069634 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bqmxg_1a7fcb96-7168-4049-8c28-d3f740599e48/cp-reloader/0.log" Mar 13 15:22:54 crc kubenswrapper[4898]: I0313 15:22:54.110916 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bqmxg_1a7fcb96-7168-4049-8c28-d3f740599e48/cp-metrics/0.log" Mar 13 15:22:54 crc kubenswrapper[4898]: I0313 15:22:54.250388 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bqmxg_1a7fcb96-7168-4049-8c28-d3f740599e48/cp-reloader/0.log" Mar 13 15:22:54 crc kubenswrapper[4898]: I0313 15:22:54.274330 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bqmxg_1a7fcb96-7168-4049-8c28-d3f740599e48/cp-frr-files/0.log" Mar 13 15:22:54 crc kubenswrapper[4898]: I0313 15:22:54.281826 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bqmxg_1a7fcb96-7168-4049-8c28-d3f740599e48/cp-metrics/0.log" Mar 13 15:22:54 crc kubenswrapper[4898]: I0313 15:22:54.309497 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-bqmxg_1a7fcb96-7168-4049-8c28-d3f740599e48/cp-metrics/0.log" Mar 13 15:22:54 crc kubenswrapper[4898]: I0313 15:22:54.445847 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bqmxg_1a7fcb96-7168-4049-8c28-d3f740599e48/cp-frr-files/0.log" Mar 13 15:22:54 crc kubenswrapper[4898]: I0313 15:22:54.481316 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bqmxg_1a7fcb96-7168-4049-8c28-d3f740599e48/cp-reloader/0.log" Mar 13 15:22:54 crc kubenswrapper[4898]: I0313 15:22:54.507214 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bqmxg_1a7fcb96-7168-4049-8c28-d3f740599e48/controller/0.log" Mar 13 15:22:54 crc kubenswrapper[4898]: I0313 15:22:54.532408 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bqmxg_1a7fcb96-7168-4049-8c28-d3f740599e48/cp-metrics/0.log" Mar 13 15:22:54 crc kubenswrapper[4898]: I0313 15:22:54.737132 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bqmxg_1a7fcb96-7168-4049-8c28-d3f740599e48/frr/1.log" Mar 13 15:22:54 crc kubenswrapper[4898]: I0313 15:22:54.792173 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bqmxg_1a7fcb96-7168-4049-8c28-d3f740599e48/frr-metrics/0.log" Mar 13 15:22:54 crc kubenswrapper[4898]: I0313 15:22:54.840780 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bqmxg_1a7fcb96-7168-4049-8c28-d3f740599e48/kube-rbac-proxy/0.log" Mar 13 15:22:55 crc kubenswrapper[4898]: I0313 15:22:55.032087 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bqmxg_1a7fcb96-7168-4049-8c28-d3f740599e48/kube-rbac-proxy-frr/0.log" Mar 13 15:22:55 crc kubenswrapper[4898]: I0313 15:22:55.144826 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-bqmxg_1a7fcb96-7168-4049-8c28-d3f740599e48/reloader/0.log" Mar 13 15:22:55 crc kubenswrapper[4898]: I0313 15:22:55.286510 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-5p4w5_604b9c21-3e85-4c2e-9faf-962f44236911/frr-k8s-webhook-server/0.log" Mar 13 15:22:55 crc kubenswrapper[4898]: I0313 15:22:55.410729 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-cf7c75c99-qxdbx_e000d86e-e7a8-49ed-9184-fdd67dfe797d/manager/1.log" Mar 13 15:22:55 crc kubenswrapper[4898]: I0313 15:22:55.515420 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-cf7c75c99-qxdbx_e000d86e-e7a8-49ed-9184-fdd67dfe797d/manager/0.log" Mar 13 15:22:55 crc kubenswrapper[4898]: I0313 15:22:55.692830 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-67c6f6c5cb-d26qw_34b4f98c-a87c-4a97-9ac4-286afeb9e4bc/webhook-server/0.log" Mar 13 15:22:55 crc kubenswrapper[4898]: I0313 15:22:55.873587 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-g5gqr_edfd91ee-1246-43b2-84a0-95ea069de402/kube-rbac-proxy/0.log" Mar 13 15:22:56 crc kubenswrapper[4898]: I0313 15:22:56.638809 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-g5gqr_edfd91ee-1246-43b2-84a0-95ea069de402/speaker/0.log" Mar 13 15:22:56 crc kubenswrapper[4898]: I0313 15:22:56.739659 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:22:56 crc kubenswrapper[4898]: E0313 15:22:56.740022 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:22:56 crc kubenswrapper[4898]: I0313 15:22:56.858002 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-bqmxg_1a7fcb96-7168-4049-8c28-d3f740599e48/frr/0.log" Mar 13 15:23:10 crc kubenswrapper[4898]: I0313 15:23:10.740366 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:23:10 crc kubenswrapper[4898]: E0313 15:23:10.741147 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:23:11 crc kubenswrapper[4898]: I0313 15:23:11.982746 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh_53800f20-93f5-4ab5-9feb-eb325fa0f945/util/0.log" Mar 13 15:23:12 crc kubenswrapper[4898]: I0313 15:23:12.166870 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh_53800f20-93f5-4ab5-9feb-eb325fa0f945/util/0.log" Mar 13 15:23:12 crc kubenswrapper[4898]: I0313 15:23:12.189580 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh_53800f20-93f5-4ab5-9feb-eb325fa0f945/pull/0.log" Mar 13 15:23:12 crc kubenswrapper[4898]: I0313 15:23:12.258210 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh_53800f20-93f5-4ab5-9feb-eb325fa0f945/pull/0.log" Mar 13 15:23:12 crc kubenswrapper[4898]: I0313 15:23:12.439511 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh_53800f20-93f5-4ab5-9feb-eb325fa0f945/util/0.log" Mar 13 15:23:12 crc kubenswrapper[4898]: I0313 15:23:12.441474 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh_53800f20-93f5-4ab5-9feb-eb325fa0f945/pull/0.log" Mar 13 15:23:12 crc kubenswrapper[4898]: I0313 15:23:12.469939 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874tx5wh_53800f20-93f5-4ab5-9feb-eb325fa0f945/extract/0.log" Mar 13 15:23:12 crc kubenswrapper[4898]: I0313 15:23:12.631126 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8_53fae31e-a97e-443d-88c2-fa38af842855/util/0.log" Mar 13 15:23:12 crc kubenswrapper[4898]: I0313 15:23:12.813067 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8_53fae31e-a97e-443d-88c2-fa38af842855/util/0.log" Mar 13 15:23:12 crc kubenswrapper[4898]: I0313 15:23:12.837327 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8_53fae31e-a97e-443d-88c2-fa38af842855/pull/0.log" Mar 13 15:23:12 crc kubenswrapper[4898]: I0313 15:23:12.838930 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8_53fae31e-a97e-443d-88c2-fa38af842855/pull/0.log" Mar 13 
15:23:13 crc kubenswrapper[4898]: I0313 15:23:13.017455 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8_53fae31e-a97e-443d-88c2-fa38af842855/pull/0.log" Mar 13 15:23:13 crc kubenswrapper[4898]: I0313 15:23:13.030914 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8_53fae31e-a97e-443d-88c2-fa38af842855/extract/0.log" Mar 13 15:23:13 crc kubenswrapper[4898]: I0313 15:23:13.057259 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1k5mm8_53fae31e-a97e-443d-88c2-fa38af842855/util/0.log" Mar 13 15:23:13 crc kubenswrapper[4898]: I0313 15:23:13.225739 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls_964a321b-4be6-444e-8c20-3fc586008da7/util/0.log" Mar 13 15:23:13 crc kubenswrapper[4898]: I0313 15:23:13.362890 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls_964a321b-4be6-444e-8c20-3fc586008da7/util/0.log" Mar 13 15:23:13 crc kubenswrapper[4898]: I0313 15:23:13.375443 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls_964a321b-4be6-444e-8c20-3fc586008da7/pull/0.log" Mar 13 15:23:13 crc kubenswrapper[4898]: I0313 15:23:13.398579 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls_964a321b-4be6-444e-8c20-3fc586008da7/pull/0.log" Mar 13 15:23:13 crc kubenswrapper[4898]: I0313 15:23:13.582393 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls_964a321b-4be6-444e-8c20-3fc586008da7/util/0.log" Mar 13 15:23:13 crc kubenswrapper[4898]: I0313 15:23:13.586778 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls_964a321b-4be6-444e-8c20-3fc586008da7/pull/0.log" Mar 13 15:23:13 crc kubenswrapper[4898]: I0313 15:23:13.632267 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d54qwls_964a321b-4be6-444e-8c20-3fc586008da7/extract/0.log" Mar 13 15:23:13 crc kubenswrapper[4898]: I0313 15:23:13.770971 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q_e0f45cf5-8d8f-472b-87f5-64e5c8192622/util/0.log" Mar 13 15:23:13 crc kubenswrapper[4898]: I0313 15:23:13.960535 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q_e0f45cf5-8d8f-472b-87f5-64e5c8192622/util/0.log" Mar 13 15:23:13 crc kubenswrapper[4898]: I0313 15:23:13.999178 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q_e0f45cf5-8d8f-472b-87f5-64e5c8192622/pull/0.log" Mar 13 15:23:14 crc kubenswrapper[4898]: I0313 15:23:14.026495 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q_e0f45cf5-8d8f-472b-87f5-64e5c8192622/pull/0.log" Mar 13 15:23:14 crc kubenswrapper[4898]: I0313 15:23:14.211090 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q_e0f45cf5-8d8f-472b-87f5-64e5c8192622/util/0.log" Mar 13 
15:23:14 crc kubenswrapper[4898]: I0313 15:23:14.297606 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q_e0f45cf5-8d8f-472b-87f5-64e5c8192622/pull/0.log" Mar 13 15:23:14 crc kubenswrapper[4898]: I0313 15:23:14.345293 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6cstz4q_e0f45cf5-8d8f-472b-87f5-64e5c8192622/extract/0.log" Mar 13 15:23:14 crc kubenswrapper[4898]: I0313 15:23:14.574627 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4_dd46f989-e694-47a9-9b46-e96b7b47e403/util/0.log" Mar 13 15:23:14 crc kubenswrapper[4898]: I0313 15:23:14.721624 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4_dd46f989-e694-47a9-9b46-e96b7b47e403/util/0.log" Mar 13 15:23:14 crc kubenswrapper[4898]: I0313 15:23:14.726312 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4_dd46f989-e694-47a9-9b46-e96b7b47e403/pull/0.log" Mar 13 15:23:14 crc kubenswrapper[4898]: I0313 15:23:14.727170 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4_dd46f989-e694-47a9-9b46-e96b7b47e403/pull/0.log" Mar 13 15:23:14 crc kubenswrapper[4898]: I0313 15:23:14.941101 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4_dd46f989-e694-47a9-9b46-e96b7b47e403/extract/0.log" Mar 13 15:23:14 crc kubenswrapper[4898]: I0313 15:23:14.944509 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4_dd46f989-e694-47a9-9b46-e96b7b47e403/pull/0.log" Mar 13 15:23:14 crc kubenswrapper[4898]: I0313 15:23:14.952617 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0859mb4_dd46f989-e694-47a9-9b46-e96b7b47e403/util/0.log" Mar 13 15:23:15 crc kubenswrapper[4898]: I0313 15:23:15.160166 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hkbng_89abe4ad-dd62-4a70-a1d1-fdf97448ada5/extract-utilities/0.log" Mar 13 15:23:15 crc kubenswrapper[4898]: I0313 15:23:15.327952 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hkbng_89abe4ad-dd62-4a70-a1d1-fdf97448ada5/extract-utilities/0.log" Mar 13 15:23:15 crc kubenswrapper[4898]: I0313 15:23:15.354996 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hkbng_89abe4ad-dd62-4a70-a1d1-fdf97448ada5/extract-content/0.log" Mar 13 15:23:15 crc kubenswrapper[4898]: I0313 15:23:15.355070 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hkbng_89abe4ad-dd62-4a70-a1d1-fdf97448ada5/extract-content/0.log" Mar 13 15:23:15 crc kubenswrapper[4898]: I0313 15:23:15.578476 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hkbng_89abe4ad-dd62-4a70-a1d1-fdf97448ada5/extract-content/0.log" Mar 13 15:23:15 crc kubenswrapper[4898]: I0313 15:23:15.596597 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hkbng_89abe4ad-dd62-4a70-a1d1-fdf97448ada5/extract-utilities/0.log" Mar 13 15:23:15 crc kubenswrapper[4898]: I0313 15:23:15.660109 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-fcfmz_2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e/extract-utilities/0.log" Mar 13 15:23:15 crc kubenswrapper[4898]: I0313 15:23:15.873836 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fcfmz_2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e/extract-utilities/0.log" Mar 13 15:23:15 crc kubenswrapper[4898]: I0313 15:23:15.882787 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fcfmz_2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e/extract-content/0.log" Mar 13 15:23:15 crc kubenswrapper[4898]: I0313 15:23:15.888156 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fcfmz_2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e/extract-content/0.log" Mar 13 15:23:16 crc kubenswrapper[4898]: I0313 15:23:16.177496 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fcfmz_2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e/extract-content/0.log" Mar 13 15:23:16 crc kubenswrapper[4898]: I0313 15:23:16.233566 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fcfmz_2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e/extract-utilities/0.log" Mar 13 15:23:16 crc kubenswrapper[4898]: I0313 15:23:16.491193 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-z7ng7_b8942bb7-1cd2-49b9-8d98-5ba4c5f6c320/marketplace-operator/0.log" Mar 13 15:23:16 crc kubenswrapper[4898]: I0313 15:23:16.567228 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zs42q_0182307e-bc7f-415e-a0f9-0eff9902384c/extract-utilities/0.log" Mar 13 15:23:16 crc kubenswrapper[4898]: I0313 15:23:16.720264 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-hkbng_89abe4ad-dd62-4a70-a1d1-fdf97448ada5/registry-server/0.log" Mar 13 15:23:16 crc kubenswrapper[4898]: I0313 15:23:16.765944 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zs42q_0182307e-bc7f-415e-a0f9-0eff9902384c/extract-utilities/0.log" Mar 13 15:23:16 crc kubenswrapper[4898]: I0313 15:23:16.825257 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zs42q_0182307e-bc7f-415e-a0f9-0eff9902384c/extract-content/0.log" Mar 13 15:23:16 crc kubenswrapper[4898]: I0313 15:23:16.826619 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zs42q_0182307e-bc7f-415e-a0f9-0eff9902384c/extract-content/0.log" Mar 13 15:23:16 crc kubenswrapper[4898]: I0313 15:23:16.872525 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fcfmz_2cd5c4b0-1bfe-479d-b800-ad19c56f3c9e/registry-server/0.log" Mar 13 15:23:17 crc kubenswrapper[4898]: I0313 15:23:17.025424 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zs42q_0182307e-bc7f-415e-a0f9-0eff9902384c/extract-utilities/0.log" Mar 13 15:23:17 crc kubenswrapper[4898]: I0313 15:23:17.032181 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zs42q_0182307e-bc7f-415e-a0f9-0eff9902384c/extract-content/0.log" Mar 13 15:23:17 crc kubenswrapper[4898]: I0313 15:23:17.087527 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zgjzn_cbb51f06-0778-4b18-82b5-c5ce91e0a613/extract-utilities/0.log" Mar 13 15:23:17 crc kubenswrapper[4898]: I0313 15:23:17.266358 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-zgjzn_cbb51f06-0778-4b18-82b5-c5ce91e0a613/extract-utilities/0.log" Mar 13 15:23:17 crc kubenswrapper[4898]: I0313 15:23:17.300765 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zgjzn_cbb51f06-0778-4b18-82b5-c5ce91e0a613/extract-content/0.log" Mar 13 15:23:17 crc kubenswrapper[4898]: I0313 15:23:17.303328 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zs42q_0182307e-bc7f-415e-a0f9-0eff9902384c/registry-server/0.log" Mar 13 15:23:17 crc kubenswrapper[4898]: I0313 15:23:17.334304 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zgjzn_cbb51f06-0778-4b18-82b5-c5ce91e0a613/extract-content/0.log" Mar 13 15:23:17 crc kubenswrapper[4898]: I0313 15:23:17.584261 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zgjzn_cbb51f06-0778-4b18-82b5-c5ce91e0a613/extract-utilities/0.log" Mar 13 15:23:17 crc kubenswrapper[4898]: I0313 15:23:17.608686 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zgjzn_cbb51f06-0778-4b18-82b5-c5ce91e0a613/extract-content/0.log" Mar 13 15:23:18 crc kubenswrapper[4898]: I0313 15:23:18.380572 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zgjzn_cbb51f06-0778-4b18-82b5-c5ce91e0a613/registry-server/0.log" Mar 13 15:23:22 crc kubenswrapper[4898]: I0313 15:23:22.740720 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:23:22 crc kubenswrapper[4898]: E0313 15:23:22.743107 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:23:31 crc kubenswrapper[4898]: I0313 15:23:31.833073 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7b6b5d6456-jg5kz_951cfcfc-3a8c-410e-a3f5-f5caa10511f5/prometheus-operator-admission-webhook/0.log" Mar 13 15:23:31 crc kubenswrapper[4898]: I0313 15:23:31.842457 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-5r9gm_30c06063-b926-4f2e-b8d1-8c530cc5b0a9/prometheus-operator/0.log" Mar 13 15:23:31 crc kubenswrapper[4898]: I0313 15:23:31.877708 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7b6b5d6456-qldnm_8c190eee-747b-4a45-905c-fa0235080305/prometheus-operator-admission-webhook/0.log" Mar 13 15:23:32 crc kubenswrapper[4898]: I0313 15:23:32.036989 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-ljrtz_3bfc0332-bb59-42bf-bb70-462efa225c81/operator/1.log" Mar 13 15:23:32 crc kubenswrapper[4898]: I0313 15:23:32.088701 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-ljrtz_3bfc0332-bb59-42bf-bb70-462efa225c81/operator/0.log" Mar 13 15:23:32 crc kubenswrapper[4898]: I0313 15:23:32.100105 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-66cbf594b5-gpj8b_ad052248-8fcd-4ef6-9969-5023b87bbbf9/observability-ui-dashboards/0.log" Mar 13 15:23:32 crc kubenswrapper[4898]: I0313 15:23:32.150604 4898 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-nkt76_79ead8ee-67ba-4831-b5d4-a1f128e94334/perses-operator/0.log" Mar 13 15:23:36 crc kubenswrapper[4898]: I0313 15:23:36.740101 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:23:36 crc kubenswrapper[4898]: E0313 15:23:36.741187 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:23:47 crc kubenswrapper[4898]: I0313 15:23:47.021875 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-5fb555ff84-j52b8_2cd05b5b-32da-4560-a761-72221b99e2c6/kube-rbac-proxy/0.log" Mar 13 15:23:47 crc kubenswrapper[4898]: I0313 15:23:47.097197 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-5fb555ff84-j52b8_2cd05b5b-32da-4560-a761-72221b99e2c6/manager/0.log" Mar 13 15:23:51 crc kubenswrapper[4898]: I0313 15:23:51.740081 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:23:51 crc kubenswrapper[4898]: E0313 15:23:51.740978 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" 
Mar 13 15:24:00 crc kubenswrapper[4898]: I0313 15:24:00.158566 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556924-vg49w"] Mar 13 15:24:00 crc kubenswrapper[4898]: E0313 15:24:00.159629 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af39c392-cb6d-4afc-837c-9cbf245a9856" containerName="oc" Mar 13 15:24:00 crc kubenswrapper[4898]: I0313 15:24:00.159645 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="af39c392-cb6d-4afc-837c-9cbf245a9856" containerName="oc" Mar 13 15:24:00 crc kubenswrapper[4898]: I0313 15:24:00.159871 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="af39c392-cb6d-4afc-837c-9cbf245a9856" containerName="oc" Mar 13 15:24:00 crc kubenswrapper[4898]: I0313 15:24:00.161222 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556924-vg49w" Mar 13 15:24:00 crc kubenswrapper[4898]: I0313 15:24:00.164745 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:24:00 crc kubenswrapper[4898]: I0313 15:24:00.165042 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 15:24:00 crc kubenswrapper[4898]: I0313 15:24:00.165215 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:24:00 crc kubenswrapper[4898]: I0313 15:24:00.175275 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556924-vg49w"] Mar 13 15:24:00 crc kubenswrapper[4898]: I0313 15:24:00.296594 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh57t\" (UniqueName: \"kubernetes.io/projected/3b1a5763-109f-4888-97bb-eeb7cd25ff69-kube-api-access-sh57t\") pod \"auto-csr-approver-29556924-vg49w\" (UID: 
\"3b1a5763-109f-4888-97bb-eeb7cd25ff69\") " pod="openshift-infra/auto-csr-approver-29556924-vg49w" Mar 13 15:24:00 crc kubenswrapper[4898]: I0313 15:24:00.398733 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh57t\" (UniqueName: \"kubernetes.io/projected/3b1a5763-109f-4888-97bb-eeb7cd25ff69-kube-api-access-sh57t\") pod \"auto-csr-approver-29556924-vg49w\" (UID: \"3b1a5763-109f-4888-97bb-eeb7cd25ff69\") " pod="openshift-infra/auto-csr-approver-29556924-vg49w" Mar 13 15:24:00 crc kubenswrapper[4898]: I0313 15:24:00.448435 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh57t\" (UniqueName: \"kubernetes.io/projected/3b1a5763-109f-4888-97bb-eeb7cd25ff69-kube-api-access-sh57t\") pod \"auto-csr-approver-29556924-vg49w\" (UID: \"3b1a5763-109f-4888-97bb-eeb7cd25ff69\") " pod="openshift-infra/auto-csr-approver-29556924-vg49w" Mar 13 15:24:00 crc kubenswrapper[4898]: I0313 15:24:00.486918 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556924-vg49w" Mar 13 15:24:01 crc kubenswrapper[4898]: I0313 15:24:01.305434 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556924-vg49w"] Mar 13 15:24:01 crc kubenswrapper[4898]: W0313 15:24:01.310505 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b1a5763_109f_4888_97bb_eeb7cd25ff69.slice/crio-ee9adcc76a22529be0fdb51986d191444c5cef25a069c6809a38672b448765c3 WatchSource:0}: Error finding container ee9adcc76a22529be0fdb51986d191444c5cef25a069c6809a38672b448765c3: Status 404 returned error can't find the container with id ee9adcc76a22529be0fdb51986d191444c5cef25a069c6809a38672b448765c3 Mar 13 15:24:02 crc kubenswrapper[4898]: I0313 15:24:02.039046 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556924-vg49w" event={"ID":"3b1a5763-109f-4888-97bb-eeb7cd25ff69","Type":"ContainerStarted","Data":"ee9adcc76a22529be0fdb51986d191444c5cef25a069c6809a38672b448765c3"} Mar 13 15:24:03 crc kubenswrapper[4898]: I0313 15:24:03.741170 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:24:03 crc kubenswrapper[4898]: E0313 15:24:03.742099 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:24:04 crc kubenswrapper[4898]: I0313 15:24:04.064764 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556924-vg49w" 
event={"ID":"3b1a5763-109f-4888-97bb-eeb7cd25ff69","Type":"ContainerStarted","Data":"c8ba8026ba786ec97dc1c956429b165e911caa22ae98facccf3eabf821d09223"} Mar 13 15:24:04 crc kubenswrapper[4898]: I0313 15:24:04.082468 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556924-vg49w" podStartSLOduration=2.935648285 podStartE2EDuration="4.082443859s" podCreationTimestamp="2026-03-13 15:24:00 +0000 UTC" firstStartedPulling="2026-03-13 15:24:01.314270087 +0000 UTC m=+5276.315858326" lastFinishedPulling="2026-03-13 15:24:02.461065661 +0000 UTC m=+5277.462653900" observedRunningTime="2026-03-13 15:24:04.07768723 +0000 UTC m=+5279.079275479" watchObservedRunningTime="2026-03-13 15:24:04.082443859 +0000 UTC m=+5279.084032088" Mar 13 15:24:07 crc kubenswrapper[4898]: I0313 15:24:07.108726 4898 generic.go:334] "Generic (PLEG): container finished" podID="3b1a5763-109f-4888-97bb-eeb7cd25ff69" containerID="c8ba8026ba786ec97dc1c956429b165e911caa22ae98facccf3eabf821d09223" exitCode=0 Mar 13 15:24:07 crc kubenswrapper[4898]: I0313 15:24:07.109239 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556924-vg49w" event={"ID":"3b1a5763-109f-4888-97bb-eeb7cd25ff69","Type":"ContainerDied","Data":"c8ba8026ba786ec97dc1c956429b165e911caa22ae98facccf3eabf821d09223"} Mar 13 15:24:08 crc kubenswrapper[4898]: I0313 15:24:08.591294 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556924-vg49w" Mar 13 15:24:08 crc kubenswrapper[4898]: I0313 15:24:08.689067 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sh57t\" (UniqueName: \"kubernetes.io/projected/3b1a5763-109f-4888-97bb-eeb7cd25ff69-kube-api-access-sh57t\") pod \"3b1a5763-109f-4888-97bb-eeb7cd25ff69\" (UID: \"3b1a5763-109f-4888-97bb-eeb7cd25ff69\") " Mar 13 15:24:08 crc kubenswrapper[4898]: I0313 15:24:08.717995 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b1a5763-109f-4888-97bb-eeb7cd25ff69-kube-api-access-sh57t" (OuterVolumeSpecName: "kube-api-access-sh57t") pod "3b1a5763-109f-4888-97bb-eeb7cd25ff69" (UID: "3b1a5763-109f-4888-97bb-eeb7cd25ff69"). InnerVolumeSpecName "kube-api-access-sh57t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:24:08 crc kubenswrapper[4898]: I0313 15:24:08.791736 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sh57t\" (UniqueName: \"kubernetes.io/projected/3b1a5763-109f-4888-97bb-eeb7cd25ff69-kube-api-access-sh57t\") on node \"crc\" DevicePath \"\"" Mar 13 15:24:09 crc kubenswrapper[4898]: I0313 15:24:09.136403 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556924-vg49w" event={"ID":"3b1a5763-109f-4888-97bb-eeb7cd25ff69","Type":"ContainerDied","Data":"ee9adcc76a22529be0fdb51986d191444c5cef25a069c6809a38672b448765c3"} Mar 13 15:24:09 crc kubenswrapper[4898]: I0313 15:24:09.136453 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee9adcc76a22529be0fdb51986d191444c5cef25a069c6809a38672b448765c3" Mar 13 15:24:09 crc kubenswrapper[4898]: I0313 15:24:09.136453 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556924-vg49w" Mar 13 15:24:09 crc kubenswrapper[4898]: I0313 15:24:09.222687 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556918-gcw8c"] Mar 13 15:24:09 crc kubenswrapper[4898]: I0313 15:24:09.238218 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556918-gcw8c"] Mar 13 15:24:09 crc kubenswrapper[4898]: I0313 15:24:09.756714 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85e19347-9341-49c0-9195-97e383796cb3" path="/var/lib/kubelet/pods/85e19347-9341-49c0-9195-97e383796cb3/volumes" Mar 13 15:24:16 crc kubenswrapper[4898]: I0313 15:24:16.739645 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:24:16 crc kubenswrapper[4898]: E0313 15:24:16.740548 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:24:29 crc kubenswrapper[4898]: I0313 15:24:29.739443 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:24:29 crc kubenswrapper[4898]: E0313 15:24:29.740219 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" 
podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:24:42 crc kubenswrapper[4898]: I0313 15:24:42.740296 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:24:42 crc kubenswrapper[4898]: E0313 15:24:42.741216 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:24:49 crc kubenswrapper[4898]: I0313 15:24:49.197955 4898 scope.go:117] "RemoveContainer" containerID="41c4a045c361afa3d6c1c9f58f41c2c132e20bbf6ef35d1cfbb029e2412c57e3" Mar 13 15:24:49 crc kubenswrapper[4898]: I0313 15:24:49.271949 4898 scope.go:117] "RemoveContainer" containerID="2124311011b6d2dbdb56bd6bbe60e2fefb84036c80bad5c23f6cbdc48089d7e2" Mar 13 15:24:54 crc kubenswrapper[4898]: I0313 15:24:54.740342 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:24:54 crc kubenswrapper[4898]: E0313 15:24:54.741966 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:25:09 crc kubenswrapper[4898]: I0313 15:25:09.740777 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:25:09 crc kubenswrapper[4898]: E0313 15:25:09.741645 4898 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:25:23 crc kubenswrapper[4898]: I0313 15:25:23.740034 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:25:23 crc kubenswrapper[4898]: E0313 15:25:23.740982 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:25:34 crc kubenswrapper[4898]: I0313 15:25:34.740599 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:25:34 crc kubenswrapper[4898]: E0313 15:25:34.741886 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:25:41 crc kubenswrapper[4898]: I0313 15:25:41.465544 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gn5g7"] Mar 13 15:25:41 crc kubenswrapper[4898]: E0313 
15:25:41.467008 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b1a5763-109f-4888-97bb-eeb7cd25ff69" containerName="oc" Mar 13 15:25:41 crc kubenswrapper[4898]: I0313 15:25:41.467037 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b1a5763-109f-4888-97bb-eeb7cd25ff69" containerName="oc" Mar 13 15:25:41 crc kubenswrapper[4898]: I0313 15:25:41.468405 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b1a5763-109f-4888-97bb-eeb7cd25ff69" containerName="oc" Mar 13 15:25:41 crc kubenswrapper[4898]: I0313 15:25:41.476557 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gn5g7" Mar 13 15:25:41 crc kubenswrapper[4898]: I0313 15:25:41.483603 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gn5g7"] Mar 13 15:25:41 crc kubenswrapper[4898]: I0313 15:25:41.555295 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9c51c71-efe8-43ff-88ed-fb12b1121692-utilities\") pod \"certified-operators-gn5g7\" (UID: \"d9c51c71-efe8-43ff-88ed-fb12b1121692\") " pod="openshift-marketplace/certified-operators-gn5g7" Mar 13 15:25:41 crc kubenswrapper[4898]: I0313 15:25:41.555416 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqvdw\" (UniqueName: \"kubernetes.io/projected/d9c51c71-efe8-43ff-88ed-fb12b1121692-kube-api-access-qqvdw\") pod \"certified-operators-gn5g7\" (UID: \"d9c51c71-efe8-43ff-88ed-fb12b1121692\") " pod="openshift-marketplace/certified-operators-gn5g7" Mar 13 15:25:41 crc kubenswrapper[4898]: I0313 15:25:41.555808 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9c51c71-efe8-43ff-88ed-fb12b1121692-catalog-content\") pod 
\"certified-operators-gn5g7\" (UID: \"d9c51c71-efe8-43ff-88ed-fb12b1121692\") " pod="openshift-marketplace/certified-operators-gn5g7" Mar 13 15:25:41 crc kubenswrapper[4898]: I0313 15:25:41.658084 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9c51c71-efe8-43ff-88ed-fb12b1121692-catalog-content\") pod \"certified-operators-gn5g7\" (UID: \"d9c51c71-efe8-43ff-88ed-fb12b1121692\") " pod="openshift-marketplace/certified-operators-gn5g7" Mar 13 15:25:41 crc kubenswrapper[4898]: I0313 15:25:41.658269 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9c51c71-efe8-43ff-88ed-fb12b1121692-utilities\") pod \"certified-operators-gn5g7\" (UID: \"d9c51c71-efe8-43ff-88ed-fb12b1121692\") " pod="openshift-marketplace/certified-operators-gn5g7" Mar 13 15:25:41 crc kubenswrapper[4898]: I0313 15:25:41.658311 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqvdw\" (UniqueName: \"kubernetes.io/projected/d9c51c71-efe8-43ff-88ed-fb12b1121692-kube-api-access-qqvdw\") pod \"certified-operators-gn5g7\" (UID: \"d9c51c71-efe8-43ff-88ed-fb12b1121692\") " pod="openshift-marketplace/certified-operators-gn5g7" Mar 13 15:25:41 crc kubenswrapper[4898]: I0313 15:25:41.658772 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9c51c71-efe8-43ff-88ed-fb12b1121692-utilities\") pod \"certified-operators-gn5g7\" (UID: \"d9c51c71-efe8-43ff-88ed-fb12b1121692\") " pod="openshift-marketplace/certified-operators-gn5g7" Mar 13 15:25:41 crc kubenswrapper[4898]: I0313 15:25:41.658773 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9c51c71-efe8-43ff-88ed-fb12b1121692-catalog-content\") pod \"certified-operators-gn5g7\" (UID: 
\"d9c51c71-efe8-43ff-88ed-fb12b1121692\") " pod="openshift-marketplace/certified-operators-gn5g7" Mar 13 15:25:41 crc kubenswrapper[4898]: I0313 15:25:41.688200 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqvdw\" (UniqueName: \"kubernetes.io/projected/d9c51c71-efe8-43ff-88ed-fb12b1121692-kube-api-access-qqvdw\") pod \"certified-operators-gn5g7\" (UID: \"d9c51c71-efe8-43ff-88ed-fb12b1121692\") " pod="openshift-marketplace/certified-operators-gn5g7" Mar 13 15:25:41 crc kubenswrapper[4898]: I0313 15:25:41.813227 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gn5g7" Mar 13 15:25:42 crc kubenswrapper[4898]: I0313 15:25:42.296608 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gn5g7"] Mar 13 15:25:42 crc kubenswrapper[4898]: W0313 15:25:42.301485 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9c51c71_efe8_43ff_88ed_fb12b1121692.slice/crio-fb24a79a6bf951be31d041d3c4d4e9c588aa8c8a21fd22b6d401f9f1916f5390 WatchSource:0}: Error finding container fb24a79a6bf951be31d041d3c4d4e9c588aa8c8a21fd22b6d401f9f1916f5390: Status 404 returned error can't find the container with id fb24a79a6bf951be31d041d3c4d4e9c588aa8c8a21fd22b6d401f9f1916f5390 Mar 13 15:25:42 crc kubenswrapper[4898]: I0313 15:25:42.359867 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn5g7" event={"ID":"d9c51c71-efe8-43ff-88ed-fb12b1121692","Type":"ContainerStarted","Data":"fb24a79a6bf951be31d041d3c4d4e9c588aa8c8a21fd22b6d401f9f1916f5390"} Mar 13 15:25:43 crc kubenswrapper[4898]: I0313 15:25:43.375782 4898 generic.go:334] "Generic (PLEG): container finished" podID="d9c51c71-efe8-43ff-88ed-fb12b1121692" containerID="e2c92886e7f1cfdd1bddd5db247f5bde333fba025a961a77f68a3d06b731aed7" exitCode=0 Mar 13 15:25:43 
crc kubenswrapper[4898]: I0313 15:25:43.375839 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn5g7" event={"ID":"d9c51c71-efe8-43ff-88ed-fb12b1121692","Type":"ContainerDied","Data":"e2c92886e7f1cfdd1bddd5db247f5bde333fba025a961a77f68a3d06b731aed7"} Mar 13 15:25:45 crc kubenswrapper[4898]: I0313 15:25:45.409345 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn5g7" event={"ID":"d9c51c71-efe8-43ff-88ed-fb12b1121692","Type":"ContainerStarted","Data":"549f23a7d00a31d7ea04ef20fbcf64834236aa87809459f07d2f89a5e0d047c7"} Mar 13 15:25:46 crc kubenswrapper[4898]: I0313 15:25:46.424011 4898 generic.go:334] "Generic (PLEG): container finished" podID="d9c51c71-efe8-43ff-88ed-fb12b1121692" containerID="549f23a7d00a31d7ea04ef20fbcf64834236aa87809459f07d2f89a5e0d047c7" exitCode=0 Mar 13 15:25:46 crc kubenswrapper[4898]: I0313 15:25:46.424108 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn5g7" event={"ID":"d9c51c71-efe8-43ff-88ed-fb12b1121692","Type":"ContainerDied","Data":"549f23a7d00a31d7ea04ef20fbcf64834236aa87809459f07d2f89a5e0d047c7"} Mar 13 15:25:47 crc kubenswrapper[4898]: I0313 15:25:47.442047 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn5g7" event={"ID":"d9c51c71-efe8-43ff-88ed-fb12b1121692","Type":"ContainerStarted","Data":"4c7f09252b43d299991670c1d10ac5f6fc47b8d7ac308b4e48773964aa6e0e5e"} Mar 13 15:25:47 crc kubenswrapper[4898]: I0313 15:25:47.478813 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gn5g7" podStartSLOduration=3.023432986 podStartE2EDuration="6.478787395s" podCreationTimestamp="2026-03-13 15:25:41 +0000 UTC" firstStartedPulling="2026-03-13 15:25:43.378665299 +0000 UTC m=+5378.380253538" lastFinishedPulling="2026-03-13 15:25:46.834019708 +0000 UTC m=+5381.835607947" 
observedRunningTime="2026-03-13 15:25:47.472046156 +0000 UTC m=+5382.473634405" watchObservedRunningTime="2026-03-13 15:25:47.478787395 +0000 UTC m=+5382.480375644" Mar 13 15:25:49 crc kubenswrapper[4898]: I0313 15:25:49.410739 4898 scope.go:117] "RemoveContainer" containerID="16ab5a558dadc1a752795f9720b3efadfbd0fc9afffa700ac13e16901fd9b881" Mar 13 15:25:49 crc kubenswrapper[4898]: I0313 15:25:49.739408 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:25:50 crc kubenswrapper[4898]: I0313 15:25:50.505667 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"3f1eb02ceee77301060a512e8cdb70aa1cab3b74525898dd1df91250d09e1006"} Mar 13 15:25:51 crc kubenswrapper[4898]: I0313 15:25:51.522286 4898 generic.go:334] "Generic (PLEG): container finished" podID="9ef69d80-7edf-459b-a521-b45bc90a18df" containerID="2ef5e9977f8ef2b175ed18b48018fc8614ba4246f0d291287b1a2a287a12ca83" exitCode=0 Mar 13 15:25:51 crc kubenswrapper[4898]: I0313 15:25:51.522378 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-b6mrl/must-gather-cklv9" event={"ID":"9ef69d80-7edf-459b-a521-b45bc90a18df","Type":"ContainerDied","Data":"2ef5e9977f8ef2b175ed18b48018fc8614ba4246f0d291287b1a2a287a12ca83"} Mar 13 15:25:51 crc kubenswrapper[4898]: I0313 15:25:51.523708 4898 scope.go:117] "RemoveContainer" containerID="2ef5e9977f8ef2b175ed18b48018fc8614ba4246f0d291287b1a2a287a12ca83" Mar 13 15:25:51 crc kubenswrapper[4898]: I0313 15:25:51.814138 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gn5g7" Mar 13 15:25:51 crc kubenswrapper[4898]: I0313 15:25:51.814232 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gn5g7" Mar 
13 15:25:51 crc kubenswrapper[4898]: I0313 15:25:51.885757 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gn5g7" Mar 13 15:25:52 crc kubenswrapper[4898]: I0313 15:25:52.362601 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-b6mrl_must-gather-cklv9_9ef69d80-7edf-459b-a521-b45bc90a18df/gather/0.log" Mar 13 15:25:52 crc kubenswrapper[4898]: I0313 15:25:52.424631 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pls2z"] Mar 13 15:25:52 crc kubenswrapper[4898]: I0313 15:25:52.427210 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pls2z" Mar 13 15:25:52 crc kubenswrapper[4898]: I0313 15:25:52.445755 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pls2z"] Mar 13 15:25:52 crc kubenswrapper[4898]: I0313 15:25:52.576754 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b7976f2-bd4a-4ee5-983a-d1a70216276a-utilities\") pod \"redhat-marketplace-pls2z\" (UID: \"5b7976f2-bd4a-4ee5-983a-d1a70216276a\") " pod="openshift-marketplace/redhat-marketplace-pls2z" Mar 13 15:25:52 crc kubenswrapper[4898]: I0313 15:25:52.576804 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b7976f2-bd4a-4ee5-983a-d1a70216276a-catalog-content\") pod \"redhat-marketplace-pls2z\" (UID: \"5b7976f2-bd4a-4ee5-983a-d1a70216276a\") " pod="openshift-marketplace/redhat-marketplace-pls2z" Mar 13 15:25:52 crc kubenswrapper[4898]: I0313 15:25:52.576870 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7zxr\" (UniqueName: 
\"kubernetes.io/projected/5b7976f2-bd4a-4ee5-983a-d1a70216276a-kube-api-access-k7zxr\") pod \"redhat-marketplace-pls2z\" (UID: \"5b7976f2-bd4a-4ee5-983a-d1a70216276a\") " pod="openshift-marketplace/redhat-marketplace-pls2z" Mar 13 15:25:52 crc kubenswrapper[4898]: I0313 15:25:52.599465 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gn5g7" Mar 13 15:25:52 crc kubenswrapper[4898]: I0313 15:25:52.679361 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b7976f2-bd4a-4ee5-983a-d1a70216276a-utilities\") pod \"redhat-marketplace-pls2z\" (UID: \"5b7976f2-bd4a-4ee5-983a-d1a70216276a\") " pod="openshift-marketplace/redhat-marketplace-pls2z" Mar 13 15:25:52 crc kubenswrapper[4898]: I0313 15:25:52.679413 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b7976f2-bd4a-4ee5-983a-d1a70216276a-catalog-content\") pod \"redhat-marketplace-pls2z\" (UID: \"5b7976f2-bd4a-4ee5-983a-d1a70216276a\") " pod="openshift-marketplace/redhat-marketplace-pls2z" Mar 13 15:25:52 crc kubenswrapper[4898]: I0313 15:25:52.679505 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7zxr\" (UniqueName: \"kubernetes.io/projected/5b7976f2-bd4a-4ee5-983a-d1a70216276a-kube-api-access-k7zxr\") pod \"redhat-marketplace-pls2z\" (UID: \"5b7976f2-bd4a-4ee5-983a-d1a70216276a\") " pod="openshift-marketplace/redhat-marketplace-pls2z" Mar 13 15:25:52 crc kubenswrapper[4898]: I0313 15:25:52.680020 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b7976f2-bd4a-4ee5-983a-d1a70216276a-utilities\") pod \"redhat-marketplace-pls2z\" (UID: \"5b7976f2-bd4a-4ee5-983a-d1a70216276a\") " pod="openshift-marketplace/redhat-marketplace-pls2z" Mar 13 15:25:52 crc 
kubenswrapper[4898]: I0313 15:25:52.680083 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b7976f2-bd4a-4ee5-983a-d1a70216276a-catalog-content\") pod \"redhat-marketplace-pls2z\" (UID: \"5b7976f2-bd4a-4ee5-983a-d1a70216276a\") " pod="openshift-marketplace/redhat-marketplace-pls2z" Mar 13 15:25:52 crc kubenswrapper[4898]: I0313 15:25:52.701835 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7zxr\" (UniqueName: \"kubernetes.io/projected/5b7976f2-bd4a-4ee5-983a-d1a70216276a-kube-api-access-k7zxr\") pod \"redhat-marketplace-pls2z\" (UID: \"5b7976f2-bd4a-4ee5-983a-d1a70216276a\") " pod="openshift-marketplace/redhat-marketplace-pls2z" Mar 13 15:25:52 crc kubenswrapper[4898]: I0313 15:25:52.756135 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pls2z" Mar 13 15:25:53 crc kubenswrapper[4898]: I0313 15:25:53.334031 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pls2z"] Mar 13 15:25:53 crc kubenswrapper[4898]: I0313 15:25:53.547579 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pls2z" event={"ID":"5b7976f2-bd4a-4ee5-983a-d1a70216276a","Type":"ContainerStarted","Data":"816344487fa42199348e57ae5e56299ab0a59791c53b5816fd0a0d3e54766c35"} Mar 13 15:25:54 crc kubenswrapper[4898]: I0313 15:25:54.562477 4898 generic.go:334] "Generic (PLEG): container finished" podID="5b7976f2-bd4a-4ee5-983a-d1a70216276a" containerID="3c65064ffc6c2566c5d496a3c0e222de310ed5233942c93a5f1d36321884cd5a" exitCode=0 Mar 13 15:25:54 crc kubenswrapper[4898]: I0313 15:25:54.563181 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pls2z" 
event={"ID":"5b7976f2-bd4a-4ee5-983a-d1a70216276a","Type":"ContainerDied","Data":"3c65064ffc6c2566c5d496a3c0e222de310ed5233942c93a5f1d36321884cd5a"} Mar 13 15:25:54 crc kubenswrapper[4898]: I0313 15:25:54.926024 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gn5g7"] Mar 13 15:25:54 crc kubenswrapper[4898]: I0313 15:25:54.926274 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gn5g7" podUID="d9c51c71-efe8-43ff-88ed-fb12b1121692" containerName="registry-server" containerID="cri-o://4c7f09252b43d299991670c1d10ac5f6fc47b8d7ac308b4e48773964aa6e0e5e" gracePeriod=2 Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.486224 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gn5g7" Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.583756 4898 generic.go:334] "Generic (PLEG): container finished" podID="d9c51c71-efe8-43ff-88ed-fb12b1121692" containerID="4c7f09252b43d299991670c1d10ac5f6fc47b8d7ac308b4e48773964aa6e0e5e" exitCode=0 Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.583867 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn5g7" event={"ID":"d9c51c71-efe8-43ff-88ed-fb12b1121692","Type":"ContainerDied","Data":"4c7f09252b43d299991670c1d10ac5f6fc47b8d7ac308b4e48773964aa6e0e5e"} Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.583977 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gn5g7" Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.584076 4898 scope.go:117] "RemoveContainer" containerID="4c7f09252b43d299991670c1d10ac5f6fc47b8d7ac308b4e48773964aa6e0e5e" Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.584061 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gn5g7" event={"ID":"d9c51c71-efe8-43ff-88ed-fb12b1121692","Type":"ContainerDied","Data":"fb24a79a6bf951be31d041d3c4d4e9c588aa8c8a21fd22b6d401f9f1916f5390"} Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.605252 4898 scope.go:117] "RemoveContainer" containerID="549f23a7d00a31d7ea04ef20fbcf64834236aa87809459f07d2f89a5e0d047c7" Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.645155 4898 scope.go:117] "RemoveContainer" containerID="e2c92886e7f1cfdd1bddd5db247f5bde333fba025a961a77f68a3d06b731aed7" Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.660202 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9c51c71-efe8-43ff-88ed-fb12b1121692-utilities\") pod \"d9c51c71-efe8-43ff-88ed-fb12b1121692\" (UID: \"d9c51c71-efe8-43ff-88ed-fb12b1121692\") " Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.660367 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9c51c71-efe8-43ff-88ed-fb12b1121692-catalog-content\") pod \"d9c51c71-efe8-43ff-88ed-fb12b1121692\" (UID: \"d9c51c71-efe8-43ff-88ed-fb12b1121692\") " Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.660425 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqvdw\" (UniqueName: \"kubernetes.io/projected/d9c51c71-efe8-43ff-88ed-fb12b1121692-kube-api-access-qqvdw\") pod \"d9c51c71-efe8-43ff-88ed-fb12b1121692\" (UID: 
\"d9c51c71-efe8-43ff-88ed-fb12b1121692\") " Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.661229 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9c51c71-efe8-43ff-88ed-fb12b1121692-utilities" (OuterVolumeSpecName: "utilities") pod "d9c51c71-efe8-43ff-88ed-fb12b1121692" (UID: "d9c51c71-efe8-43ff-88ed-fb12b1121692"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.666920 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9c51c71-efe8-43ff-88ed-fb12b1121692-kube-api-access-qqvdw" (OuterVolumeSpecName: "kube-api-access-qqvdw") pod "d9c51c71-efe8-43ff-88ed-fb12b1121692" (UID: "d9c51c71-efe8-43ff-88ed-fb12b1121692"). InnerVolumeSpecName "kube-api-access-qqvdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.726196 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9c51c71-efe8-43ff-88ed-fb12b1121692-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d9c51c71-efe8-43ff-88ed-fb12b1121692" (UID: "d9c51c71-efe8-43ff-88ed-fb12b1121692"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.761662 4898 scope.go:117] "RemoveContainer" containerID="4c7f09252b43d299991670c1d10ac5f6fc47b8d7ac308b4e48773964aa6e0e5e" Mar 13 15:25:55 crc kubenswrapper[4898]: E0313 15:25:55.762205 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c7f09252b43d299991670c1d10ac5f6fc47b8d7ac308b4e48773964aa6e0e5e\": container with ID starting with 4c7f09252b43d299991670c1d10ac5f6fc47b8d7ac308b4e48773964aa6e0e5e not found: ID does not exist" containerID="4c7f09252b43d299991670c1d10ac5f6fc47b8d7ac308b4e48773964aa6e0e5e" Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.762247 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c7f09252b43d299991670c1d10ac5f6fc47b8d7ac308b4e48773964aa6e0e5e"} err="failed to get container status \"4c7f09252b43d299991670c1d10ac5f6fc47b8d7ac308b4e48773964aa6e0e5e\": rpc error: code = NotFound desc = could not find container \"4c7f09252b43d299991670c1d10ac5f6fc47b8d7ac308b4e48773964aa6e0e5e\": container with ID starting with 4c7f09252b43d299991670c1d10ac5f6fc47b8d7ac308b4e48773964aa6e0e5e not found: ID does not exist" Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.762274 4898 scope.go:117] "RemoveContainer" containerID="549f23a7d00a31d7ea04ef20fbcf64834236aa87809459f07d2f89a5e0d047c7" Mar 13 15:25:55 crc kubenswrapper[4898]: E0313 15:25:55.762559 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"549f23a7d00a31d7ea04ef20fbcf64834236aa87809459f07d2f89a5e0d047c7\": container with ID starting with 549f23a7d00a31d7ea04ef20fbcf64834236aa87809459f07d2f89a5e0d047c7 not found: ID does not exist" containerID="549f23a7d00a31d7ea04ef20fbcf64834236aa87809459f07d2f89a5e0d047c7" Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.762593 
4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"549f23a7d00a31d7ea04ef20fbcf64834236aa87809459f07d2f89a5e0d047c7"} err="failed to get container status \"549f23a7d00a31d7ea04ef20fbcf64834236aa87809459f07d2f89a5e0d047c7\": rpc error: code = NotFound desc = could not find container \"549f23a7d00a31d7ea04ef20fbcf64834236aa87809459f07d2f89a5e0d047c7\": container with ID starting with 549f23a7d00a31d7ea04ef20fbcf64834236aa87809459f07d2f89a5e0d047c7 not found: ID does not exist" Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.762613 4898 scope.go:117] "RemoveContainer" containerID="e2c92886e7f1cfdd1bddd5db247f5bde333fba025a961a77f68a3d06b731aed7" Mar 13 15:25:55 crc kubenswrapper[4898]: E0313 15:25:55.762908 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2c92886e7f1cfdd1bddd5db247f5bde333fba025a961a77f68a3d06b731aed7\": container with ID starting with e2c92886e7f1cfdd1bddd5db247f5bde333fba025a961a77f68a3d06b731aed7 not found: ID does not exist" containerID="e2c92886e7f1cfdd1bddd5db247f5bde333fba025a961a77f68a3d06b731aed7" Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.762926 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2c92886e7f1cfdd1bddd5db247f5bde333fba025a961a77f68a3d06b731aed7"} err="failed to get container status \"e2c92886e7f1cfdd1bddd5db247f5bde333fba025a961a77f68a3d06b731aed7\": rpc error: code = NotFound desc = could not find container \"e2c92886e7f1cfdd1bddd5db247f5bde333fba025a961a77f68a3d06b731aed7\": container with ID starting with e2c92886e7f1cfdd1bddd5db247f5bde333fba025a961a77f68a3d06b731aed7 not found: ID does not exist" Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.763786 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9c51c71-efe8-43ff-88ed-fb12b1121692-utilities\") on node 
\"crc\" DevicePath \"\"" Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.763824 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9c51c71-efe8-43ff-88ed-fb12b1121692-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.763839 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqvdw\" (UniqueName: \"kubernetes.io/projected/d9c51c71-efe8-43ff-88ed-fb12b1121692-kube-api-access-qqvdw\") on node \"crc\" DevicePath \"\"" Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.923137 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gn5g7"] Mar 13 15:25:55 crc kubenswrapper[4898]: I0313 15:25:55.947036 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gn5g7"] Mar 13 15:25:56 crc kubenswrapper[4898]: I0313 15:25:56.599318 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pls2z" event={"ID":"5b7976f2-bd4a-4ee5-983a-d1a70216276a","Type":"ContainerStarted","Data":"5ecab09bd2ab7b4b32fbe2e46ce657e55527cc6849c9ae5279f6c7c9af36dddd"} Mar 13 15:25:57 crc kubenswrapper[4898]: I0313 15:25:57.615259 4898 generic.go:334] "Generic (PLEG): container finished" podID="5b7976f2-bd4a-4ee5-983a-d1a70216276a" containerID="5ecab09bd2ab7b4b32fbe2e46ce657e55527cc6849c9ae5279f6c7c9af36dddd" exitCode=0 Mar 13 15:25:57 crc kubenswrapper[4898]: I0313 15:25:57.615300 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pls2z" event={"ID":"5b7976f2-bd4a-4ee5-983a-d1a70216276a","Type":"ContainerDied","Data":"5ecab09bd2ab7b4b32fbe2e46ce657e55527cc6849c9ae5279f6c7c9af36dddd"} Mar 13 15:25:57 crc kubenswrapper[4898]: I0313 15:25:57.752697 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9c51c71-efe8-43ff-88ed-fb12b1121692" 
path="/var/lib/kubelet/pods/d9c51c71-efe8-43ff-88ed-fb12b1121692/volumes" Mar 13 15:25:59 crc kubenswrapper[4898]: I0313 15:25:59.664234 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pls2z" event={"ID":"5b7976f2-bd4a-4ee5-983a-d1a70216276a","Type":"ContainerStarted","Data":"4ae368c42be13bda887861e91de282ddb8e20b18c6dba5950a3ac8b4ed0d117a"} Mar 13 15:25:59 crc kubenswrapper[4898]: I0313 15:25:59.698205 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pls2z" podStartSLOduration=3.785012205 podStartE2EDuration="7.698180838s" podCreationTimestamp="2026-03-13 15:25:52 +0000 UTC" firstStartedPulling="2026-03-13 15:25:54.566692732 +0000 UTC m=+5389.568280971" lastFinishedPulling="2026-03-13 15:25:58.479861365 +0000 UTC m=+5393.481449604" observedRunningTime="2026-03-13 15:25:59.689915461 +0000 UTC m=+5394.691503720" watchObservedRunningTime="2026-03-13 15:25:59.698180838 +0000 UTC m=+5394.699769077" Mar 13 15:26:00 crc kubenswrapper[4898]: I0313 15:26:00.147036 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556926-znpr2"] Mar 13 15:26:00 crc kubenswrapper[4898]: E0313 15:26:00.147826 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9c51c71-efe8-43ff-88ed-fb12b1121692" containerName="extract-utilities" Mar 13 15:26:00 crc kubenswrapper[4898]: I0313 15:26:00.147847 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9c51c71-efe8-43ff-88ed-fb12b1121692" containerName="extract-utilities" Mar 13 15:26:00 crc kubenswrapper[4898]: E0313 15:26:00.147882 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9c51c71-efe8-43ff-88ed-fb12b1121692" containerName="extract-content" Mar 13 15:26:00 crc kubenswrapper[4898]: I0313 15:26:00.147888 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9c51c71-efe8-43ff-88ed-fb12b1121692" containerName="extract-content" 
Mar 13 15:26:00 crc kubenswrapper[4898]: E0313 15:26:00.147934 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9c51c71-efe8-43ff-88ed-fb12b1121692" containerName="registry-server" Mar 13 15:26:00 crc kubenswrapper[4898]: I0313 15:26:00.147940 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9c51c71-efe8-43ff-88ed-fb12b1121692" containerName="registry-server" Mar 13 15:26:00 crc kubenswrapper[4898]: I0313 15:26:00.148200 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9c51c71-efe8-43ff-88ed-fb12b1121692" containerName="registry-server" Mar 13 15:26:00 crc kubenswrapper[4898]: I0313 15:26:00.149162 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556926-znpr2" Mar 13 15:26:00 crc kubenswrapper[4898]: I0313 15:26:00.153282 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:26:00 crc kubenswrapper[4898]: I0313 15:26:00.153309 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:26:00 crc kubenswrapper[4898]: I0313 15:26:00.153838 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 15:26:00 crc kubenswrapper[4898]: I0313 15:26:00.159257 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556926-znpr2"] Mar 13 15:26:00 crc kubenswrapper[4898]: I0313 15:26:00.289347 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvlph\" (UniqueName: \"kubernetes.io/projected/fcf6d966-2758-453f-9308-fd452766462b-kube-api-access-hvlph\") pod \"auto-csr-approver-29556926-znpr2\" (UID: \"fcf6d966-2758-453f-9308-fd452766462b\") " pod="openshift-infra/auto-csr-approver-29556926-znpr2" Mar 13 15:26:00 crc kubenswrapper[4898]: I0313 15:26:00.393679 
4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvlph\" (UniqueName: \"kubernetes.io/projected/fcf6d966-2758-453f-9308-fd452766462b-kube-api-access-hvlph\") pod \"auto-csr-approver-29556926-znpr2\" (UID: \"fcf6d966-2758-453f-9308-fd452766462b\") " pod="openshift-infra/auto-csr-approver-29556926-znpr2" Mar 13 15:26:00 crc kubenswrapper[4898]: I0313 15:26:00.421357 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvlph\" (UniqueName: \"kubernetes.io/projected/fcf6d966-2758-453f-9308-fd452766462b-kube-api-access-hvlph\") pod \"auto-csr-approver-29556926-znpr2\" (UID: \"fcf6d966-2758-453f-9308-fd452766462b\") " pod="openshift-infra/auto-csr-approver-29556926-znpr2" Mar 13 15:26:00 crc kubenswrapper[4898]: I0313 15:26:00.472239 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556926-znpr2" Mar 13 15:26:01 crc kubenswrapper[4898]: W0313 15:26:01.027376 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcf6d966_2758_453f_9308_fd452766462b.slice/crio-a61d000ce5c31fb2ba5ceef42b97a73083635ac5a421036e8e4ac49f91eecb08 WatchSource:0}: Error finding container a61d000ce5c31fb2ba5ceef42b97a73083635ac5a421036e8e4ac49f91eecb08: Status 404 returned error can't find the container with id a61d000ce5c31fb2ba5ceef42b97a73083635ac5a421036e8e4ac49f91eecb08 Mar 13 15:26:01 crc kubenswrapper[4898]: I0313 15:26:01.028719 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556926-znpr2"] Mar 13 15:26:01 crc kubenswrapper[4898]: I0313 15:26:01.736795 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556926-znpr2" event={"ID":"fcf6d966-2758-453f-9308-fd452766462b","Type":"ContainerStarted","Data":"a61d000ce5c31fb2ba5ceef42b97a73083635ac5a421036e8e4ac49f91eecb08"} 
Mar 13 15:26:02 crc kubenswrapper[4898]: I0313 15:26:02.756460 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pls2z" Mar 13 15:26:02 crc kubenswrapper[4898]: I0313 15:26:02.757776 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pls2z" Mar 13 15:26:03 crc kubenswrapper[4898]: I0313 15:26:03.759623 4898 generic.go:334] "Generic (PLEG): container finished" podID="fcf6d966-2758-453f-9308-fd452766462b" containerID="f74ca8934196198fdc7fe5e94130ffb287ea4038d28afe421add852586aae005" exitCode=0 Mar 13 15:26:03 crc kubenswrapper[4898]: I0313 15:26:03.759956 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556926-znpr2" event={"ID":"fcf6d966-2758-453f-9308-fd452766462b","Type":"ContainerDied","Data":"f74ca8934196198fdc7fe5e94130ffb287ea4038d28afe421add852586aae005"} Mar 13 15:26:03 crc kubenswrapper[4898]: I0313 15:26:03.811156 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-pls2z" podUID="5b7976f2-bd4a-4ee5-983a-d1a70216276a" containerName="registry-server" probeResult="failure" output=< Mar 13 15:26:03 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:26:03 crc kubenswrapper[4898]: > Mar 13 15:26:04 crc kubenswrapper[4898]: I0313 15:26:04.000027 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-b6mrl/must-gather-cklv9"] Mar 13 15:26:04 crc kubenswrapper[4898]: I0313 15:26:04.000291 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-b6mrl/must-gather-cklv9" podUID="9ef69d80-7edf-459b-a521-b45bc90a18df" containerName="copy" containerID="cri-o://0cf38cc33a2de3371ea65a40a67cc8243723533a2159a61bfc4ad7a679d1eafb" gracePeriod=2 Mar 13 15:26:04 crc kubenswrapper[4898]: I0313 15:26:04.013197 4898 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openshift-must-gather-b6mrl/must-gather-cklv9"] Mar 13 15:26:04 crc kubenswrapper[4898]: I0313 15:26:04.542985 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-b6mrl_must-gather-cklv9_9ef69d80-7edf-459b-a521-b45bc90a18df/copy/0.log" Mar 13 15:26:04 crc kubenswrapper[4898]: I0313 15:26:04.546334 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b6mrl/must-gather-cklv9" Mar 13 15:26:04 crc kubenswrapper[4898]: I0313 15:26:04.626102 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfhwg\" (UniqueName: \"kubernetes.io/projected/9ef69d80-7edf-459b-a521-b45bc90a18df-kube-api-access-cfhwg\") pod \"9ef69d80-7edf-459b-a521-b45bc90a18df\" (UID: \"9ef69d80-7edf-459b-a521-b45bc90a18df\") " Mar 13 15:26:04 crc kubenswrapper[4898]: I0313 15:26:04.626256 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9ef69d80-7edf-459b-a521-b45bc90a18df-must-gather-output\") pod \"9ef69d80-7edf-459b-a521-b45bc90a18df\" (UID: \"9ef69d80-7edf-459b-a521-b45bc90a18df\") " Mar 13 15:26:04 crc kubenswrapper[4898]: I0313 15:26:04.648728 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ef69d80-7edf-459b-a521-b45bc90a18df-kube-api-access-cfhwg" (OuterVolumeSpecName: "kube-api-access-cfhwg") pod "9ef69d80-7edf-459b-a521-b45bc90a18df" (UID: "9ef69d80-7edf-459b-a521-b45bc90a18df"). InnerVolumeSpecName "kube-api-access-cfhwg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:26:04 crc kubenswrapper[4898]: I0313 15:26:04.732912 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfhwg\" (UniqueName: \"kubernetes.io/projected/9ef69d80-7edf-459b-a521-b45bc90a18df-kube-api-access-cfhwg\") on node \"crc\" DevicePath \"\"" Mar 13 15:26:04 crc kubenswrapper[4898]: I0313 15:26:04.781111 4898 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-b6mrl_must-gather-cklv9_9ef69d80-7edf-459b-a521-b45bc90a18df/copy/0.log" Mar 13 15:26:04 crc kubenswrapper[4898]: I0313 15:26:04.782853 4898 generic.go:334] "Generic (PLEG): container finished" podID="9ef69d80-7edf-459b-a521-b45bc90a18df" containerID="0cf38cc33a2de3371ea65a40a67cc8243723533a2159a61bfc4ad7a679d1eafb" exitCode=143 Mar 13 15:26:04 crc kubenswrapper[4898]: I0313 15:26:04.783098 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-b6mrl/must-gather-cklv9" Mar 13 15:26:04 crc kubenswrapper[4898]: I0313 15:26:04.784044 4898 scope.go:117] "RemoveContainer" containerID="0cf38cc33a2de3371ea65a40a67cc8243723533a2159a61bfc4ad7a679d1eafb" Mar 13 15:26:04 crc kubenswrapper[4898]: I0313 15:26:04.816092 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ef69d80-7edf-459b-a521-b45bc90a18df-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "9ef69d80-7edf-459b-a521-b45bc90a18df" (UID: "9ef69d80-7edf-459b-a521-b45bc90a18df"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:26:04 crc kubenswrapper[4898]: I0313 15:26:04.835327 4898 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9ef69d80-7edf-459b-a521-b45bc90a18df-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 13 15:26:04 crc kubenswrapper[4898]: I0313 15:26:04.872388 4898 scope.go:117] "RemoveContainer" containerID="2ef5e9977f8ef2b175ed18b48018fc8614ba4246f0d291287b1a2a287a12ca83" Mar 13 15:26:04 crc kubenswrapper[4898]: I0313 15:26:04.958955 4898 scope.go:117] "RemoveContainer" containerID="0cf38cc33a2de3371ea65a40a67cc8243723533a2159a61bfc4ad7a679d1eafb" Mar 13 15:26:04 crc kubenswrapper[4898]: E0313 15:26:04.962317 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cf38cc33a2de3371ea65a40a67cc8243723533a2159a61bfc4ad7a679d1eafb\": container with ID starting with 0cf38cc33a2de3371ea65a40a67cc8243723533a2159a61bfc4ad7a679d1eafb not found: ID does not exist" containerID="0cf38cc33a2de3371ea65a40a67cc8243723533a2159a61bfc4ad7a679d1eafb" Mar 13 15:26:04 crc kubenswrapper[4898]: I0313 15:26:04.962354 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cf38cc33a2de3371ea65a40a67cc8243723533a2159a61bfc4ad7a679d1eafb"} err="failed to get container status \"0cf38cc33a2de3371ea65a40a67cc8243723533a2159a61bfc4ad7a679d1eafb\": rpc error: code = NotFound desc = could not find container \"0cf38cc33a2de3371ea65a40a67cc8243723533a2159a61bfc4ad7a679d1eafb\": container with ID starting with 0cf38cc33a2de3371ea65a40a67cc8243723533a2159a61bfc4ad7a679d1eafb not found: ID does not exist" Mar 13 15:26:04 crc kubenswrapper[4898]: I0313 15:26:04.962377 4898 scope.go:117] "RemoveContainer" containerID="2ef5e9977f8ef2b175ed18b48018fc8614ba4246f0d291287b1a2a287a12ca83" Mar 13 15:26:04 crc kubenswrapper[4898]: E0313 15:26:04.967432 4898 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ef5e9977f8ef2b175ed18b48018fc8614ba4246f0d291287b1a2a287a12ca83\": container with ID starting with 2ef5e9977f8ef2b175ed18b48018fc8614ba4246f0d291287b1a2a287a12ca83 not found: ID does not exist" containerID="2ef5e9977f8ef2b175ed18b48018fc8614ba4246f0d291287b1a2a287a12ca83" Mar 13 15:26:04 crc kubenswrapper[4898]: I0313 15:26:04.967459 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ef5e9977f8ef2b175ed18b48018fc8614ba4246f0d291287b1a2a287a12ca83"} err="failed to get container status \"2ef5e9977f8ef2b175ed18b48018fc8614ba4246f0d291287b1a2a287a12ca83\": rpc error: code = NotFound desc = could not find container \"2ef5e9977f8ef2b175ed18b48018fc8614ba4246f0d291287b1a2a287a12ca83\": container with ID starting with 2ef5e9977f8ef2b175ed18b48018fc8614ba4246f0d291287b1a2a287a12ca83 not found: ID does not exist" Mar 13 15:26:05 crc kubenswrapper[4898]: I0313 15:26:05.341181 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556926-znpr2" Mar 13 15:26:05 crc kubenswrapper[4898]: I0313 15:26:05.452028 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvlph\" (UniqueName: \"kubernetes.io/projected/fcf6d966-2758-453f-9308-fd452766462b-kube-api-access-hvlph\") pod \"fcf6d966-2758-453f-9308-fd452766462b\" (UID: \"fcf6d966-2758-453f-9308-fd452766462b\") " Mar 13 15:26:05 crc kubenswrapper[4898]: I0313 15:26:05.480523 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcf6d966-2758-453f-9308-fd452766462b-kube-api-access-hvlph" (OuterVolumeSpecName: "kube-api-access-hvlph") pod "fcf6d966-2758-453f-9308-fd452766462b" (UID: "fcf6d966-2758-453f-9308-fd452766462b"). InnerVolumeSpecName "kube-api-access-hvlph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:26:05 crc kubenswrapper[4898]: I0313 15:26:05.555001 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvlph\" (UniqueName: \"kubernetes.io/projected/fcf6d966-2758-453f-9308-fd452766462b-kube-api-access-hvlph\") on node \"crc\" DevicePath \"\"" Mar 13 15:26:05 crc kubenswrapper[4898]: I0313 15:26:05.768672 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ef69d80-7edf-459b-a521-b45bc90a18df" path="/var/lib/kubelet/pods/9ef69d80-7edf-459b-a521-b45bc90a18df/volumes" Mar 13 15:26:05 crc kubenswrapper[4898]: I0313 15:26:05.798865 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556926-znpr2" event={"ID":"fcf6d966-2758-453f-9308-fd452766462b","Type":"ContainerDied","Data":"a61d000ce5c31fb2ba5ceef42b97a73083635ac5a421036e8e4ac49f91eecb08"} Mar 13 15:26:05 crc kubenswrapper[4898]: I0313 15:26:05.799184 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a61d000ce5c31fb2ba5ceef42b97a73083635ac5a421036e8e4ac49f91eecb08" Mar 13 15:26:05 crc kubenswrapper[4898]: I0313 15:26:05.799244 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556926-znpr2" Mar 13 15:26:06 crc kubenswrapper[4898]: I0313 15:26:06.427684 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556920-jxmxv"] Mar 13 15:26:06 crc kubenswrapper[4898]: I0313 15:26:06.441948 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556920-jxmxv"] Mar 13 15:26:07 crc kubenswrapper[4898]: I0313 15:26:07.752671 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6caf987f-dbe2-48d2-8138-107de40fe224" path="/var/lib/kubelet/pods/6caf987f-dbe2-48d2-8138-107de40fe224/volumes" Mar 13 15:26:12 crc kubenswrapper[4898]: I0313 15:26:12.806668 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pls2z" Mar 13 15:26:12 crc kubenswrapper[4898]: I0313 15:26:12.860920 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pls2z" Mar 13 15:26:13 crc kubenswrapper[4898]: I0313 15:26:13.052649 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pls2z"] Mar 13 15:26:13 crc kubenswrapper[4898]: I0313 15:26:13.886078 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pls2z" podUID="5b7976f2-bd4a-4ee5-983a-d1a70216276a" containerName="registry-server" containerID="cri-o://4ae368c42be13bda887861e91de282ddb8e20b18c6dba5950a3ac8b4ed0d117a" gracePeriod=2 Mar 13 15:26:14 crc kubenswrapper[4898]: I0313 15:26:14.482616 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pls2z" Mar 13 15:26:14 crc kubenswrapper[4898]: I0313 15:26:14.592303 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b7976f2-bd4a-4ee5-983a-d1a70216276a-utilities\") pod \"5b7976f2-bd4a-4ee5-983a-d1a70216276a\" (UID: \"5b7976f2-bd4a-4ee5-983a-d1a70216276a\") " Mar 13 15:26:14 crc kubenswrapper[4898]: I0313 15:26:14.593119 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b7976f2-bd4a-4ee5-983a-d1a70216276a-utilities" (OuterVolumeSpecName: "utilities") pod "5b7976f2-bd4a-4ee5-983a-d1a70216276a" (UID: "5b7976f2-bd4a-4ee5-983a-d1a70216276a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:26:14 crc kubenswrapper[4898]: I0313 15:26:14.593338 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b7976f2-bd4a-4ee5-983a-d1a70216276a-catalog-content\") pod \"5b7976f2-bd4a-4ee5-983a-d1a70216276a\" (UID: \"5b7976f2-bd4a-4ee5-983a-d1a70216276a\") " Mar 13 15:26:14 crc kubenswrapper[4898]: I0313 15:26:14.607265 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7zxr\" (UniqueName: \"kubernetes.io/projected/5b7976f2-bd4a-4ee5-983a-d1a70216276a-kube-api-access-k7zxr\") pod \"5b7976f2-bd4a-4ee5-983a-d1a70216276a\" (UID: \"5b7976f2-bd4a-4ee5-983a-d1a70216276a\") " Mar 13 15:26:14 crc kubenswrapper[4898]: I0313 15:26:14.608339 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b7976f2-bd4a-4ee5-983a-d1a70216276a-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 15:26:14 crc kubenswrapper[4898]: I0313 15:26:14.618374 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/5b7976f2-bd4a-4ee5-983a-d1a70216276a-kube-api-access-k7zxr" (OuterVolumeSpecName: "kube-api-access-k7zxr") pod "5b7976f2-bd4a-4ee5-983a-d1a70216276a" (UID: "5b7976f2-bd4a-4ee5-983a-d1a70216276a"). InnerVolumeSpecName "kube-api-access-k7zxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:26:14 crc kubenswrapper[4898]: I0313 15:26:14.626096 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b7976f2-bd4a-4ee5-983a-d1a70216276a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5b7976f2-bd4a-4ee5-983a-d1a70216276a" (UID: "5b7976f2-bd4a-4ee5-983a-d1a70216276a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:26:14 crc kubenswrapper[4898]: I0313 15:26:14.710554 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b7976f2-bd4a-4ee5-983a-d1a70216276a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 15:26:14 crc kubenswrapper[4898]: I0313 15:26:14.710807 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7zxr\" (UniqueName: \"kubernetes.io/projected/5b7976f2-bd4a-4ee5-983a-d1a70216276a-kube-api-access-k7zxr\") on node \"crc\" DevicePath \"\"" Mar 13 15:26:14 crc kubenswrapper[4898]: I0313 15:26:14.910672 4898 generic.go:334] "Generic (PLEG): container finished" podID="5b7976f2-bd4a-4ee5-983a-d1a70216276a" containerID="4ae368c42be13bda887861e91de282ddb8e20b18c6dba5950a3ac8b4ed0d117a" exitCode=0 Mar 13 15:26:14 crc kubenswrapper[4898]: I0313 15:26:14.910723 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pls2z" event={"ID":"5b7976f2-bd4a-4ee5-983a-d1a70216276a","Type":"ContainerDied","Data":"4ae368c42be13bda887861e91de282ddb8e20b18c6dba5950a3ac8b4ed0d117a"} Mar 13 15:26:14 crc kubenswrapper[4898]: I0313 15:26:14.910748 4898 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pls2z" Mar 13 15:26:14 crc kubenswrapper[4898]: I0313 15:26:14.910754 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pls2z" event={"ID":"5b7976f2-bd4a-4ee5-983a-d1a70216276a","Type":"ContainerDied","Data":"816344487fa42199348e57ae5e56299ab0a59791c53b5816fd0a0d3e54766c35"} Mar 13 15:26:14 crc kubenswrapper[4898]: I0313 15:26:14.910782 4898 scope.go:117] "RemoveContainer" containerID="4ae368c42be13bda887861e91de282ddb8e20b18c6dba5950a3ac8b4ed0d117a" Mar 13 15:26:14 crc kubenswrapper[4898]: I0313 15:26:14.939851 4898 scope.go:117] "RemoveContainer" containerID="5ecab09bd2ab7b4b32fbe2e46ce657e55527cc6849c9ae5279f6c7c9af36dddd" Mar 13 15:26:14 crc kubenswrapper[4898]: I0313 15:26:14.957672 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pls2z"] Mar 13 15:26:14 crc kubenswrapper[4898]: I0313 15:26:14.968023 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pls2z"] Mar 13 15:26:14 crc kubenswrapper[4898]: I0313 15:26:14.994070 4898 scope.go:117] "RemoveContainer" containerID="3c65064ffc6c2566c5d496a3c0e222de310ed5233942c93a5f1d36321884cd5a" Mar 13 15:26:15 crc kubenswrapper[4898]: I0313 15:26:15.049385 4898 scope.go:117] "RemoveContainer" containerID="4ae368c42be13bda887861e91de282ddb8e20b18c6dba5950a3ac8b4ed0d117a" Mar 13 15:26:15 crc kubenswrapper[4898]: E0313 15:26:15.050066 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ae368c42be13bda887861e91de282ddb8e20b18c6dba5950a3ac8b4ed0d117a\": container with ID starting with 4ae368c42be13bda887861e91de282ddb8e20b18c6dba5950a3ac8b4ed0d117a not found: ID does not exist" containerID="4ae368c42be13bda887861e91de282ddb8e20b18c6dba5950a3ac8b4ed0d117a" Mar 13 15:26:15 crc kubenswrapper[4898]: I0313 
15:26:15.050097 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ae368c42be13bda887861e91de282ddb8e20b18c6dba5950a3ac8b4ed0d117a"} err="failed to get container status \"4ae368c42be13bda887861e91de282ddb8e20b18c6dba5950a3ac8b4ed0d117a\": rpc error: code = NotFound desc = could not find container \"4ae368c42be13bda887861e91de282ddb8e20b18c6dba5950a3ac8b4ed0d117a\": container with ID starting with 4ae368c42be13bda887861e91de282ddb8e20b18c6dba5950a3ac8b4ed0d117a not found: ID does not exist" Mar 13 15:26:15 crc kubenswrapper[4898]: I0313 15:26:15.050119 4898 scope.go:117] "RemoveContainer" containerID="5ecab09bd2ab7b4b32fbe2e46ce657e55527cc6849c9ae5279f6c7c9af36dddd" Mar 13 15:26:15 crc kubenswrapper[4898]: E0313 15:26:15.050871 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ecab09bd2ab7b4b32fbe2e46ce657e55527cc6849c9ae5279f6c7c9af36dddd\": container with ID starting with 5ecab09bd2ab7b4b32fbe2e46ce657e55527cc6849c9ae5279f6c7c9af36dddd not found: ID does not exist" containerID="5ecab09bd2ab7b4b32fbe2e46ce657e55527cc6849c9ae5279f6c7c9af36dddd" Mar 13 15:26:15 crc kubenswrapper[4898]: I0313 15:26:15.050888 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ecab09bd2ab7b4b32fbe2e46ce657e55527cc6849c9ae5279f6c7c9af36dddd"} err="failed to get container status \"5ecab09bd2ab7b4b32fbe2e46ce657e55527cc6849c9ae5279f6c7c9af36dddd\": rpc error: code = NotFound desc = could not find container \"5ecab09bd2ab7b4b32fbe2e46ce657e55527cc6849c9ae5279f6c7c9af36dddd\": container with ID starting with 5ecab09bd2ab7b4b32fbe2e46ce657e55527cc6849c9ae5279f6c7c9af36dddd not found: ID does not exist" Mar 13 15:26:15 crc kubenswrapper[4898]: I0313 15:26:15.050933 4898 scope.go:117] "RemoveContainer" containerID="3c65064ffc6c2566c5d496a3c0e222de310ed5233942c93a5f1d36321884cd5a" Mar 13 15:26:15 crc 
kubenswrapper[4898]: E0313 15:26:15.051268 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c65064ffc6c2566c5d496a3c0e222de310ed5233942c93a5f1d36321884cd5a\": container with ID starting with 3c65064ffc6c2566c5d496a3c0e222de310ed5233942c93a5f1d36321884cd5a not found: ID does not exist" containerID="3c65064ffc6c2566c5d496a3c0e222de310ed5233942c93a5f1d36321884cd5a" Mar 13 15:26:15 crc kubenswrapper[4898]: I0313 15:26:15.051307 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c65064ffc6c2566c5d496a3c0e222de310ed5233942c93a5f1d36321884cd5a"} err="failed to get container status \"3c65064ffc6c2566c5d496a3c0e222de310ed5233942c93a5f1d36321884cd5a\": rpc error: code = NotFound desc = could not find container \"3c65064ffc6c2566c5d496a3c0e222de310ed5233942c93a5f1d36321884cd5a\": container with ID starting with 3c65064ffc6c2566c5d496a3c0e222de310ed5233942c93a5f1d36321884cd5a not found: ID does not exist" Mar 13 15:26:15 crc kubenswrapper[4898]: I0313 15:26:15.761409 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b7976f2-bd4a-4ee5-983a-d1a70216276a" path="/var/lib/kubelet/pods/5b7976f2-bd4a-4ee5-983a-d1a70216276a/volumes" Mar 13 15:26:49 crc kubenswrapper[4898]: I0313 15:26:49.515662 4898 scope.go:117] "RemoveContainer" containerID="897e6256c632c42295b8912f57e8c6461493cccdecec91e50f43154fdcb913e4" Mar 13 15:27:17 crc kubenswrapper[4898]: I0313 15:27:17.305295 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2tg8f"] Mar 13 15:27:17 crc kubenswrapper[4898]: E0313 15:27:17.306225 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b7976f2-bd4a-4ee5-983a-d1a70216276a" containerName="extract-utilities" Mar 13 15:27:17 crc kubenswrapper[4898]: I0313 15:27:17.306238 4898 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5b7976f2-bd4a-4ee5-983a-d1a70216276a" containerName="extract-utilities" Mar 13 15:27:17 crc kubenswrapper[4898]: E0313 15:27:17.306255 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef69d80-7edf-459b-a521-b45bc90a18df" containerName="gather" Mar 13 15:27:17 crc kubenswrapper[4898]: I0313 15:27:17.306261 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef69d80-7edf-459b-a521-b45bc90a18df" containerName="gather" Mar 13 15:27:17 crc kubenswrapper[4898]: E0313 15:27:17.306276 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef69d80-7edf-459b-a521-b45bc90a18df" containerName="copy" Mar 13 15:27:17 crc kubenswrapper[4898]: I0313 15:27:17.306282 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef69d80-7edf-459b-a521-b45bc90a18df" containerName="copy" Mar 13 15:27:17 crc kubenswrapper[4898]: E0313 15:27:17.306292 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcf6d966-2758-453f-9308-fd452766462b" containerName="oc" Mar 13 15:27:17 crc kubenswrapper[4898]: I0313 15:27:17.306298 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcf6d966-2758-453f-9308-fd452766462b" containerName="oc" Mar 13 15:27:17 crc kubenswrapper[4898]: E0313 15:27:17.306335 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b7976f2-bd4a-4ee5-983a-d1a70216276a" containerName="extract-content" Mar 13 15:27:17 crc kubenswrapper[4898]: I0313 15:27:17.306340 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b7976f2-bd4a-4ee5-983a-d1a70216276a" containerName="extract-content" Mar 13 15:27:17 crc kubenswrapper[4898]: E0313 15:27:17.306360 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b7976f2-bd4a-4ee5-983a-d1a70216276a" containerName="registry-server" Mar 13 15:27:17 crc kubenswrapper[4898]: I0313 15:27:17.306366 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b7976f2-bd4a-4ee5-983a-d1a70216276a" containerName="registry-server" 
Mar 13 15:27:17 crc kubenswrapper[4898]: I0313 15:27:17.306607 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b7976f2-bd4a-4ee5-983a-d1a70216276a" containerName="registry-server" Mar 13 15:27:17 crc kubenswrapper[4898]: I0313 15:27:17.306625 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ef69d80-7edf-459b-a521-b45bc90a18df" containerName="gather" Mar 13 15:27:17 crc kubenswrapper[4898]: I0313 15:27:17.306637 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcf6d966-2758-453f-9308-fd452766462b" containerName="oc" Mar 13 15:27:17 crc kubenswrapper[4898]: I0313 15:27:17.306648 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ef69d80-7edf-459b-a521-b45bc90a18df" containerName="copy" Mar 13 15:27:17 crc kubenswrapper[4898]: I0313 15:27:17.308306 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2tg8f" Mar 13 15:27:17 crc kubenswrapper[4898]: I0313 15:27:17.324006 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2tg8f"] Mar 13 15:27:17 crc kubenswrapper[4898]: I0313 15:27:17.450599 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7115acf9-09f0-41e8-995a-d6179b077f37-utilities\") pod \"community-operators-2tg8f\" (UID: \"7115acf9-09f0-41e8-995a-d6179b077f37\") " pod="openshift-marketplace/community-operators-2tg8f" Mar 13 15:27:17 crc kubenswrapper[4898]: I0313 15:27:17.450642 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8j8z\" (UniqueName: \"kubernetes.io/projected/7115acf9-09f0-41e8-995a-d6179b077f37-kube-api-access-x8j8z\") pod \"community-operators-2tg8f\" (UID: \"7115acf9-09f0-41e8-995a-d6179b077f37\") " pod="openshift-marketplace/community-operators-2tg8f" Mar 13 15:27:17 crc 
kubenswrapper[4898]: I0313 15:27:17.450667 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7115acf9-09f0-41e8-995a-d6179b077f37-catalog-content\") pod \"community-operators-2tg8f\" (UID: \"7115acf9-09f0-41e8-995a-d6179b077f37\") " pod="openshift-marketplace/community-operators-2tg8f" Mar 13 15:27:17 crc kubenswrapper[4898]: I0313 15:27:17.554303 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7115acf9-09f0-41e8-995a-d6179b077f37-utilities\") pod \"community-operators-2tg8f\" (UID: \"7115acf9-09f0-41e8-995a-d6179b077f37\") " pod="openshift-marketplace/community-operators-2tg8f" Mar 13 15:27:17 crc kubenswrapper[4898]: I0313 15:27:17.554350 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8j8z\" (UniqueName: \"kubernetes.io/projected/7115acf9-09f0-41e8-995a-d6179b077f37-kube-api-access-x8j8z\") pod \"community-operators-2tg8f\" (UID: \"7115acf9-09f0-41e8-995a-d6179b077f37\") " pod="openshift-marketplace/community-operators-2tg8f" Mar 13 15:27:17 crc kubenswrapper[4898]: I0313 15:27:17.554379 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7115acf9-09f0-41e8-995a-d6179b077f37-catalog-content\") pod \"community-operators-2tg8f\" (UID: \"7115acf9-09f0-41e8-995a-d6179b077f37\") " pod="openshift-marketplace/community-operators-2tg8f" Mar 13 15:27:17 crc kubenswrapper[4898]: I0313 15:27:17.554787 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7115acf9-09f0-41e8-995a-d6179b077f37-utilities\") pod \"community-operators-2tg8f\" (UID: \"7115acf9-09f0-41e8-995a-d6179b077f37\") " pod="openshift-marketplace/community-operators-2tg8f" Mar 13 15:27:17 crc 
kubenswrapper[4898]: I0313 15:27:17.554982 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7115acf9-09f0-41e8-995a-d6179b077f37-catalog-content\") pod \"community-operators-2tg8f\" (UID: \"7115acf9-09f0-41e8-995a-d6179b077f37\") " pod="openshift-marketplace/community-operators-2tg8f" Mar 13 15:27:17 crc kubenswrapper[4898]: I0313 15:27:17.579194 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8j8z\" (UniqueName: \"kubernetes.io/projected/7115acf9-09f0-41e8-995a-d6179b077f37-kube-api-access-x8j8z\") pod \"community-operators-2tg8f\" (UID: \"7115acf9-09f0-41e8-995a-d6179b077f37\") " pod="openshift-marketplace/community-operators-2tg8f" Mar 13 15:27:17 crc kubenswrapper[4898]: I0313 15:27:17.630407 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2tg8f" Mar 13 15:27:18 crc kubenswrapper[4898]: W0313 15:27:18.075083 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7115acf9_09f0_41e8_995a_d6179b077f37.slice/crio-e72ff055139d18ced04dc4a34b3ef184c3ee1412b5fe2f74149d60e4602be818 WatchSource:0}: Error finding container e72ff055139d18ced04dc4a34b3ef184c3ee1412b5fe2f74149d60e4602be818: Status 404 returned error can't find the container with id e72ff055139d18ced04dc4a34b3ef184c3ee1412b5fe2f74149d60e4602be818 Mar 13 15:27:18 crc kubenswrapper[4898]: I0313 15:27:18.087509 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2tg8f"] Mar 13 15:27:18 crc kubenswrapper[4898]: I0313 15:27:18.739225 4898 generic.go:334] "Generic (PLEG): container finished" podID="7115acf9-09f0-41e8-995a-d6179b077f37" containerID="0666265fc78810dbefb33bbd5683f8cd5f4ecd31ff868744637cab3d9d6e5c57" exitCode=0 Mar 13 15:27:18 crc kubenswrapper[4898]: I0313 15:27:18.739331 4898 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2tg8f" event={"ID":"7115acf9-09f0-41e8-995a-d6179b077f37","Type":"ContainerDied","Data":"0666265fc78810dbefb33bbd5683f8cd5f4ecd31ff868744637cab3d9d6e5c57"} Mar 13 15:27:18 crc kubenswrapper[4898]: I0313 15:27:18.739566 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2tg8f" event={"ID":"7115acf9-09f0-41e8-995a-d6179b077f37","Type":"ContainerStarted","Data":"e72ff055139d18ced04dc4a34b3ef184c3ee1412b5fe2f74149d60e4602be818"} Mar 13 15:27:18 crc kubenswrapper[4898]: I0313 15:27:18.741839 4898 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 15:27:20 crc kubenswrapper[4898]: I0313 15:27:20.765912 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2tg8f" event={"ID":"7115acf9-09f0-41e8-995a-d6179b077f37","Type":"ContainerStarted","Data":"f12cd3290c44cc1be0f9eb2d7479c249511acdb1fcd8803dde3c5b2a04e04188"} Mar 13 15:27:23 crc kubenswrapper[4898]: I0313 15:27:23.803783 4898 generic.go:334] "Generic (PLEG): container finished" podID="7115acf9-09f0-41e8-995a-d6179b077f37" containerID="f12cd3290c44cc1be0f9eb2d7479c249511acdb1fcd8803dde3c5b2a04e04188" exitCode=0 Mar 13 15:27:23 crc kubenswrapper[4898]: I0313 15:27:23.803830 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2tg8f" event={"ID":"7115acf9-09f0-41e8-995a-d6179b077f37","Type":"ContainerDied","Data":"f12cd3290c44cc1be0f9eb2d7479c249511acdb1fcd8803dde3c5b2a04e04188"} Mar 13 15:27:24 crc kubenswrapper[4898]: I0313 15:27:24.825662 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2tg8f" event={"ID":"7115acf9-09f0-41e8-995a-d6179b077f37","Type":"ContainerStarted","Data":"446a41b05d145e4e858c2243a1f4e5bd6917418930f03e0e56941e08e147a165"} Mar 13 15:27:24 crc 
kubenswrapper[4898]: I0313 15:27:24.865340 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2tg8f" podStartSLOduration=2.126632061 podStartE2EDuration="7.865317974s" podCreationTimestamp="2026-03-13 15:27:17 +0000 UTC" firstStartedPulling="2026-03-13 15:27:18.741632117 +0000 UTC m=+5473.743220356" lastFinishedPulling="2026-03-13 15:27:24.48031803 +0000 UTC m=+5479.481906269" observedRunningTime="2026-03-13 15:27:24.854892763 +0000 UTC m=+5479.856481032" watchObservedRunningTime="2026-03-13 15:27:24.865317974 +0000 UTC m=+5479.866906223" Mar 13 15:27:27 crc kubenswrapper[4898]: I0313 15:27:27.630893 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2tg8f" Mar 13 15:27:27 crc kubenswrapper[4898]: I0313 15:27:27.633315 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2tg8f" Mar 13 15:27:28 crc kubenswrapper[4898]: I0313 15:27:28.976190 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-2tg8f" podUID="7115acf9-09f0-41e8-995a-d6179b077f37" containerName="registry-server" probeResult="failure" output=< Mar 13 15:27:28 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:27:28 crc kubenswrapper[4898]: > Mar 13 15:27:37 crc kubenswrapper[4898]: I0313 15:27:37.707493 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2tg8f" Mar 13 15:27:37 crc kubenswrapper[4898]: I0313 15:27:37.786006 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2tg8f" Mar 13 15:27:37 crc kubenswrapper[4898]: I0313 15:27:37.971822 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2tg8f"] Mar 13 15:27:38 crc 
kubenswrapper[4898]: I0313 15:27:38.996292 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2tg8f" podUID="7115acf9-09f0-41e8-995a-d6179b077f37" containerName="registry-server" containerID="cri-o://446a41b05d145e4e858c2243a1f4e5bd6917418930f03e0e56941e08e147a165" gracePeriod=2 Mar 13 15:27:39 crc kubenswrapper[4898]: I0313 15:27:39.518420 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2tg8f" Mar 13 15:27:39 crc kubenswrapper[4898]: I0313 15:27:39.610782 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7115acf9-09f0-41e8-995a-d6179b077f37-utilities\") pod \"7115acf9-09f0-41e8-995a-d6179b077f37\" (UID: \"7115acf9-09f0-41e8-995a-d6179b077f37\") " Mar 13 15:27:39 crc kubenswrapper[4898]: I0313 15:27:39.611702 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8j8z\" (UniqueName: \"kubernetes.io/projected/7115acf9-09f0-41e8-995a-d6179b077f37-kube-api-access-x8j8z\") pod \"7115acf9-09f0-41e8-995a-d6179b077f37\" (UID: \"7115acf9-09f0-41e8-995a-d6179b077f37\") " Mar 13 15:27:39 crc kubenswrapper[4898]: I0313 15:27:39.611750 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7115acf9-09f0-41e8-995a-d6179b077f37-catalog-content\") pod \"7115acf9-09f0-41e8-995a-d6179b077f37\" (UID: \"7115acf9-09f0-41e8-995a-d6179b077f37\") " Mar 13 15:27:39 crc kubenswrapper[4898]: I0313 15:27:39.612746 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7115acf9-09f0-41e8-995a-d6179b077f37-utilities" (OuterVolumeSpecName: "utilities") pod "7115acf9-09f0-41e8-995a-d6179b077f37" (UID: "7115acf9-09f0-41e8-995a-d6179b077f37"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:27:39 crc kubenswrapper[4898]: I0313 15:27:39.619203 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7115acf9-09f0-41e8-995a-d6179b077f37-kube-api-access-x8j8z" (OuterVolumeSpecName: "kube-api-access-x8j8z") pod "7115acf9-09f0-41e8-995a-d6179b077f37" (UID: "7115acf9-09f0-41e8-995a-d6179b077f37"). InnerVolumeSpecName "kube-api-access-x8j8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:27:39 crc kubenswrapper[4898]: I0313 15:27:39.665073 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7115acf9-09f0-41e8-995a-d6179b077f37-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7115acf9-09f0-41e8-995a-d6179b077f37" (UID: "7115acf9-09f0-41e8-995a-d6179b077f37"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:27:39 crc kubenswrapper[4898]: I0313 15:27:39.715076 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7115acf9-09f0-41e8-995a-d6179b077f37-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:39 crc kubenswrapper[4898]: I0313 15:27:39.715604 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8j8z\" (UniqueName: \"kubernetes.io/projected/7115acf9-09f0-41e8-995a-d6179b077f37-kube-api-access-x8j8z\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:39 crc kubenswrapper[4898]: I0313 15:27:39.715678 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7115acf9-09f0-41e8-995a-d6179b077f37-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 15:27:40 crc kubenswrapper[4898]: I0313 15:27:40.016104 4898 generic.go:334] "Generic (PLEG): container finished" podID="7115acf9-09f0-41e8-995a-d6179b077f37" 
containerID="446a41b05d145e4e858c2243a1f4e5bd6917418930f03e0e56941e08e147a165" exitCode=0 Mar 13 15:27:40 crc kubenswrapper[4898]: I0313 15:27:40.016199 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2tg8f" event={"ID":"7115acf9-09f0-41e8-995a-d6179b077f37","Type":"ContainerDied","Data":"446a41b05d145e4e858c2243a1f4e5bd6917418930f03e0e56941e08e147a165"} Mar 13 15:27:40 crc kubenswrapper[4898]: I0313 15:27:40.016253 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2tg8f" event={"ID":"7115acf9-09f0-41e8-995a-d6179b077f37","Type":"ContainerDied","Data":"e72ff055139d18ced04dc4a34b3ef184c3ee1412b5fe2f74149d60e4602be818"} Mar 13 15:27:40 crc kubenswrapper[4898]: I0313 15:27:40.016291 4898 scope.go:117] "RemoveContainer" containerID="446a41b05d145e4e858c2243a1f4e5bd6917418930f03e0e56941e08e147a165" Mar 13 15:27:40 crc kubenswrapper[4898]: I0313 15:27:40.016432 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2tg8f" Mar 13 15:27:40 crc kubenswrapper[4898]: I0313 15:27:40.047652 4898 scope.go:117] "RemoveContainer" containerID="f12cd3290c44cc1be0f9eb2d7479c249511acdb1fcd8803dde3c5b2a04e04188" Mar 13 15:27:40 crc kubenswrapper[4898]: I0313 15:27:40.065074 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2tg8f"] Mar 13 15:27:40 crc kubenswrapper[4898]: I0313 15:27:40.075038 4898 scope.go:117] "RemoveContainer" containerID="0666265fc78810dbefb33bbd5683f8cd5f4ecd31ff868744637cab3d9d6e5c57" Mar 13 15:27:40 crc kubenswrapper[4898]: I0313 15:27:40.087559 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2tg8f"] Mar 13 15:27:40 crc kubenswrapper[4898]: I0313 15:27:40.138094 4898 scope.go:117] "RemoveContainer" containerID="446a41b05d145e4e858c2243a1f4e5bd6917418930f03e0e56941e08e147a165" Mar 13 15:27:40 crc kubenswrapper[4898]: E0313 15:27:40.139710 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"446a41b05d145e4e858c2243a1f4e5bd6917418930f03e0e56941e08e147a165\": container with ID starting with 446a41b05d145e4e858c2243a1f4e5bd6917418930f03e0e56941e08e147a165 not found: ID does not exist" containerID="446a41b05d145e4e858c2243a1f4e5bd6917418930f03e0e56941e08e147a165" Mar 13 15:27:40 crc kubenswrapper[4898]: I0313 15:27:40.139743 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"446a41b05d145e4e858c2243a1f4e5bd6917418930f03e0e56941e08e147a165"} err="failed to get container status \"446a41b05d145e4e858c2243a1f4e5bd6917418930f03e0e56941e08e147a165\": rpc error: code = NotFound desc = could not find container \"446a41b05d145e4e858c2243a1f4e5bd6917418930f03e0e56941e08e147a165\": container with ID starting with 446a41b05d145e4e858c2243a1f4e5bd6917418930f03e0e56941e08e147a165 not 
found: ID does not exist" Mar 13 15:27:40 crc kubenswrapper[4898]: I0313 15:27:40.139783 4898 scope.go:117] "RemoveContainer" containerID="f12cd3290c44cc1be0f9eb2d7479c249511acdb1fcd8803dde3c5b2a04e04188" Mar 13 15:27:40 crc kubenswrapper[4898]: E0313 15:27:40.140444 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f12cd3290c44cc1be0f9eb2d7479c249511acdb1fcd8803dde3c5b2a04e04188\": container with ID starting with f12cd3290c44cc1be0f9eb2d7479c249511acdb1fcd8803dde3c5b2a04e04188 not found: ID does not exist" containerID="f12cd3290c44cc1be0f9eb2d7479c249511acdb1fcd8803dde3c5b2a04e04188" Mar 13 15:27:40 crc kubenswrapper[4898]: I0313 15:27:40.140505 4898 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f12cd3290c44cc1be0f9eb2d7479c249511acdb1fcd8803dde3c5b2a04e04188"} err="failed to get container status \"f12cd3290c44cc1be0f9eb2d7479c249511acdb1fcd8803dde3c5b2a04e04188\": rpc error: code = NotFound desc = could not find container \"f12cd3290c44cc1be0f9eb2d7479c249511acdb1fcd8803dde3c5b2a04e04188\": container with ID starting with f12cd3290c44cc1be0f9eb2d7479c249511acdb1fcd8803dde3c5b2a04e04188 not found: ID does not exist" Mar 13 15:27:40 crc kubenswrapper[4898]: I0313 15:27:40.140540 4898 scope.go:117] "RemoveContainer" containerID="0666265fc78810dbefb33bbd5683f8cd5f4ecd31ff868744637cab3d9d6e5c57" Mar 13 15:27:40 crc kubenswrapper[4898]: E0313 15:27:40.140877 4898 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0666265fc78810dbefb33bbd5683f8cd5f4ecd31ff868744637cab3d9d6e5c57\": container with ID starting with 0666265fc78810dbefb33bbd5683f8cd5f4ecd31ff868744637cab3d9d6e5c57 not found: ID does not exist" containerID="0666265fc78810dbefb33bbd5683f8cd5f4ecd31ff868744637cab3d9d6e5c57" Mar 13 15:27:40 crc kubenswrapper[4898]: I0313 15:27:40.140933 4898 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0666265fc78810dbefb33bbd5683f8cd5f4ecd31ff868744637cab3d9d6e5c57"} err="failed to get container status \"0666265fc78810dbefb33bbd5683f8cd5f4ecd31ff868744637cab3d9d6e5c57\": rpc error: code = NotFound desc = could not find container \"0666265fc78810dbefb33bbd5683f8cd5f4ecd31ff868744637cab3d9d6e5c57\": container with ID starting with 0666265fc78810dbefb33bbd5683f8cd5f4ecd31ff868744637cab3d9d6e5c57 not found: ID does not exist" Mar 13 15:27:41 crc kubenswrapper[4898]: I0313 15:27:41.755078 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7115acf9-09f0-41e8-995a-d6179b077f37" path="/var/lib/kubelet/pods/7115acf9-09f0-41e8-995a-d6179b077f37/volumes" Mar 13 15:27:49 crc kubenswrapper[4898]: I0313 15:27:49.134978 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 15:27:49 crc kubenswrapper[4898]: I0313 15:27:49.135499 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:27:57 crc kubenswrapper[4898]: I0313 15:27:57.228792 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dbz9v"] Mar 13 15:27:57 crc kubenswrapper[4898]: E0313 15:27:57.229745 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7115acf9-09f0-41e8-995a-d6179b077f37" containerName="registry-server" Mar 13 15:27:57 crc kubenswrapper[4898]: I0313 15:27:57.229757 4898 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="7115acf9-09f0-41e8-995a-d6179b077f37" containerName="registry-server" Mar 13 15:27:57 crc kubenswrapper[4898]: E0313 15:27:57.229792 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7115acf9-09f0-41e8-995a-d6179b077f37" containerName="extract-content" Mar 13 15:27:57 crc kubenswrapper[4898]: I0313 15:27:57.229798 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="7115acf9-09f0-41e8-995a-d6179b077f37" containerName="extract-content" Mar 13 15:27:57 crc kubenswrapper[4898]: E0313 15:27:57.229813 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7115acf9-09f0-41e8-995a-d6179b077f37" containerName="extract-utilities" Mar 13 15:27:57 crc kubenswrapper[4898]: I0313 15:27:57.229819 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="7115acf9-09f0-41e8-995a-d6179b077f37" containerName="extract-utilities" Mar 13 15:27:57 crc kubenswrapper[4898]: I0313 15:27:57.230077 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="7115acf9-09f0-41e8-995a-d6179b077f37" containerName="registry-server" Mar 13 15:27:57 crc kubenswrapper[4898]: I0313 15:27:57.231688 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dbz9v" Mar 13 15:27:57 crc kubenswrapper[4898]: I0313 15:27:57.248152 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dbz9v"] Mar 13 15:27:57 crc kubenswrapper[4898]: I0313 15:27:57.365831 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb8014b4-2211-4ce2-93ec-3a496a563a8c-catalog-content\") pod \"redhat-operators-dbz9v\" (UID: \"eb8014b4-2211-4ce2-93ec-3a496a563a8c\") " pod="openshift-marketplace/redhat-operators-dbz9v" Mar 13 15:27:57 crc kubenswrapper[4898]: I0313 15:27:57.366290 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7trwd\" (UniqueName: \"kubernetes.io/projected/eb8014b4-2211-4ce2-93ec-3a496a563a8c-kube-api-access-7trwd\") pod \"redhat-operators-dbz9v\" (UID: \"eb8014b4-2211-4ce2-93ec-3a496a563a8c\") " pod="openshift-marketplace/redhat-operators-dbz9v" Mar 13 15:27:57 crc kubenswrapper[4898]: I0313 15:27:57.366419 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb8014b4-2211-4ce2-93ec-3a496a563a8c-utilities\") pod \"redhat-operators-dbz9v\" (UID: \"eb8014b4-2211-4ce2-93ec-3a496a563a8c\") " pod="openshift-marketplace/redhat-operators-dbz9v" Mar 13 15:27:57 crc kubenswrapper[4898]: I0313 15:27:57.468574 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7trwd\" (UniqueName: \"kubernetes.io/projected/eb8014b4-2211-4ce2-93ec-3a496a563a8c-kube-api-access-7trwd\") pod \"redhat-operators-dbz9v\" (UID: \"eb8014b4-2211-4ce2-93ec-3a496a563a8c\") " pod="openshift-marketplace/redhat-operators-dbz9v" Mar 13 15:27:57 crc kubenswrapper[4898]: I0313 15:27:57.469015 4898 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb8014b4-2211-4ce2-93ec-3a496a563a8c-utilities\") pod \"redhat-operators-dbz9v\" (UID: \"eb8014b4-2211-4ce2-93ec-3a496a563a8c\") " pod="openshift-marketplace/redhat-operators-dbz9v" Mar 13 15:27:57 crc kubenswrapper[4898]: I0313 15:27:57.469185 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb8014b4-2211-4ce2-93ec-3a496a563a8c-catalog-content\") pod \"redhat-operators-dbz9v\" (UID: \"eb8014b4-2211-4ce2-93ec-3a496a563a8c\") " pod="openshift-marketplace/redhat-operators-dbz9v" Mar 13 15:27:57 crc kubenswrapper[4898]: I0313 15:27:57.469619 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb8014b4-2211-4ce2-93ec-3a496a563a8c-utilities\") pod \"redhat-operators-dbz9v\" (UID: \"eb8014b4-2211-4ce2-93ec-3a496a563a8c\") " pod="openshift-marketplace/redhat-operators-dbz9v" Mar 13 15:27:57 crc kubenswrapper[4898]: I0313 15:27:57.469626 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb8014b4-2211-4ce2-93ec-3a496a563a8c-catalog-content\") pod \"redhat-operators-dbz9v\" (UID: \"eb8014b4-2211-4ce2-93ec-3a496a563a8c\") " pod="openshift-marketplace/redhat-operators-dbz9v" Mar 13 15:27:57 crc kubenswrapper[4898]: I0313 15:27:57.487279 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7trwd\" (UniqueName: \"kubernetes.io/projected/eb8014b4-2211-4ce2-93ec-3a496a563a8c-kube-api-access-7trwd\") pod \"redhat-operators-dbz9v\" (UID: \"eb8014b4-2211-4ce2-93ec-3a496a563a8c\") " pod="openshift-marketplace/redhat-operators-dbz9v" Mar 13 15:27:57 crc kubenswrapper[4898]: I0313 15:27:57.564516 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dbz9v" Mar 13 15:27:58 crc kubenswrapper[4898]: I0313 15:27:58.077129 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dbz9v"] Mar 13 15:27:58 crc kubenswrapper[4898]: I0313 15:27:58.228510 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dbz9v" event={"ID":"eb8014b4-2211-4ce2-93ec-3a496a563a8c","Type":"ContainerStarted","Data":"dfcf7ad7032f43188bd7aba9617ccc0074fb1fd0e3010930b63236240ebde712"} Mar 13 15:27:59 crc kubenswrapper[4898]: I0313 15:27:59.240847 4898 generic.go:334] "Generic (PLEG): container finished" podID="eb8014b4-2211-4ce2-93ec-3a496a563a8c" containerID="b1bdcb936ed1f82a907dadb3f319631253f8ad7fb9f25e178be06f7c87b0893e" exitCode=0 Mar 13 15:27:59 crc kubenswrapper[4898]: I0313 15:27:59.241004 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dbz9v" event={"ID":"eb8014b4-2211-4ce2-93ec-3a496a563a8c","Type":"ContainerDied","Data":"b1bdcb936ed1f82a907dadb3f319631253f8ad7fb9f25e178be06f7c87b0893e"} Mar 13 15:28:00 crc kubenswrapper[4898]: I0313 15:28:00.146345 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556928-wqxh8"] Mar 13 15:28:00 crc kubenswrapper[4898]: I0313 15:28:00.148524 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556928-wqxh8" Mar 13 15:28:00 crc kubenswrapper[4898]: I0313 15:28:00.151301 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:28:00 crc kubenswrapper[4898]: I0313 15:28:00.152226 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:28:00 crc kubenswrapper[4898]: I0313 15:28:00.154971 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 15:28:00 crc kubenswrapper[4898]: I0313 15:28:00.167448 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556928-wqxh8"] Mar 13 15:28:00 crc kubenswrapper[4898]: I0313 15:28:00.239501 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj97x\" (UniqueName: \"kubernetes.io/projected/9ff3a636-04a3-4ec5-8cd2-da3adf44d084-kube-api-access-sj97x\") pod \"auto-csr-approver-29556928-wqxh8\" (UID: \"9ff3a636-04a3-4ec5-8cd2-da3adf44d084\") " pod="openshift-infra/auto-csr-approver-29556928-wqxh8" Mar 13 15:28:00 crc kubenswrapper[4898]: I0313 15:28:00.341535 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj97x\" (UniqueName: \"kubernetes.io/projected/9ff3a636-04a3-4ec5-8cd2-da3adf44d084-kube-api-access-sj97x\") pod \"auto-csr-approver-29556928-wqxh8\" (UID: \"9ff3a636-04a3-4ec5-8cd2-da3adf44d084\") " pod="openshift-infra/auto-csr-approver-29556928-wqxh8" Mar 13 15:28:00 crc kubenswrapper[4898]: I0313 15:28:00.365727 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj97x\" (UniqueName: \"kubernetes.io/projected/9ff3a636-04a3-4ec5-8cd2-da3adf44d084-kube-api-access-sj97x\") pod \"auto-csr-approver-29556928-wqxh8\" (UID: \"9ff3a636-04a3-4ec5-8cd2-da3adf44d084\") " 
pod="openshift-infra/auto-csr-approver-29556928-wqxh8" Mar 13 15:28:00 crc kubenswrapper[4898]: I0313 15:28:00.477176 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556928-wqxh8" Mar 13 15:28:01 crc kubenswrapper[4898]: I0313 15:28:01.028278 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556928-wqxh8"] Mar 13 15:28:01 crc kubenswrapper[4898]: W0313 15:28:01.030450 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ff3a636_04a3_4ec5_8cd2_da3adf44d084.slice/crio-08e435b1c49eb97b8b6176e1d1c11c3f78d0f623f2acc8d03a798cc850afbfef WatchSource:0}: Error finding container 08e435b1c49eb97b8b6176e1d1c11c3f78d0f623f2acc8d03a798cc850afbfef: Status 404 returned error can't find the container with id 08e435b1c49eb97b8b6176e1d1c11c3f78d0f623f2acc8d03a798cc850afbfef Mar 13 15:28:01 crc kubenswrapper[4898]: I0313 15:28:01.266024 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556928-wqxh8" event={"ID":"9ff3a636-04a3-4ec5-8cd2-da3adf44d084","Type":"ContainerStarted","Data":"08e435b1c49eb97b8b6176e1d1c11c3f78d0f623f2acc8d03a798cc850afbfef"} Mar 13 15:28:01 crc kubenswrapper[4898]: I0313 15:28:01.268552 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dbz9v" event={"ID":"eb8014b4-2211-4ce2-93ec-3a496a563a8c","Type":"ContainerStarted","Data":"b15545ebc323988813650859c1417c533406bfededaf18c21507a8ec9723c2af"} Mar 13 15:28:03 crc kubenswrapper[4898]: I0313 15:28:03.295974 4898 generic.go:334] "Generic (PLEG): container finished" podID="9ff3a636-04a3-4ec5-8cd2-da3adf44d084" containerID="64afab621e156319ee19f6016510705ea0ece1835f0875cb7b018c10679eac40" exitCode=0 Mar 13 15:28:03 crc kubenswrapper[4898]: I0313 15:28:03.296379 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29556928-wqxh8" event={"ID":"9ff3a636-04a3-4ec5-8cd2-da3adf44d084","Type":"ContainerDied","Data":"64afab621e156319ee19f6016510705ea0ece1835f0875cb7b018c10679eac40"} Mar 13 15:28:04 crc kubenswrapper[4898]: I0313 15:28:04.868012 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556928-wqxh8" Mar 13 15:28:04 crc kubenswrapper[4898]: I0313 15:28:04.971768 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sj97x\" (UniqueName: \"kubernetes.io/projected/9ff3a636-04a3-4ec5-8cd2-da3adf44d084-kube-api-access-sj97x\") pod \"9ff3a636-04a3-4ec5-8cd2-da3adf44d084\" (UID: \"9ff3a636-04a3-4ec5-8cd2-da3adf44d084\") " Mar 13 15:28:04 crc kubenswrapper[4898]: I0313 15:28:04.980295 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ff3a636-04a3-4ec5-8cd2-da3adf44d084-kube-api-access-sj97x" (OuterVolumeSpecName: "kube-api-access-sj97x") pod "9ff3a636-04a3-4ec5-8cd2-da3adf44d084" (UID: "9ff3a636-04a3-4ec5-8cd2-da3adf44d084"). InnerVolumeSpecName "kube-api-access-sj97x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:28:05 crc kubenswrapper[4898]: I0313 15:28:05.075569 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sj97x\" (UniqueName: \"kubernetes.io/projected/9ff3a636-04a3-4ec5-8cd2-da3adf44d084-kube-api-access-sj97x\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:05 crc kubenswrapper[4898]: I0313 15:28:05.335132 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556928-wqxh8" event={"ID":"9ff3a636-04a3-4ec5-8cd2-da3adf44d084","Type":"ContainerDied","Data":"08e435b1c49eb97b8b6176e1d1c11c3f78d0f623f2acc8d03a798cc850afbfef"} Mar 13 15:28:05 crc kubenswrapper[4898]: I0313 15:28:05.335170 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08e435b1c49eb97b8b6176e1d1c11c3f78d0f623f2acc8d03a798cc850afbfef" Mar 13 15:28:05 crc kubenswrapper[4898]: I0313 15:28:05.335205 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556928-wqxh8" Mar 13 15:28:05 crc kubenswrapper[4898]: I0313 15:28:05.943165 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556922-p8rbd"] Mar 13 15:28:05 crc kubenswrapper[4898]: I0313 15:28:05.954600 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556922-p8rbd"] Mar 13 15:28:07 crc kubenswrapper[4898]: I0313 15:28:07.751546 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af39c392-cb6d-4afc-837c-9cbf245a9856" path="/var/lib/kubelet/pods/af39c392-cb6d-4afc-837c-9cbf245a9856/volumes" Mar 13 15:28:08 crc kubenswrapper[4898]: I0313 15:28:08.375576 4898 generic.go:334] "Generic (PLEG): container finished" podID="eb8014b4-2211-4ce2-93ec-3a496a563a8c" containerID="b15545ebc323988813650859c1417c533406bfededaf18c21507a8ec9723c2af" exitCode=0 Mar 13 15:28:08 crc kubenswrapper[4898]: I0313 15:28:08.375660 4898 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dbz9v" event={"ID":"eb8014b4-2211-4ce2-93ec-3a496a563a8c","Type":"ContainerDied","Data":"b15545ebc323988813650859c1417c533406bfededaf18c21507a8ec9723c2af"} Mar 13 15:28:10 crc kubenswrapper[4898]: I0313 15:28:10.399518 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dbz9v" event={"ID":"eb8014b4-2211-4ce2-93ec-3a496a563a8c","Type":"ContainerStarted","Data":"501957783fc2b9643d6e59dc7c7a2674654e9caa4d3ce18aa694243b54940186"} Mar 13 15:28:10 crc kubenswrapper[4898]: I0313 15:28:10.428943 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dbz9v" podStartSLOduration=3.545501042 podStartE2EDuration="13.428911525s" podCreationTimestamp="2026-03-13 15:27:57 +0000 UTC" firstStartedPulling="2026-03-13 15:27:59.243574369 +0000 UTC m=+5514.245162608" lastFinishedPulling="2026-03-13 15:28:09.126984852 +0000 UTC m=+5524.128573091" observedRunningTime="2026-03-13 15:28:10.418841563 +0000 UTC m=+5525.420429802" watchObservedRunningTime="2026-03-13 15:28:10.428911525 +0000 UTC m=+5525.430499764" Mar 13 15:28:17 crc kubenswrapper[4898]: I0313 15:28:17.565521 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dbz9v" Mar 13 15:28:17 crc kubenswrapper[4898]: I0313 15:28:17.566161 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dbz9v" Mar 13 15:28:18 crc kubenswrapper[4898]: I0313 15:28:18.661969 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dbz9v" podUID="eb8014b4-2211-4ce2-93ec-3a496a563a8c" containerName="registry-server" probeResult="failure" output=< Mar 13 15:28:18 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:28:18 crc kubenswrapper[4898]: > Mar 13 
15:28:19 crc kubenswrapper[4898]: I0313 15:28:19.134237 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 15:28:19 crc kubenswrapper[4898]: I0313 15:28:19.134722 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:28:28 crc kubenswrapper[4898]: I0313 15:28:28.627033 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dbz9v" podUID="eb8014b4-2211-4ce2-93ec-3a496a563a8c" containerName="registry-server" probeResult="failure" output=< Mar 13 15:28:28 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:28:28 crc kubenswrapper[4898]: > Mar 13 15:28:38 crc kubenswrapper[4898]: I0313 15:28:38.624822 4898 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dbz9v" podUID="eb8014b4-2211-4ce2-93ec-3a496a563a8c" containerName="registry-server" probeResult="failure" output=< Mar 13 15:28:38 crc kubenswrapper[4898]: timeout: failed to connect service ":50051" within 1s Mar 13 15:28:38 crc kubenswrapper[4898]: > Mar 13 15:28:47 crc kubenswrapper[4898]: I0313 15:28:47.636050 4898 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dbz9v" Mar 13 15:28:47 crc kubenswrapper[4898]: I0313 15:28:47.699004 4898 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dbz9v" Mar 13 15:28:49 crc kubenswrapper[4898]: I0313 
15:28:49.134381 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 15:28:49 crc kubenswrapper[4898]: I0313 15:28:49.134687 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:28:49 crc kubenswrapper[4898]: I0313 15:28:49.134741 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 15:28:49 crc kubenswrapper[4898]: I0313 15:28:49.135725 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3f1eb02ceee77301060a512e8cdb70aa1cab3b74525898dd1df91250d09e1006"} pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 15:28:49 crc kubenswrapper[4898]: I0313 15:28:49.135795 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" containerID="cri-o://3f1eb02ceee77301060a512e8cdb70aa1cab3b74525898dd1df91250d09e1006" gracePeriod=600 Mar 13 15:28:49 crc kubenswrapper[4898]: I0313 15:28:49.695834 4898 scope.go:117] "RemoveContainer" containerID="038b2fc06591bca90016e0206f4da45f7623dddf7ed529280975231bc7adf587" Mar 13 15:28:49 crc kubenswrapper[4898]: I0313 15:28:49.858891 4898 generic.go:334] 
"Generic (PLEG): container finished" podID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerID="3f1eb02ceee77301060a512e8cdb70aa1cab3b74525898dd1df91250d09e1006" exitCode=0 Mar 13 15:28:49 crc kubenswrapper[4898]: I0313 15:28:49.858931 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerDied","Data":"3f1eb02ceee77301060a512e8cdb70aa1cab3b74525898dd1df91250d09e1006"} Mar 13 15:28:49 crc kubenswrapper[4898]: I0313 15:28:49.858984 4898 scope.go:117] "RemoveContainer" containerID="e40bd227e881ba459a425e50fe1c2e2837377f3121675fc9f55de4fe34577668" Mar 13 15:28:50 crc kubenswrapper[4898]: I0313 15:28:50.872786 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerStarted","Data":"cee6ecdff3780e9dd70ed8f417a9b0b2ef52de112ea7b7ca374180dcff44bba7"} Mar 13 15:28:52 crc kubenswrapper[4898]: I0313 15:28:52.661761 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dbz9v"] Mar 13 15:28:52 crc kubenswrapper[4898]: I0313 15:28:52.663434 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dbz9v" podUID="eb8014b4-2211-4ce2-93ec-3a496a563a8c" containerName="registry-server" containerID="cri-o://501957783fc2b9643d6e59dc7c7a2674654e9caa4d3ce18aa694243b54940186" gracePeriod=2 Mar 13 15:28:53 crc kubenswrapper[4898]: I0313 15:28:53.909834 4898 generic.go:334] "Generic (PLEG): container finished" podID="eb8014b4-2211-4ce2-93ec-3a496a563a8c" containerID="501957783fc2b9643d6e59dc7c7a2674654e9caa4d3ce18aa694243b54940186" exitCode=0 Mar 13 15:28:53 crc kubenswrapper[4898]: I0313 15:28:53.909974 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dbz9v" 
event={"ID":"eb8014b4-2211-4ce2-93ec-3a496a563a8c","Type":"ContainerDied","Data":"501957783fc2b9643d6e59dc7c7a2674654e9caa4d3ce18aa694243b54940186"} Mar 13 15:28:54 crc kubenswrapper[4898]: I0313 15:28:54.766210 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dbz9v" Mar 13 15:28:54 crc kubenswrapper[4898]: I0313 15:28:54.917870 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb8014b4-2211-4ce2-93ec-3a496a563a8c-utilities\") pod \"eb8014b4-2211-4ce2-93ec-3a496a563a8c\" (UID: \"eb8014b4-2211-4ce2-93ec-3a496a563a8c\") " Mar 13 15:28:54 crc kubenswrapper[4898]: I0313 15:28:54.918695 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb8014b4-2211-4ce2-93ec-3a496a563a8c-utilities" (OuterVolumeSpecName: "utilities") pod "eb8014b4-2211-4ce2-93ec-3a496a563a8c" (UID: "eb8014b4-2211-4ce2-93ec-3a496a563a8c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:28:54 crc kubenswrapper[4898]: I0313 15:28:54.919046 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb8014b4-2211-4ce2-93ec-3a496a563a8c-catalog-content\") pod \"eb8014b4-2211-4ce2-93ec-3a496a563a8c\" (UID: \"eb8014b4-2211-4ce2-93ec-3a496a563a8c\") " Mar 13 15:28:54 crc kubenswrapper[4898]: I0313 15:28:54.920136 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7trwd\" (UniqueName: \"kubernetes.io/projected/eb8014b4-2211-4ce2-93ec-3a496a563a8c-kube-api-access-7trwd\") pod \"eb8014b4-2211-4ce2-93ec-3a496a563a8c\" (UID: \"eb8014b4-2211-4ce2-93ec-3a496a563a8c\") " Mar 13 15:28:54 crc kubenswrapper[4898]: I0313 15:28:54.921723 4898 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb8014b4-2211-4ce2-93ec-3a496a563a8c-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:54 crc kubenswrapper[4898]: I0313 15:28:54.923951 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dbz9v" event={"ID":"eb8014b4-2211-4ce2-93ec-3a496a563a8c","Type":"ContainerDied","Data":"dfcf7ad7032f43188bd7aba9617ccc0074fb1fd0e3010930b63236240ebde712"} Mar 13 15:28:54 crc kubenswrapper[4898]: I0313 15:28:54.924003 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dbz9v" Mar 13 15:28:54 crc kubenswrapper[4898]: I0313 15:28:54.924066 4898 scope.go:117] "RemoveContainer" containerID="501957783fc2b9643d6e59dc7c7a2674654e9caa4d3ce18aa694243b54940186" Mar 13 15:28:54 crc kubenswrapper[4898]: I0313 15:28:54.936446 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb8014b4-2211-4ce2-93ec-3a496a563a8c-kube-api-access-7trwd" (OuterVolumeSpecName: "kube-api-access-7trwd") pod "eb8014b4-2211-4ce2-93ec-3a496a563a8c" (UID: "eb8014b4-2211-4ce2-93ec-3a496a563a8c"). InnerVolumeSpecName "kube-api-access-7trwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:28:54 crc kubenswrapper[4898]: I0313 15:28:54.946618 4898 scope.go:117] "RemoveContainer" containerID="b15545ebc323988813650859c1417c533406bfededaf18c21507a8ec9723c2af" Mar 13 15:28:55 crc kubenswrapper[4898]: I0313 15:28:55.003446 4898 scope.go:117] "RemoveContainer" containerID="b1bdcb936ed1f82a907dadb3f319631253f8ad7fb9f25e178be06f7c87b0893e" Mar 13 15:28:55 crc kubenswrapper[4898]: I0313 15:28:55.025279 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7trwd\" (UniqueName: \"kubernetes.io/projected/eb8014b4-2211-4ce2-93ec-3a496a563a8c-kube-api-access-7trwd\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:55 crc kubenswrapper[4898]: I0313 15:28:55.068362 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb8014b4-2211-4ce2-93ec-3a496a563a8c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eb8014b4-2211-4ce2-93ec-3a496a563a8c" (UID: "eb8014b4-2211-4ce2-93ec-3a496a563a8c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 15:28:55 crc kubenswrapper[4898]: I0313 15:28:55.128459 4898 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb8014b4-2211-4ce2-93ec-3a496a563a8c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 15:28:55 crc kubenswrapper[4898]: I0313 15:28:55.270429 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dbz9v"] Mar 13 15:28:55 crc kubenswrapper[4898]: I0313 15:28:55.283580 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dbz9v"] Mar 13 15:28:55 crc kubenswrapper[4898]: I0313 15:28:55.753158 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb8014b4-2211-4ce2-93ec-3a496a563a8c" path="/var/lib/kubelet/pods/eb8014b4-2211-4ce2-93ec-3a496a563a8c/volumes" Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.151869 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556930-f8ms7"] Mar 13 15:30:00 crc kubenswrapper[4898]: E0313 15:30:00.153076 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb8014b4-2211-4ce2-93ec-3a496a563a8c" containerName="extract-content" Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.153095 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb8014b4-2211-4ce2-93ec-3a496a563a8c" containerName="extract-content" Mar 13 15:30:00 crc kubenswrapper[4898]: E0313 15:30:00.153128 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb8014b4-2211-4ce2-93ec-3a496a563a8c" containerName="registry-server" Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.153137 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb8014b4-2211-4ce2-93ec-3a496a563a8c" containerName="registry-server" Mar 13 15:30:00 crc kubenswrapper[4898]: E0313 15:30:00.153165 4898 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="9ff3a636-04a3-4ec5-8cd2-da3adf44d084" containerName="oc" Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.153174 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ff3a636-04a3-4ec5-8cd2-da3adf44d084" containerName="oc" Mar 13 15:30:00 crc kubenswrapper[4898]: E0313 15:30:00.153190 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb8014b4-2211-4ce2-93ec-3a496a563a8c" containerName="extract-utilities" Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.153198 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb8014b4-2211-4ce2-93ec-3a496a563a8c" containerName="extract-utilities" Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.153575 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb8014b4-2211-4ce2-93ec-3a496a563a8c" containerName="registry-server" Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.153632 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ff3a636-04a3-4ec5-8cd2-da3adf44d084" containerName="oc" Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.157279 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556930-f8ms7" Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.159910 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.160060 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.160199 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.167659 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556930-z4l5f"] Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.170495 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-z4l5f" Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.174363 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.174537 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.182157 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556930-f8ms7"] Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.190067 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556930-z4l5f"] Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.249699 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqq69\" (UniqueName: 
\"kubernetes.io/projected/98cb3d53-de77-4344-8045-41653ba912a9-kube-api-access-bqq69\") pod \"auto-csr-approver-29556930-f8ms7\" (UID: \"98cb3d53-de77-4344-8045-41653ba912a9\") " pod="openshift-infra/auto-csr-approver-29556930-f8ms7" Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.351386 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqq69\" (UniqueName: \"kubernetes.io/projected/98cb3d53-de77-4344-8045-41653ba912a9-kube-api-access-bqq69\") pod \"auto-csr-approver-29556930-f8ms7\" (UID: \"98cb3d53-de77-4344-8045-41653ba912a9\") " pod="openshift-infra/auto-csr-approver-29556930-f8ms7" Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.351517 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11e4bb07-0526-42fe-80c5-6bed7db79d16-secret-volume\") pod \"collect-profiles-29556930-z4l5f\" (UID: \"11e4bb07-0526-42fe-80c5-6bed7db79d16\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-z4l5f" Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.351567 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwckx\" (UniqueName: \"kubernetes.io/projected/11e4bb07-0526-42fe-80c5-6bed7db79d16-kube-api-access-qwckx\") pod \"collect-profiles-29556930-z4l5f\" (UID: \"11e4bb07-0526-42fe-80c5-6bed7db79d16\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-z4l5f" Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.351691 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11e4bb07-0526-42fe-80c5-6bed7db79d16-config-volume\") pod \"collect-profiles-29556930-z4l5f\" (UID: \"11e4bb07-0526-42fe-80c5-6bed7db79d16\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-z4l5f" Mar 13 15:30:00 
crc kubenswrapper[4898]: I0313 15:30:00.377015 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqq69\" (UniqueName: \"kubernetes.io/projected/98cb3d53-de77-4344-8045-41653ba912a9-kube-api-access-bqq69\") pod \"auto-csr-approver-29556930-f8ms7\" (UID: \"98cb3d53-de77-4344-8045-41653ba912a9\") " pod="openshift-infra/auto-csr-approver-29556930-f8ms7" Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.454417 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11e4bb07-0526-42fe-80c5-6bed7db79d16-config-volume\") pod \"collect-profiles-29556930-z4l5f\" (UID: \"11e4bb07-0526-42fe-80c5-6bed7db79d16\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-z4l5f" Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.454655 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11e4bb07-0526-42fe-80c5-6bed7db79d16-secret-volume\") pod \"collect-profiles-29556930-z4l5f\" (UID: \"11e4bb07-0526-42fe-80c5-6bed7db79d16\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-z4l5f" Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.454720 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwckx\" (UniqueName: \"kubernetes.io/projected/11e4bb07-0526-42fe-80c5-6bed7db79d16-kube-api-access-qwckx\") pod \"collect-profiles-29556930-z4l5f\" (UID: \"11e4bb07-0526-42fe-80c5-6bed7db79d16\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-z4l5f" Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.455246 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11e4bb07-0526-42fe-80c5-6bed7db79d16-config-volume\") pod \"collect-profiles-29556930-z4l5f\" (UID: 
\"11e4bb07-0526-42fe-80c5-6bed7db79d16\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-z4l5f" Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.459961 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11e4bb07-0526-42fe-80c5-6bed7db79d16-secret-volume\") pod \"collect-profiles-29556930-z4l5f\" (UID: \"11e4bb07-0526-42fe-80c5-6bed7db79d16\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-z4l5f" Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.470317 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwckx\" (UniqueName: \"kubernetes.io/projected/11e4bb07-0526-42fe-80c5-6bed7db79d16-kube-api-access-qwckx\") pod \"collect-profiles-29556930-z4l5f\" (UID: \"11e4bb07-0526-42fe-80c5-6bed7db79d16\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-z4l5f" Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.484266 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556930-f8ms7" Mar 13 15:30:00 crc kubenswrapper[4898]: I0313 15:30:00.495420 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-z4l5f" Mar 13 15:30:01 crc kubenswrapper[4898]: I0313 15:30:01.014244 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556930-f8ms7"] Mar 13 15:30:01 crc kubenswrapper[4898]: W0313 15:30:01.014806 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11e4bb07_0526_42fe_80c5_6bed7db79d16.slice/crio-85621c637bdcd5e55220cee87918a755a34817d2327b4fcbb1244b7a5704b341 WatchSource:0}: Error finding container 85621c637bdcd5e55220cee87918a755a34817d2327b4fcbb1244b7a5704b341: Status 404 returned error can't find the container with id 85621c637bdcd5e55220cee87918a755a34817d2327b4fcbb1244b7a5704b341 Mar 13 15:30:01 crc kubenswrapper[4898]: I0313 15:30:01.028505 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556930-z4l5f"] Mar 13 15:30:01 crc kubenswrapper[4898]: I0313 15:30:01.786244 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-z4l5f" event={"ID":"11e4bb07-0526-42fe-80c5-6bed7db79d16","Type":"ContainerStarted","Data":"a25345c405dfefa76e519a8784a708d6344c4490388c27312a458e7e659dc8c9"} Mar 13 15:30:01 crc kubenswrapper[4898]: I0313 15:30:01.786525 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-z4l5f" event={"ID":"11e4bb07-0526-42fe-80c5-6bed7db79d16","Type":"ContainerStarted","Data":"85621c637bdcd5e55220cee87918a755a34817d2327b4fcbb1244b7a5704b341"} Mar 13 15:30:01 crc kubenswrapper[4898]: I0313 15:30:01.787987 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556930-f8ms7" 
event={"ID":"98cb3d53-de77-4344-8045-41653ba912a9","Type":"ContainerStarted","Data":"cd8ee8aac0748f9e11c55221957bf5b61afe83bbf5d7802c2716a842c4b3f9e7"} Mar 13 15:30:01 crc kubenswrapper[4898]: I0313 15:30:01.816139 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-z4l5f" podStartSLOduration=1.816117204 podStartE2EDuration="1.816117204s" podCreationTimestamp="2026-03-13 15:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 15:30:01.806564856 +0000 UTC m=+5636.808153095" watchObservedRunningTime="2026-03-13 15:30:01.816117204 +0000 UTC m=+5636.817705463" Mar 13 15:30:02 crc kubenswrapper[4898]: I0313 15:30:02.802041 4898 generic.go:334] "Generic (PLEG): container finished" podID="11e4bb07-0526-42fe-80c5-6bed7db79d16" containerID="a25345c405dfefa76e519a8784a708d6344c4490388c27312a458e7e659dc8c9" exitCode=0 Mar 13 15:30:02 crc kubenswrapper[4898]: I0313 15:30:02.802243 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-z4l5f" event={"ID":"11e4bb07-0526-42fe-80c5-6bed7db79d16","Type":"ContainerDied","Data":"a25345c405dfefa76e519a8784a708d6344c4490388c27312a458e7e659dc8c9"} Mar 13 15:30:02 crc kubenswrapper[4898]: I0313 15:30:02.804356 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556930-f8ms7" event={"ID":"98cb3d53-de77-4344-8045-41653ba912a9","Type":"ContainerStarted","Data":"b77b5806113bddb5a814673aafb563d048919833f2e203dc225d6b5382a9eff2"} Mar 13 15:30:02 crc kubenswrapper[4898]: I0313 15:30:02.849126 4898 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29556930-f8ms7" podStartSLOduration=1.605041651 podStartE2EDuration="2.849104505s" podCreationTimestamp="2026-03-13 15:30:00 +0000 UTC" 
firstStartedPulling="2026-03-13 15:30:01.015338931 +0000 UTC m=+5636.016927170" lastFinishedPulling="2026-03-13 15:30:02.259401785 +0000 UTC m=+5637.260990024" observedRunningTime="2026-03-13 15:30:02.838396888 +0000 UTC m=+5637.839985177" watchObservedRunningTime="2026-03-13 15:30:02.849104505 +0000 UTC m=+5637.850692744" Mar 13 15:30:04 crc kubenswrapper[4898]: I0313 15:30:04.242244 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-z4l5f" Mar 13 15:30:04 crc kubenswrapper[4898]: I0313 15:30:04.349442 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11e4bb07-0526-42fe-80c5-6bed7db79d16-secret-volume\") pod \"11e4bb07-0526-42fe-80c5-6bed7db79d16\" (UID: \"11e4bb07-0526-42fe-80c5-6bed7db79d16\") " Mar 13 15:30:04 crc kubenswrapper[4898]: I0313 15:30:04.349494 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwckx\" (UniqueName: \"kubernetes.io/projected/11e4bb07-0526-42fe-80c5-6bed7db79d16-kube-api-access-qwckx\") pod \"11e4bb07-0526-42fe-80c5-6bed7db79d16\" (UID: \"11e4bb07-0526-42fe-80c5-6bed7db79d16\") " Mar 13 15:30:04 crc kubenswrapper[4898]: I0313 15:30:04.349607 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11e4bb07-0526-42fe-80c5-6bed7db79d16-config-volume\") pod \"11e4bb07-0526-42fe-80c5-6bed7db79d16\" (UID: \"11e4bb07-0526-42fe-80c5-6bed7db79d16\") " Mar 13 15:30:04 crc kubenswrapper[4898]: I0313 15:30:04.351164 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11e4bb07-0526-42fe-80c5-6bed7db79d16-config-volume" (OuterVolumeSpecName: "config-volume") pod "11e4bb07-0526-42fe-80c5-6bed7db79d16" (UID: "11e4bb07-0526-42fe-80c5-6bed7db79d16"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 15:30:04 crc kubenswrapper[4898]: I0313 15:30:04.356959 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11e4bb07-0526-42fe-80c5-6bed7db79d16-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "11e4bb07-0526-42fe-80c5-6bed7db79d16" (UID: "11e4bb07-0526-42fe-80c5-6bed7db79d16"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 15:30:04 crc kubenswrapper[4898]: I0313 15:30:04.358254 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11e4bb07-0526-42fe-80c5-6bed7db79d16-kube-api-access-qwckx" (OuterVolumeSpecName: "kube-api-access-qwckx") pod "11e4bb07-0526-42fe-80c5-6bed7db79d16" (UID: "11e4bb07-0526-42fe-80c5-6bed7db79d16"). InnerVolumeSpecName "kube-api-access-qwckx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:30:04 crc kubenswrapper[4898]: I0313 15:30:04.452974 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwckx\" (UniqueName: \"kubernetes.io/projected/11e4bb07-0526-42fe-80c5-6bed7db79d16-kube-api-access-qwckx\") on node \"crc\" DevicePath \"\"" Mar 13 15:30:04 crc kubenswrapper[4898]: I0313 15:30:04.453011 4898 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11e4bb07-0526-42fe-80c5-6bed7db79d16-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 15:30:04 crc kubenswrapper[4898]: I0313 15:30:04.453021 4898 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11e4bb07-0526-42fe-80c5-6bed7db79d16-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 15:30:04 crc kubenswrapper[4898]: I0313 15:30:04.830813 4898 generic.go:334] "Generic (PLEG): container finished" podID="98cb3d53-de77-4344-8045-41653ba912a9" 
containerID="b77b5806113bddb5a814673aafb563d048919833f2e203dc225d6b5382a9eff2" exitCode=0 Mar 13 15:30:04 crc kubenswrapper[4898]: I0313 15:30:04.830870 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556930-f8ms7" event={"ID":"98cb3d53-de77-4344-8045-41653ba912a9","Type":"ContainerDied","Data":"b77b5806113bddb5a814673aafb563d048919833f2e203dc225d6b5382a9eff2"} Mar 13 15:30:04 crc kubenswrapper[4898]: I0313 15:30:04.841329 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-z4l5f" event={"ID":"11e4bb07-0526-42fe-80c5-6bed7db79d16","Type":"ContainerDied","Data":"85621c637bdcd5e55220cee87918a755a34817d2327b4fcbb1244b7a5704b341"} Mar 13 15:30:04 crc kubenswrapper[4898]: I0313 15:30:04.841593 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85621c637bdcd5e55220cee87918a755a34817d2327b4fcbb1244b7a5704b341" Mar 13 15:30:04 crc kubenswrapper[4898]: I0313 15:30:04.841389 4898 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29556930-z4l5f" Mar 13 15:30:04 crc kubenswrapper[4898]: I0313 15:30:04.918415 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556885-ncmr6"] Mar 13 15:30:04 crc kubenswrapper[4898]: I0313 15:30:04.932522 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29556885-ncmr6"] Mar 13 15:30:06 crc kubenswrapper[4898]: I0313 15:30:06.246642 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1711d9ce-262c-4c6c-930a-4148e62fae9e" path="/var/lib/kubelet/pods/1711d9ce-262c-4c6c-930a-4148e62fae9e/volumes" Mar 13 15:30:06 crc kubenswrapper[4898]: I0313 15:30:06.639839 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556930-f8ms7" Mar 13 15:30:06 crc kubenswrapper[4898]: I0313 15:30:06.789073 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqq69\" (UniqueName: \"kubernetes.io/projected/98cb3d53-de77-4344-8045-41653ba912a9-kube-api-access-bqq69\") pod \"98cb3d53-de77-4344-8045-41653ba912a9\" (UID: \"98cb3d53-de77-4344-8045-41653ba912a9\") " Mar 13 15:30:06 crc kubenswrapper[4898]: I0313 15:30:06.798121 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98cb3d53-de77-4344-8045-41653ba912a9-kube-api-access-bqq69" (OuterVolumeSpecName: "kube-api-access-bqq69") pod "98cb3d53-de77-4344-8045-41653ba912a9" (UID: "98cb3d53-de77-4344-8045-41653ba912a9"). InnerVolumeSpecName "kube-api-access-bqq69". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:30:06 crc kubenswrapper[4898]: I0313 15:30:06.892348 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqq69\" (UniqueName: \"kubernetes.io/projected/98cb3d53-de77-4344-8045-41653ba912a9-kube-api-access-bqq69\") on node \"crc\" DevicePath \"\"" Mar 13 15:30:06 crc kubenswrapper[4898]: I0313 15:30:06.913489 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556924-vg49w"] Mar 13 15:30:06 crc kubenswrapper[4898]: I0313 15:30:06.926376 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556924-vg49w"] Mar 13 15:30:07 crc kubenswrapper[4898]: I0313 15:30:07.219296 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556930-f8ms7" event={"ID":"98cb3d53-de77-4344-8045-41653ba912a9","Type":"ContainerDied","Data":"cd8ee8aac0748f9e11c55221957bf5b61afe83bbf5d7802c2716a842c4b3f9e7"} Mar 13 15:30:07 crc kubenswrapper[4898]: I0313 15:30:07.219344 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556930-f8ms7" Mar 13 15:30:07 crc kubenswrapper[4898]: I0313 15:30:07.219363 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd8ee8aac0748f9e11c55221957bf5b61afe83bbf5d7802c2716a842c4b3f9e7" Mar 13 15:30:07 crc kubenswrapper[4898]: I0313 15:30:07.765467 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b1a5763-109f-4888-97bb-eeb7cd25ff69" path="/var/lib/kubelet/pods/3b1a5763-109f-4888-97bb-eeb7cd25ff69/volumes" Mar 13 15:30:49 crc kubenswrapper[4898]: I0313 15:30:49.955039 4898 scope.go:117] "RemoveContainer" containerID="c8ba8026ba786ec97dc1c956429b165e911caa22ae98facccf3eabf821d09223" Mar 13 15:30:50 crc kubenswrapper[4898]: I0313 15:30:50.019111 4898 scope.go:117] "RemoveContainer" containerID="586dd830bc412bf8d165f328ec0120d6ccafcd1b6e8c6a0642a7f4464c15681b" Mar 13 15:31:19 crc kubenswrapper[4898]: I0313 15:31:19.134259 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 15:31:19 crc kubenswrapper[4898]: I0313 15:31:19.134778 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:31:49 crc kubenswrapper[4898]: I0313 15:31:49.134947 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Mar 13 15:31:49 crc kubenswrapper[4898]: I0313 15:31:49.136546 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:32:00 crc kubenswrapper[4898]: I0313 15:32:00.156960 4898 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29556932-tq9z4"] Mar 13 15:32:00 crc kubenswrapper[4898]: E0313 15:32:00.158572 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11e4bb07-0526-42fe-80c5-6bed7db79d16" containerName="collect-profiles" Mar 13 15:32:00 crc kubenswrapper[4898]: I0313 15:32:00.158595 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="11e4bb07-0526-42fe-80c5-6bed7db79d16" containerName="collect-profiles" Mar 13 15:32:00 crc kubenswrapper[4898]: E0313 15:32:00.158685 4898 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98cb3d53-de77-4344-8045-41653ba912a9" containerName="oc" Mar 13 15:32:00 crc kubenswrapper[4898]: I0313 15:32:00.158695 4898 state_mem.go:107] "Deleted CPUSet assignment" podUID="98cb3d53-de77-4344-8045-41653ba912a9" containerName="oc" Mar 13 15:32:00 crc kubenswrapper[4898]: I0313 15:32:00.158982 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="11e4bb07-0526-42fe-80c5-6bed7db79d16" containerName="collect-profiles" Mar 13 15:32:00 crc kubenswrapper[4898]: I0313 15:32:00.159013 4898 memory_manager.go:354] "RemoveStaleState removing state" podUID="98cb3d53-de77-4344-8045-41653ba912a9" containerName="oc" Mar 13 15:32:00 crc kubenswrapper[4898]: I0313 15:32:00.160304 4898 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556932-tq9z4" Mar 13 15:32:00 crc kubenswrapper[4898]: I0313 15:32:00.162323 4898 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k7bps" Mar 13 15:32:00 crc kubenswrapper[4898]: I0313 15:32:00.164482 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 15:32:00 crc kubenswrapper[4898]: I0313 15:32:00.164723 4898 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 15:32:00 crc kubenswrapper[4898]: I0313 15:32:00.187730 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556932-tq9z4"] Mar 13 15:32:00 crc kubenswrapper[4898]: I0313 15:32:00.209832 4898 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlrpj\" (UniqueName: \"kubernetes.io/projected/b9389a03-1af1-48d5-b5d3-0d8e886d5469-kube-api-access-mlrpj\") pod \"auto-csr-approver-29556932-tq9z4\" (UID: \"b9389a03-1af1-48d5-b5d3-0d8e886d5469\") " pod="openshift-infra/auto-csr-approver-29556932-tq9z4" Mar 13 15:32:00 crc kubenswrapper[4898]: I0313 15:32:00.312305 4898 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlrpj\" (UniqueName: \"kubernetes.io/projected/b9389a03-1af1-48d5-b5d3-0d8e886d5469-kube-api-access-mlrpj\") pod \"auto-csr-approver-29556932-tq9z4\" (UID: \"b9389a03-1af1-48d5-b5d3-0d8e886d5469\") " pod="openshift-infra/auto-csr-approver-29556932-tq9z4" Mar 13 15:32:00 crc kubenswrapper[4898]: I0313 15:32:00.332500 4898 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlrpj\" (UniqueName: \"kubernetes.io/projected/b9389a03-1af1-48d5-b5d3-0d8e886d5469-kube-api-access-mlrpj\") pod \"auto-csr-approver-29556932-tq9z4\" (UID: \"b9389a03-1af1-48d5-b5d3-0d8e886d5469\") " 
pod="openshift-infra/auto-csr-approver-29556932-tq9z4" Mar 13 15:32:00 crc kubenswrapper[4898]: I0313 15:32:00.483703 4898 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29556932-tq9z4" Mar 13 15:32:00 crc kubenswrapper[4898]: I0313 15:32:00.980256 4898 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29556932-tq9z4"] Mar 13 15:32:00 crc kubenswrapper[4898]: W0313 15:32:00.986090 4898 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9389a03_1af1_48d5_b5d3_0d8e886d5469.slice/crio-f1d613b9130da4d268b8f6569b6227ccc4f9cce4fee9d288aed9dcd36b5f67dc WatchSource:0}: Error finding container f1d613b9130da4d268b8f6569b6227ccc4f9cce4fee9d288aed9dcd36b5f67dc: Status 404 returned error can't find the container with id f1d613b9130da4d268b8f6569b6227ccc4f9cce4fee9d288aed9dcd36b5f67dc Mar 13 15:32:01 crc kubenswrapper[4898]: I0313 15:32:01.635057 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556932-tq9z4" event={"ID":"b9389a03-1af1-48d5-b5d3-0d8e886d5469","Type":"ContainerStarted","Data":"f1d613b9130da4d268b8f6569b6227ccc4f9cce4fee9d288aed9dcd36b5f67dc"} Mar 13 15:32:03 crc kubenswrapper[4898]: I0313 15:32:03.672094 4898 generic.go:334] "Generic (PLEG): container finished" podID="b9389a03-1af1-48d5-b5d3-0d8e886d5469" containerID="35e5bf6a54e528120937c5fa4cd67e6484f1ca2384f69de7e5c4a635b6cdfdc6" exitCode=0 Mar 13 15:32:03 crc kubenswrapper[4898]: I0313 15:32:03.672195 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556932-tq9z4" event={"ID":"b9389a03-1af1-48d5-b5d3-0d8e886d5469","Type":"ContainerDied","Data":"35e5bf6a54e528120937c5fa4cd67e6484f1ca2384f69de7e5c4a635b6cdfdc6"} Mar 13 15:32:05 crc kubenswrapper[4898]: I0313 15:32:05.316298 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556932-tq9z4" Mar 13 15:32:05 crc kubenswrapper[4898]: I0313 15:32:05.440040 4898 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlrpj\" (UniqueName: \"kubernetes.io/projected/b9389a03-1af1-48d5-b5d3-0d8e886d5469-kube-api-access-mlrpj\") pod \"b9389a03-1af1-48d5-b5d3-0d8e886d5469\" (UID: \"b9389a03-1af1-48d5-b5d3-0d8e886d5469\") " Mar 13 15:32:05 crc kubenswrapper[4898]: I0313 15:32:05.457970 4898 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9389a03-1af1-48d5-b5d3-0d8e886d5469-kube-api-access-mlrpj" (OuterVolumeSpecName: "kube-api-access-mlrpj") pod "b9389a03-1af1-48d5-b5d3-0d8e886d5469" (UID: "b9389a03-1af1-48d5-b5d3-0d8e886d5469"). InnerVolumeSpecName "kube-api-access-mlrpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 15:32:05 crc kubenswrapper[4898]: I0313 15:32:05.542762 4898 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlrpj\" (UniqueName: \"kubernetes.io/projected/b9389a03-1af1-48d5-b5d3-0d8e886d5469-kube-api-access-mlrpj\") on node \"crc\" DevicePath \"\"" Mar 13 15:32:05 crc kubenswrapper[4898]: I0313 15:32:05.702992 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29556932-tq9z4" event={"ID":"b9389a03-1af1-48d5-b5d3-0d8e886d5469","Type":"ContainerDied","Data":"f1d613b9130da4d268b8f6569b6227ccc4f9cce4fee9d288aed9dcd36b5f67dc"} Mar 13 15:32:05 crc kubenswrapper[4898]: I0313 15:32:05.703331 4898 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1d613b9130da4d268b8f6569b6227ccc4f9cce4fee9d288aed9dcd36b5f67dc" Mar 13 15:32:05 crc kubenswrapper[4898]: I0313 15:32:05.703233 4898 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29556932-tq9z4" Mar 13 15:32:06 crc kubenswrapper[4898]: I0313 15:32:06.408122 4898 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29556926-znpr2"] Mar 13 15:32:06 crc kubenswrapper[4898]: I0313 15:32:06.421810 4898 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29556926-znpr2"] Mar 13 15:32:07 crc kubenswrapper[4898]: I0313 15:32:07.753228 4898 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcf6d966-2758-453f-9308-fd452766462b" path="/var/lib/kubelet/pods/fcf6d966-2758-453f-9308-fd452766462b/volumes" Mar 13 15:32:19 crc kubenswrapper[4898]: I0313 15:32:19.134511 4898 patch_prober.go:28] interesting pod/machine-config-daemon-8k6xj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 15:32:19 crc kubenswrapper[4898]: I0313 15:32:19.136638 4898 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 15:32:19 crc kubenswrapper[4898]: I0313 15:32:19.136949 4898 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" Mar 13 15:32:19 crc kubenswrapper[4898]: I0313 15:32:19.138647 4898 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cee6ecdff3780e9dd70ed8f417a9b0b2ef52de112ea7b7ca374180dcff44bba7"} pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 15:32:19 crc kubenswrapper[4898]: I0313 15:32:19.138970 4898 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerName="machine-config-daemon" containerID="cri-o://cee6ecdff3780e9dd70ed8f417a9b0b2ef52de112ea7b7ca374180dcff44bba7" gracePeriod=600 Mar 13 15:32:19 crc kubenswrapper[4898]: E0313 15:32:19.274283 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:32:19 crc kubenswrapper[4898]: I0313 15:32:19.894012 4898 generic.go:334] "Generic (PLEG): container finished" podID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" containerID="cee6ecdff3780e9dd70ed8f417a9b0b2ef52de112ea7b7ca374180dcff44bba7" exitCode=0 Mar 13 15:32:19 crc kubenswrapper[4898]: I0313 15:32:19.894064 4898 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" event={"ID":"767eecef-3bc9-4db4-a0cb-5d9c8554c62d","Type":"ContainerDied","Data":"cee6ecdff3780e9dd70ed8f417a9b0b2ef52de112ea7b7ca374180dcff44bba7"} Mar 13 15:32:19 crc kubenswrapper[4898]: I0313 15:32:19.894102 4898 scope.go:117] "RemoveContainer" containerID="3f1eb02ceee77301060a512e8cdb70aa1cab3b74525898dd1df91250d09e1006" Mar 13 15:32:19 crc kubenswrapper[4898]: I0313 15:32:19.894962 4898 scope.go:117] "RemoveContainer" containerID="cee6ecdff3780e9dd70ed8f417a9b0b2ef52de112ea7b7ca374180dcff44bba7" Mar 13 15:32:19 crc kubenswrapper[4898]: E0313 15:32:19.895311 4898 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d" Mar 13 15:32:30 crc kubenswrapper[4898]: I0313 15:32:30.739554 4898 scope.go:117] "RemoveContainer" containerID="cee6ecdff3780e9dd70ed8f417a9b0b2ef52de112ea7b7ca374180dcff44bba7" Mar 13 15:32:30 crc kubenswrapper[4898]: E0313 15:32:30.740497 4898 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-8k6xj_openshift-machine-config-operator(767eecef-3bc9-4db4-a0cb-5d9c8554c62d)\"" pod="openshift-machine-config-operator/machine-config-daemon-8k6xj" podUID="767eecef-3bc9-4db4-a0cb-5d9c8554c62d"